New Book: Understanding Deep Learning
- Vincent Granville
- November 16, 2023
By Simon Prince, professor of computer science at the University of Alberta. To be published by MIT Press, December 2023. The author shares the associated Jupyter notebooks on his website, here. The announcement proved very popular, gathering over 5,000 likes when the author posted about the upcoming book on LinkedIn. I pre-ordered my copy. Summary: An authoritative, accessible, and […]
Read More
NoGAN: Ultrafast Data Synthesizer – My Talk at ODSC San Francisco
- Vincent Granville
- November 16, 2023
My talk at the ODSC Conference, San Francisco, October 2023. Includes a notebook demonstration using our open-source Python libraries. View or download the PowerPoint presentation here. I discuss NoGAN, an alternative to standard tabular data synthetization. It runs 1000x faster than GANs, consistently delivering better results according to the most sophisticated evaluation metric, implemented here for […]
Read More
Quantum Derivatives, GenAI, and the Riemann Hypothesis
- Vincent Granville
- November 12, 2023
Have you ever encountered a function or cumulative distribution function (CDF) that is continuous everywhere, yet nowhere differentiable? Some are featured in this article. For a CDF, it means that it does not have a probability density function (PDF); for a standard function, it means it has no derivative. At least, not until now. The quantum […]
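The excerpt does not show the article's construction, but the classic example of such a function is the Weierstrass series. The sketch below (my illustration, not the article's quantum derivative) evaluates a truncated version of it:

```python
import math

def weierstrass(x, a=0.5, b=3, n_terms=50):
    """Truncated Weierstrass series: the infinite sum is continuous
    everywhere but differentiable nowhere when 0 < a < 1 and
    a*b >= 1 (Hardy's condition)."""
    return sum(a**n * math.cos(b**n * math.pi * x) for n in range(n_terms))

# The function has a well-defined value at every point; at x = 0 every
# cosine equals 1, so the sum is geometric: (1 - a**n_terms) / (1 - a),
# close to 2 with the defaults above.
print(weierstrass(0.0))
```

Truncating at 50 terms gives a smooth approximation; the pathological behavior only emerges in the limit, which is why the full series has no derivative anywhere despite being continuous.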
Read More
Number Theory: Longest Runs of Zeros in Binary Digits of Square Root of 2
- Vincent Granville
- October 27, 2023
Studying the longest head runs in coin tossing has a very long history, starting in gaming and probability theory. Today, it has applications in cryptography and insurance. For random sequences or Bernoulli trials, the associated statistical properties and distributions have been studied in detail, even when the proportions of zero and one are different. Yet, […]
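To make the object of study concrete, here is a short self-contained sketch (my illustration; the article's own code is not shown) that computes exact binary digits of √2 using only integer arithmetic, then scans for the longest run of zeros:

```python
from math import isqrt

def sqrt2_bits(n_bits):
    """Exact binary digits of the fractional part of sqrt(2):
    bit k equals floor(2**k * sqrt(2)) mod 2, and
    floor(2**k * sqrt(2)) = isqrt(2 * 4**k), all in exact integer math."""
    return [isqrt(2 << (2 * k)) & 1 for k in range(1, n_bits + 1)]

def longest_run(bits, value=0):
    """Length of the longest consecutive run of `value` in the sequence."""
    best = cur = 0
    for b in bits:
        cur = cur + 1 if b == value else 0
        best = max(best, cur)
    return best

bits = sqrt2_bits(1000)
print(longest_run(bits, value=0))
```

Because `isqrt` works on arbitrary-precision integers, the digits are exact at any depth, with no floating-point drift; the same `longest_run` scan applies unchanged to simulated Bernoulli sequences for comparison.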
Read More
New Book: Statistical Optimization for Generative AI and Machine Learning
- Vincent Granville
- October 7, 2023
With case studies, Python code, new open-source libraries, and applications of NoGAN, the game-changing GenAI technology (194 pages). This book covers optimization techniques pertaining to machine learning and generative AI, with an emphasis on producing better synthetic data with faster methods, some not even involving neural networks. NoGAN for tabular data is […]
Read More
NoGAN: New Generation of Synthetic Data (Video)
- Vincent Granville
- September 28, 2023
My talk at the Generative AI Conference, London, September 2023. View or download the PowerPoint presentation here. I introduce NoGAN, a new alternative to standard tabular data synthetization. It is designed to run several orders of magnitude faster than training generative adversarial networks (GANs). In addition, the quality of the generated data is […]
Read More
GenAI: Fast Data Synthetization with Distribution-free Hierarchical Bayesian Models
- Vincent Granville
- September 22, 2023
Deep learning models such as generative adversarial networks (GANs) require a lot of computing power and are thus expensive. Also, they may fail to converge. What if you could produce better data synthetizations in a fraction of the time, with explainable AI and substantial cost savings? This is what Hierarchical Deep Resampling was designed for. It […]
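The excerpt is truncated, but the general idea of a fast, distribution-free synthesizer can be sketched with simple binning and resampling. This is a hypothetical one-feature simplification for illustration, not the author's Hierarchical Deep Resampling algorithm:

```python
import numpy as np

def bin_resample(data, n_synth, n_bins=20, rng=None):
    """Toy distribution-free synthesizer for one numeric feature:
    draw bins with their empirical frequencies, then draw a value
    uniformly inside each chosen bin. No neural network involved."""
    rng = np.random.default_rng(rng)
    counts, edges = np.histogram(data, bins=n_bins)
    probs = counts / counts.sum()              # empirical bin frequencies
    idx = rng.choice(n_bins, size=n_synth, p=probs)
    return rng.uniform(edges[idx], edges[idx + 1])
```

Every step is a counting or sampling operation, which is why this family of methods is fast, explainable, and free of convergence issues; the hierarchical version described in the article extends the binning across multiple features.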
Read More
New Python Library to Evaluate AI-generated Data and Compare Models
- Vincent Granville
- September 19, 2023
Called GenAI-Evaluation, this library can be used, for instance, to assess the quality of tabular synthetic data. In this case, it measures how faithfully the synthetization mimics the real data it is derived from, by comparing the full joint empirical distributions (ECDFs) attached to the two datasets. It works with both categorical and numerical features, and returns […]
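The idea of comparing full joint ECDFs can be sketched in a few lines. This is an illustrative Kolmogorov-Smirnov-style statistic for numerical features only, not the library's actual API:

```python
import numpy as np

def joint_ecdf_distance(real, synth, n_probes=500, rng=None):
    """Illustrative fidelity score: evaluate both joint ECDFs at random
    probe points drawn from the pooled data, and return the largest
    absolute difference. Lower is better; 0 means identical ECDFs
    at every probe."""
    rng = np.random.default_rng(rng)
    pool = np.vstack([real, synth])
    probes = pool[rng.integers(len(pool), size=n_probes)]

    def ecdf(sample, pts):
        # fraction of sample rows dominated componentwise by each probe
        return np.array([(sample <= p).all(axis=1).mean() for p in pts])

    return np.abs(ecdf(real, probes) - ecdf(synth, probes)).max()
```

Evaluating the joint ECDF (rather than one marginal at a time) is what catches synthesizers that match each column individually but miss the dependencies between them; extending the comparison to categorical features would require an ordering or encoding, which the sketch omits.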
Read More
How to Fix a Failing Generative Adversarial Network
- Vincent Granville
- August 12, 2023
In this article, I explore different front-end strategies to improve a generative adversarial network (GAN) that produces poor synthetizations, in the context of tabular data generation. It is well known that tabular data is a lot more challenging than images when using deep neural networks for synthetization purposes. An algorithm may work very well […]
Read More
Generative AI Technology Break-through: Spectacular Performance of New Synthesizer
- Vincent Granville
- August 2, 2023
Neural network methods have overshadowed all other techniques in the last decade, to the point that alternatives are simply ignored. And for good reason: techniques such as generative adversarial networks (GANs) proved very successful in some contexts, especially computer vision. Indeed, there have been several attempts to turn every problem and traditional method — regression, […]
Read More