How to Build and Optimize High-Performance Deep Neural Networks from Scratch
With explainable AI, intuitive parameters that are easy to fine-tune, versatility, robustness, fast training, and no library other than NumPy. In short, you have full control over every component, allowing deep customization with far fewer parameters than in standard architectures. Introduction: I explore deep neural networks (DNNs) starting from the foundations, introducing a new type […]
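As a rough illustration of the NumPy-only approach (a minimal sketch under my own assumptions, not the architecture described in the article), here is a one-hidden-layer regression network trained with plain gradient descent; all sizes and hyperparameters are illustrative.

```python
# Minimal one-hidden-layer network trained with plain gradient descent,
# using NumPy only. Illustrative sketch; not the architecture from the article.
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data: y = sin(x) + noise
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X) + 0.1 * rng.standard_normal((200, 1))

# Parameters of a 1-16-1 network with tanh activation
W1 = rng.standard_normal((1, 16)) * 0.5
b1 = np.zeros((1, 16))
W2 = rng.standard_normal((16, 1)) * 0.5
b2 = np.zeros((1, 1))

lr = 0.05
for epoch in range(2000):
    # Forward pass
    H = np.tanh(X @ W1 + b1)          # hidden layer
    y_hat = H @ W2 + b2               # output layer
    loss = np.mean((y_hat - y) ** 2)  # mean squared error

    # Backward pass (chain rule written out by hand)
    g_out = 2 * (y_hat - y) / len(X)          # dLoss/dy_hat
    gW2 = H.T @ g_out
    gb2 = g_out.sum(axis=0, keepdims=True)
    g_hidden = (g_out @ W2.T) * (1 - H ** 2)  # tanh derivative
    gW1 = X.T @ g_hidden
    gb1 = g_hidden.sum(axis=0, keepdims=True)

    # Gradient descent update
    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2

print(f"final MSE: {loss:.4f}")
```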
Read More
Watermarking and Forensics for AI Models, Data, and Deep Neural Networks
In my previous paper posted here, I explained how I built a new class of non-standard deep neural networks, with various case studies based on synthetic data and open-source code, covering problems such as noise filtering, high-dimensional curve fitting, and predictive analytics. One of the models featured a promising universal function able to represent any […]
Read More
How Synthetic Primes Reveal the Quantum States in the Riemann Hypothesis
This research paper showcases spectacular discoveries across multiple disciplines. The main question — yet unanswered — is how the mathematical engineering behind the scenes could be applied to modern AI, deep neural networks (DNNs) and LLMs in particular, to dramatically accelerate the convergence of some slow algorithms. Most notoriously, the laborious and expensive training attached […]
Read More
10 Tips to Boost Performance of your AI Models
These model enhancement techniques apply to deep neural networks (DNNs) used in AI. The focus is on the core engine that powers all DNNs: gradient descent, layering, and the loss function. Reparameterization: typically, in DNNs, many different parameter sets lead to the same optimum, that is, the same minimum loss. DNN models are non-identifiable. This redundancy is a strength that […]
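To make the non-identifiability remark concrete, here is a small NumPy check of one well-known redundancy: permuting the hidden units of a one-hidden-layer network (together with the matching output weights) leaves predictions, and hence the loss, unchanged. This is my own illustration, not one of the ten tips themselves.

```python
# Non-identifiability demo: permuting hidden units (and the matching output
# weights) leaves the network function, hence the loss, unchanged.
import numpy as np

rng = np.random.default_rng(1)
X = rng.standard_normal((100, 3))
y = rng.standard_normal((100, 1))

W1 = rng.standard_normal((3, 8)); b1 = rng.standard_normal(8)
W2 = rng.standard_normal((8, 1)); b2 = rng.standard_normal(1)

def predict(W1, b1, W2, b2):
    return np.maximum(X @ W1 + b1, 0.0) @ W2 + b2   # ReLU hidden layer

def mse(W1, b1, W2, b2):
    return np.mean((predict(W1, b1, W2, b2) - y) ** 2)

perm = rng.permutation(8)                 # a different parameter set...
loss_a = mse(W1, b1, W2, b2)
loss_b = mse(W1[:, perm], b1[perm], W2[perm], b2)

print(loss_a, loss_b)                     # ...identical loss
assert np.allclose(loss_a, loss_b)
```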
Read More
A New Type of Non-Standard High Performance DNN with Remarkable Stability
I explore deep neural networks (DNNs) starting from the foundations, introducing a new type of architecture that is as different from machine learning as it is from traditional AI. The original adaptive loss function, introduced here for the first time, leads to spectacular performance improvements via a mechanism called equalization. To accurately approximate any response, rather […]
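The excerpt does not define equalization, so the sketch below is only a loosely related, generic illustration of an adaptive loss: per-sample weights are re-estimated during training so that poorly fitted samples receive more attention. It is an assumption of mine and should not be read as the article's mechanism.

```python
# Generic adaptive loss sketch: per-sample weights are re-estimated during
# training so poorly fitted samples get more attention. NOT the equalization
# mechanism from the article, only a loosely related illustration.
import numpy as np

rng = np.random.default_rng(2)
X = rng.uniform(-2, 2, (300, 1))
y = np.sign(X) * np.sqrt(np.abs(X))        # response with uneven difficulty

w = rng.standard_normal((1, 1)) * 0.1      # bare-bones linear fit
b = np.zeros((1, 1))
weights = np.ones((300, 1))                # adaptive per-sample weights

lr = 0.1
for step in range(500):
    residual = X @ w + b - y
    loss = np.mean(weights * residual ** 2)

    # Gradient of the weighted squared loss
    g = 2 * weights * residual / len(X)
    w -= lr * (X.T @ g)
    b -= lr * g.sum(axis=0, keepdims=True)

    # Adapt the weights every 50 steps: emphasize samples with large error
    if step % 50 == 0:
        weights = 1.0 + np.abs(residual) / (np.abs(residual).mean() + 1e-12)

print(f"weighted loss after training: {loss:.4f}")
```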
Read More
New Book: 0 and 1 – From Elemental Math to Quantum AI
The book is available on our E-store, here. It all started with the number 1. This e-book offers a trip deep into the most elusive and fascinating multi-century-old conjecture in number theory: are the binary digits of the fundamental math constants evenly distributed? No one even knows if the proportions of ‘0’ and ‘1’ […]
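As a quick empirical taste of the question (my own illustration, not the book's methodology), the snippet below counts the proportion of ‘1’ among the first fractional binary digits of sqrt(2), using exact integer arithmetic so no floating-point error creeps in.

```python
# Count the proportion of '1' among the first n fractional binary digits of
# sqrt(2), computed exactly with integer arithmetic.
from math import isqrt

def sqrt2_bits(n: int) -> str:
    # floor(sqrt(2) * 2^n) = isqrt(2^(2n+1)); keep the n fractional bits
    value = isqrt(1 << (2 * n + 1))
    return bin(value)[3:]          # drop '0b' and the leading integer bit '1'

n = 100_000
bits = sqrt2_bits(n)
ones = bits.count("1")
print(f"{ones} ones out of {n} bits -> proportion {ones / n:.5f}")
```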
Read More
Doing Better with Less: LLM 2.0 for Enterprise
Standard LLMs are trained to predict the next token or missing tokens. This requires deep neural networks (DNNs) trained on billions or even trillions of tokens, as highlighted by Jensen Huang, CEO of Nvidia, in his keynote talk at the GTC conference earlier this year. Yet 10 trillion tokens cover all possible string combinations; the vast […]
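To make the training objective concrete at miniature scale (an assumption of mine, unrelated to LLM 2.0 itself), here is a toy next-token predictor: a bigram count model that picks the most frequent continuation observed in a tiny corpus.

```python
# Toy next-token prediction: a bigram count model on a tiny corpus.
# Illustrates the objective standard LLMs are trained on, at miniature scale.
from collections import Counter, defaultdict

corpus = "the cat sat on the mat the cat ate the fish".split()

# Count how often each token follows each context token
following = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    following[prev][nxt] += 1

def predict_next(token: str) -> str:
    # Most frequent continuation observed after `token`
    return following[token].most_common(1)[0][0]

print(predict_next("the"))   # -> 'cat' (seen twice, vs 'mat'/'fish' once each)
```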
Read More
10 Must-Read Articles and Books About Next-Gen AI in 2025
You could call it the best-kept secret for professionals and experts in AI, as you won't find these books and articles in traditional outlets. Yet they are read by far more people than documents posted on ArXiv or published in scientific journals, so it is not really a secret. Actually, one of these books is also […]
Read More
Universal Dataset to Test, Enhance and Benchmark AI Algorithms
This scientific research has three components. First, my most recent advances towards solving one of the most famous, multi-century-old conjectures in number theory: one that kids in elementary school can understand, yet that is incredibly hard to prove. At its very core, it is about the spectacular quantum dynamics of the digit sum function. Then, I […]
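For reference, the digit sum function itself is elementary; the snippet below (my own minimal illustration, not the paper's construction) computes it in any base and prints the first values of the binary digit sum.

```python
# Digit sum function s_b(n): sum of the digits of n written in base b.
def digit_sum(n: int, base: int = 2) -> int:
    total = 0
    while n:
        n, r = divmod(n, base)
        total += r
    return total

# First values of the binary digit sum (OEIS A000120): 0, 1, 1, 2, 1, 2, 2, 3, ...
print([digit_sum(n, 2) for n in range(16)])
```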
Read More
Synthesizing Multi-Table Databases: Model Evaluation & Vendor Comparison
Synthesizing multi-table tabular data presents its own challenges compared to the single-table case. When the database contains date columns such as transaction or admission dates, a frequent occurrence in real-world datasets, generating high-quality synthetic data and evaluating the models become even more complicated. In this article, we focus on this type of problem, comparing generated observations produced by […]
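As a tiny example of the kind of check involved (an assumption on my part, not the evaluation protocol used in the article), the snippet below compares a real and a synthetic date column via a Kolmogorov-Smirnov-style distance between their empirical distributions, with dates encoded as integer day offsets.

```python
# Compare a real vs. synthetic date column with a KS-style distance between
# their empirical CDFs. Illustrative only; not the article's evaluation method.
import numpy as np

rng = np.random.default_rng(3)

# Dates encoded as day offsets from some origin (e.g., days since 2020-01-01)
real_dates  = rng.normal(loc=400, scale=60, size=5000).astype(int)
synth_dates = rng.normal(loc=410, scale=70, size=5000).astype(int)

def ks_distance(a: np.ndarray, b: np.ndarray) -> float:
    # Max absolute gap between the two empirical CDFs, evaluated on all points
    grid = np.sort(np.concatenate([a, b]))
    cdf_a = np.searchsorted(np.sort(a), grid, side="right") / len(a)
    cdf_b = np.searchsorted(np.sort(b), grid, side="right") / len(b)
    return float(np.abs(cdf_a - cdf_b).max())

print(f"KS distance, real vs. synthetic dates: {ks_distance(real_dates, synth_dates):.3f}")
```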
Read More