For centuries, mathematicians worked on problems where convergence is either smooth or absent. Now the concept of chaotic convergence is mainstream, popularized by stochastic gradient descent in deep neural networks, central to LLMs. In my most recent book, available here, I discuss many cases involving various types of chaotic convergence. In some examples, the chaos exhibits a fractal structure; in others, the patterns are strikingly similar to quantum dynamics.
This article, a new addition to the book in question, features more illustrations falling into various categories, including quasi-periodicity. I also discuss how I used AI to solve some of the challenging problems, sharing my experience, which was mostly negative. For chaotic convergence in the context of explainable deep neural networks, see my other book on no-Blackbox LLM architectures, here, featuring examples and illustrations related to stochastic gradient descent, swarm optimization, and adaptive loss functions.

There are a few beautiful mathematical results in this new article. For instance, the fact that |x_n|^{1/n}, attached to the recursion x_{n+3} = 2x_{n+2} - 16x_{n+1} + 4x_n, converges to a constant: the modulus of the dominant root of the characteristic polynomial t^3 - 2t^2 + 16t - 4.
The convergence follows the multi-branch smooth regime pictured in the top left corner of Figure 1. Despite appearances, there is only one curve, with values jumping from one level to another, shown in a different color, at each iteration.
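A quick numerical sketch of this result, relying on the standard theory of linear recursions (the seed values below are my own illustrative choice, not taken from the article):

```python
# Sketch: for a linear recursion, the growth rate |x_n|^{1/n} is governed by
# the largest-modulus root of the characteristic polynomial. For
# x_{n+3} = 2 x_{n+2} - 16 x_{n+1} + 4 x_n, that polynomial is
# t^3 - 2 t^2 + 16 t - 4. Seed values are illustrative assumptions.
import math
import numpy as np

# Modulus of the dominant characteristic root
r = max(abs(np.roots([1.0, -2.0, 16.0, -4.0])))

# Iterate the recursion with exact integer arithmetic (avoids float overflow)
x = [1, 1, 1]
for _ in range(1000):
    x.append(2 * x[-1] - 16 * x[-2] + 4 * x[-3])

# |x_n|^{1/n} oscillates (the dominant roots form a complex pair), so take
# the best estimate over the tail of the sequence
est = max(math.exp(math.log(abs(x[n])) / n)
          for n in range(len(x) - 50, len(x)) if x[n] != 0)
print(f"dominant root modulus: {r:.4f}, empirical rate: {est:.4f}")
```

The oscillating cosine factor contributed by the complex roots is what produces the multi-branch appearance in Figure 1: a single sequence whose terms jump between levels at each iteration.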

The recursion x_{n+2} = |x_{n+1} - 3 x_n| is a tough nut to crack, with convergence following the chaotic multi-branch regime pictured in the top right corner of Figure 1. Then, the digit sum function (or Hamming weight) yields a proportion of 1s in the binary digits converging to 50%, according to the fractal regime pictured in Figure 2.
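A minimal numerical sketch of the digit-sum claim (the seed values x_1 = x_2 = 1 are my own assumption, not necessarily those used in the article):

```python
# Sketch: iterate x_{n+2} = |x_{n+1} - 3 x_n| with integer seeds, then
# measure the proportion of 1-bits (binary digit sum / number of digits)
# of each term. Seed values are illustrative assumptions.
x = [1, 1]
for _ in range(2000):
    x.append(abs(x[-1] - 3 * x[-2]))

def one_bit_proportion(v: int) -> float:
    """Fraction of 1s among the binary digits of v."""
    b = bin(v)[2:]
    return b.count("1") / len(b)

# Average over the tail of the sequence: the proportion hovers near 0.5
props = [one_bit_proportion(v) for v in x[500:] if v > 0]
print(sum(props) / len(props))
```

Since Python integers have arbitrary precision, the terms can grow without overflow; the tail average lands close to 50%, consistent with the fractal regime described above.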
Get your copy of the free book and the new article
This new article, the technical paper, has been added to the book as Appendix B. It comes with detailed technical explanations and more results, along with references, illustrations, and Python code. The links in the PDF are clickable only in the full (free) book.
The book on no-Blackbox AI architectures, with more examples of chaotic convergence, is available here.
To avoid missing future articles, subscribe to my AI newsletter, here.
About the Author

Vincent Granville is a pioneering GenAI scientist and co-founder at BondingAI.io, the LLM 2.0 platform for hallucination-free, secure, in-house, lightning-fast Enterprise AI at scale, with zero weight and no GPU. He is also an author (Elsevier, Wiley), publisher, and successful entrepreneur with a multi-million-dollar exit. Vincent's past corporate experience includes Visa, Wells Fargo, eBay, NBC, Microsoft, and CNET. He completed a post-doc in computational statistics at the University of Cambridge.