Any two-dimensional screen can be viewed as a collection of rectangular cells. A single point, and even the entire screen, can be defined this way.

Almost all cell-phone and computer screens are two-dimensional; some may even be three-dimensional.

Therefore anything that runs on mobile devices, PCs, or Macs is displayed on a two-dimensional screen as a collection of rectangular cells, or on a three-dimensional one as a collection of rectangular cuboids.

In conclusion, anything that runs on mobile devices, PCs, or Macs can be conceived of as a spreadsheet, both practically and theoretically.
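The screen-as-spreadsheet conception can be illustrated with a minimal sketch. Everything here is an assumption made for illustration: the tiny 4x3 "resolution", the color values, and the spreadsheet-style cell addresses.

```python
# A minimal sketch: treat a tiny "screen" as a spreadsheet-like grid of
# rectangular cells. The 4x3 resolution and the color values are
# illustrative assumptions.

WIDTH, HEIGHT = 4, 3  # columns A-D, rows 1-3

def cell_name(col: int, row: int) -> str:
    """Map a (column, row) pair to a spreadsheet-style address, e.g. (0, 0) -> 'A1'."""
    return chr(ord("A") + col) + str(row + 1)

# The whole screen is just the collection of its rectangular cells.
screen = {cell_name(c, r): "white" for r in range(HEIGHT) for c in range(WIDTH)}

# "Drawing a point" amounts to setting a single cell.
screen["B2"] = "black"

print(sorted(screen))   # every addressable cell of the screen
print(screen["B2"])     # the single point we set
```

Under this toy model, the screen, a point on it, and a whole region of it are all just sets of cells, which is the spreadsheet analogy in miniature.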

Therefore, logically, there exists a function that transforms a finite set of bits in memory into each 2D or 3D bit that appears on the screen. This function is the result of successively applying multiple functions to the initial set of bits, or to the union of other sets of bits added to the starting domain of each of those functions.
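This idea of successive application can be sketched as a composition of functions on bits. The three stages below (decode, double, threshold_to_pixel) are illustrative assumptions, not a real rendering pipeline; the point is only that the final on-screen bits result from chaining functions over an initial bit string.

```python
from functools import reduce

def decode(bits):
    """Interpret the raw bit string as small integers (here: 2-bit groups)."""
    return [int(bits[i:i + 2], 2) for i in range(0, len(bits), 2)]

def double(values):
    """An arbitrary intermediate transformation on the decoded values."""
    return [v * 2 for v in values]

def threshold_to_pixel(values):
    """Map each value to a final on-screen bit: 1 (lit) or 0 (dark)."""
    return [1 if v >= 4 else 0 for v in values]

def compose(*functions):
    """Successive application: compose(f, g, h)(x) == h(g(f(x)))."""
    return lambda x: reduce(lambda acc, f: f(acc), functions, x)

memory_to_screen = compose(decode, double, threshold_to_pixel)

print(memory_to_screen("00011011"))  # -> [0, 0, 1, 1]
```

Here `memory_to_screen` is exactly the kind of function the text describes: a single transformation from bits in memory to bits on the screen, built by composing simpler ones.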

The above is obvious: everything is binary. Having a mind capable of "encompassing" all these binaries leads to higher-order quantitative implications, such as a statement in a programming language, which serves as a qualitative model for solving problems.

Everything, absolutely every current program, can be reduced to Turing's binary model!
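One small way to make this claim concrete: the source text of any program is itself nothing more than a finite string of bits. A minimal sketch (the example program text is an arbitrary assumption):

```python
# The text of any program is just a finite sequence of bits.
program = "print('hello')"  # any program text would do
bits = "".join(format(byte, "08b") for byte in program.encode("utf-8"))

print(len(bits))   # 8 bits per byte of source text
print(bits[:16])   # the first two characters, as raw binary
```

So both the program and the data it manipulates already live in the binary world that Turing's model formalizes.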

But our mind is qualitative; come on, it is analog, continuous, not binary!

Hence the distance between the so-called "Mathematical Science of Computing" and all current programs, even those called "Artificial Intelligence": programs that cannot transgress Turing's laws ... and that are far, far from "operating" like the human brain, that is, like human neurons. Golgi did not beat Cajal!

Octavio Báez Hidalgo.