Functional Information

Mihai Draganescu defines general information as a meaningful syntactic structure. A. N. Kolmogorov and G. Chaitin define the information content of a binary string by the order of magnitude of the shortest description that generates it; more precisely, Chaitin associates the amount of information with the size of the smallest program that generates the string.
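
As a rough, computable illustration of this idea (not the authors' formalism), the sketch below uses compressed length as a stand-in for the size of a generating program: a highly regular bit string admits a much shorter description than a pattern-free one.

    # Rough illustration of the Kolmogorov/Chaitin idea: the information content
    # of a string is bounded by the size of a description (program) that can
    # regenerate it.  True algorithmic complexity is uncomputable, so compressed
    # length is used here only as a crude upper bound, for intuition.
    import random
    import zlib

    def description_size(bits: str) -> int:
        # Length in bytes of a compressed encoding: an upper bound on the size
        # of a program that could regenerate the string.
        return len(zlib.compress(bits.encode()))

    regular = "01" * 5000                                        # generated by a tiny program
    noise = "".join(random.choice("01") for _ in range(10000))   # no short generator expected

    print("regular:", description_size(regular), "bytes")   # small: the repetition is easy to describe
    print("random :", description_size(noise), "bytes")     # much larger: no regularity to exploit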

Applying this general definition to digital systems, functional information is defined as a symbolic structure acting through its formal, context-dependent meaning.

Functional information develops in a three-stage process inside the digital domain. The structured state of a non-finite automaton (introduced in [Stefan '83]) is the Cartesian product that defines the state of a big and simple automaton.
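
The sketch below is a hypothetical illustration (not taken from [Stefan '83]) of such a structured state: a shift register whose global state is an N-tuple of one-bit states, so the state space has 2**N elements while the transition rule remains one short description replicated over all components, i.e. a big and simple automaton.

    # Hypothetical example of a "big and simple" automaton: the structured state
    # is the Cartesian product of N one-bit states, so the state space is huge
    # (2**N), yet the next-state function is one small rule applied uniformly.
    from typing import Tuple

    N = 64                       # number of one-bit component automata
    State = Tuple[int, ...]      # structured state: an N-tuple of bits

    def next_state(state: State, load: int) -> State:
        # Shift-register transition: the new bit enters on the left and every
        # cell copies its left neighbour.
        return (load,) + state[:-1]

    s: State = (0,) * N
    for bit in (1, 0, 1, 1):
        s = next_state(s, bit)
    print(s[:8])                 # (1, 1, 0, 1, 0, 0, 0, 0)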

Thus an information system processes data (which may or may not be information) through information (the system of program-like symbolic structures) interpreted or executed by the hardware resources of a machine.
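
A minimal sketch of this separation, assuming a toy accumulator machine invented purely for illustration (the machine and its opcode names are not from the cited works): the fixed interpreter loop stands in for the simple hardware resources, while the program, a symbolic structure, carries the information that determines how the data are processed.

    # Toy sketch: a fixed, simple interpreter (standing in for hardware resources)
    # processes data under the control of information, i.e. a program-like
    # symbolic structure.  The machine and its opcodes are illustrative only.
    def run(program, data):
        # Execute a list of (opcode, operand) pairs on a one-accumulator machine.
        acc = 0
        for op, arg in program:
            if op == "LOAD":
                acc = data[arg]        # read a data item into the accumulator
            elif op == "ADD":
                acc += data[arg]
            elif op == "STORE":
                data[arg] = acc        # write the accumulator back into the data
        return data

    # The information: a symbolic structure whose meaning is the action it controls.
    sum_first_two = [("LOAD", 0), ("ADD", 1), ("STORE", 2)]
    print(run(sum_first_two, [3, 4, 0]))   # [3, 4, 7]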

Information has a context-dependent definition, because it must be defined according to the action it controls.

Functional information allows us to segregate the simple (circuits) from the complex (programs) in a computing system (see also 0-State Universal Turing Machine).

Segregating the simple from the complex in a digital system minimizes its actual complexity.


References

[Stefan '83] Gh. Stefan, I. Draghici, T. Muresan, E. Barbu: Circuite integrate digitale (Digital Integrated Circuits), Ed. Didactica si Pedagogica, Bucuresti, 1983.
[Stefan '91] Gh. Stefan: Functie si structura in sistemele digitale (Function and Structure in Digital Systems), Ed. Academiei Romane, Bucuresti, 1991.
[Stefan '97] Gh. Stefan: Circuit Complexity, Recursion, Grammars and Information. Multiple Morphisms, Ed. Transilvania University of Brasov, 1997.
[Stefan work in endless progress] Gh. Stefan: Loops & Complexity in Digital Systems. Lecture Notes on Digital Design in the Giga-Gate/Chip Era.