A human-readable computer language enabling advanced, transparent, interactive AI based on causal inference.
Proteus is a language for expressing the meaning of information. Any information: Information carried by light into a camera, information flowing in a brain. It can even represent the meanings of human languages.
It turns out that objects that physically interact transfer enough information that Proteus can be used to model things in the world: things like atoms, cells, bikes, and stars. Proteus can also model extremely complex things like political systems and biological systems.
But Proteus is more than just a modeling tool. With a comprehensive collection of models, Proteus can be used to store complex knowledge, filter propaganda, validate theories, and protect truth, no matter how complex or multifaceted that truth may be.
Instead of the pattern-based reasoning of current AI, Proteus enables "structure-based reasoning".
Currently, from AI that processes images to AI for natural language, pattern matching is the key that makes it work. This is true for early AIs such as ELIZA, as well as for GOFAI, expert systems, and CYC.
The problem is that the number of patterns needed to make a more useful AI can rise exponentially with the amount of knowledge the AI must reason over, and an enormous number of patterns is required.
This problem is partly solved by deep-learning neural networks, such as the Transformers used in Google's BERT and the GPT models. These automate the generation of patterns and store them in large matrices. But the number of patterns required is still a problem: as we want more and more knowledge stored, and more and more queries correctly answered by our AIs, it takes more and more power to train them.
There is an easier way. A relatively small amount of structural information can be used to generate an unbounded number of patterns, so not every combination of patterns must be trained into the network in order to get a complete set.
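To make that concrete, here is a minimal sketch in plain Python (not Proteus syntax; the slots and fillers are invented for illustration) of how a small structural description expands into a combinatorially large set of surface patterns, none of which ever needs to be stored or trained individually:

```python
from itertools import product

# A tiny, hypothetical structural description: each slot lists the
# pieces that may fill it. Three short rule lists stand in for the
# "relatively small amount of structural information".
structure = {
    "agent":  ["the cell", "the rider", "the star"],
    "action": ["absorbs", "pushes", "orbits"],
    "object": ["a photon", "a pedal", "a planet"],
}

def patterns(structure):
    """Expand the structure into every surface pattern it licenses."""
    slots = list(structure.values())
    for combo in product(*slots):
        yield " ".join(combo)

# Three entries per slot already license 3 * 3 * 3 = 27 patterns; adding
# one entry to each slot grows the pattern count multiplicatively, while
# the structural description itself grows only linearly.
if __name__ == "__main__":
    all_patterns = list(patterns(structure))
    print(len(all_patterns))   # 27
    print(all_patterns[0])     # "the cell absorbs a photon"
```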
One might think that the way to proceed is to use structural information to generate a vast number of patterns and then run those on a GPU. But there is a different algorithm that goes directly from the structural information to a result. This method is so efficient that in many cases (natural-language processing, for example), a CPU can do as well as a GPU running pattern matches.
Very briefly, the algorithm for processing structural information such as that represented by the Proteus language is similar to college algebra (as opposed to abstract algebra), but it operates on "pieces of information", or, as Keith Devlin calls them, "infons", rather than on numbers. Of course, numbers are one form of information, so the algorithm can also handle mathematical queries.
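As a rough illustration of that analogy, the sketch below (again plain Python, with a hypothetical infon representation; it is not the actual Proteus algorithm) treats an infon as a relation plus its arguments and "solves" a query containing an unknown by finding the binding that makes it match a known infon, much as one solves for x in college algebra:

```python
# Hypothetical infon representation: a relation name followed by its
# arguments. A query is an infon containing an unknown ("?x"), and
# solving it means finding a consistent binding for that unknown.

known_infons = [
    ("orbits", "earth", "sun"),
    ("orbits", "moon", "earth"),
    ("mass_kg", "earth", 5.97e24),
]

def is_var(term):
    """Variables are written as strings starting with '?'."""
    return isinstance(term, str) and term.startswith("?")

def solve(query, facts):
    """Return every binding of the query's variables that matches a fact."""
    solutions = []
    for fact in facts:
        if len(fact) != len(query):
            continue
        binding = {}
        for q, f in zip(query, fact):
            if is_var(q):
                if binding.get(q, f) != f:  # a variable must bind consistently
                    break
                binding[q] = f
            elif q != f:
                break
        else:
            solutions.append(binding)
    return solutions

if __name__ == "__main__":
    # Query: "What orbits the sun?" The unknown ?x plays the role of x.
    print(solve(("orbits", "?x", "sun"), known_infons))     # [{'?x': 'earth'}]
    # Numbers are just another kind of information.
    print(solve(("mass_kg", "earth", "?m"), known_infons))  # [{'?m': 5.97e+24}]
```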