I just discovered the work of Fontana and Buss on Algorithmic Chemistry. The ideas appear to be very close to what I am trying to do with the chemical concrete machine (independently, because of my relative ignorance of past research in the field, but I’m improving my knowledge day by day).
I shall jump directly to the differences. At first sight many of the ideas look the same, chiefly that lambda terms are molecules and that chemical reactions are related to reduction in lambda calculus.
The main differences are:
- for the chemical concrete machine, molecules are not abstract linear expressions, but particular objects, expressed as certain graphs,
- I don’t use lambda calculus, but a more powerful variant, graphic lambda calculus, which works directly with such “molecules” (i.e. with graphs), and is free from any linear writing convention or variable names,
- chemical reactions between such molecules do not correspond to lambda abstraction or to application. Instead, they correspond to reduction moves, chiefly to the graphic beta move,
- molecules are not functions, because of the lack of extensionality (eta reduction). This is good, because it makes the calculus much more concrete than a calculus with extensionality (an argument which needs more details; that’s for later). For now I shall just mention my belief that extensionality and types are unnecessary relics of thought (controversial, I know), and that one of the main stumbling blocks on the path of reconciling the syntactic with the semantic is the insistence on keeping types and extensionality (the short argument against them is that there is nothing like types or extensionality in any biological brain, and they are probably just another manifestation of the cartesian disease).
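To make the distinction in the last two points concrete, here is a minimal sketch of my own (in ordinary term syntax, not graphs, and with a naive capture-unsafe substitution): beta reduction is a local rewrite that does the computational work, while eta reduction (extensionality) is a separate, optional rule that can simply be dropped, as graphic lambda calculus does.

```python
from dataclasses import dataclass

# A lambda term is a variable, an abstraction, or an application.
@dataclass(frozen=True)
class Var:
    name: str

@dataclass(frozen=True)
class Lam:
    param: str
    body: object

@dataclass(frozen=True)
class App:
    func: object
    arg: object

def occurs_free(term, name):
    """Does `name` occur free in `term`?"""
    if isinstance(term, Var):
        return term.name == name
    if isinstance(term, Lam):
        return term.param != name and occurs_free(term.body, name)
    return occurs_free(term.func, name) or occurs_free(term.arg, name)

def substitute(term, name, value):
    """Replace free occurrences of `name` with `value`.
    Naive: ignores variable capture, which is enough for this sketch
    (and is exactly the bookkeeping that graphs make unnecessary)."""
    if isinstance(term, Var):
        return value if term.name == name else term
    if isinstance(term, Lam):
        if term.param == name:  # shadowed, stop here
            return term
        return Lam(term.param, substitute(term.body, name, value))
    return App(substitute(term.func, name, value),
               substitute(term.arg, name, value))

def beta_step(term):
    """One beta step: (lambda x. body) arg -> body[x := arg]."""
    if isinstance(term, App) and isinstance(term.func, Lam):
        return substitute(term.func.body, term.func.param, term.arg)
    return term

def eta_step(term):
    """One eta step: lambda x. (f x) -> f, when x is not free in f.
    This is the extensionality rule the text argues against keeping."""
    if (isinstance(term, Lam) and isinstance(term.body, App)
            and isinstance(term.body.arg, Var)
            and term.body.arg.name == term.param
            and not occurs_free(term.body.func, term.param)):
        return term.body.func
    return term
```

With only `beta_step` the calculus still computes, e.g. `beta_step(App(Lam("x", Var("x")), Var("y")))` gives `Var("y")`; without `eta_step`, the term `lambda x. (f x)` and the bare `f` remain distinct objects, which is the sense in which molecules are not functions.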
But otherwise my proposal of the chemical concrete machine is clearly in the field of algorithmic chemistry!