The WWW is an Internet system, based on the following ingredients:
- web pages (written in HTML)
- a (web) browser
- a web server (because of the choice of client-server architecture)
Tim Berners-Lee wrote the first versions of those programs. Then the WWW appeared and exploded.
The force behind this explosion comes from the separation of the system into independent parts. Anybody can write a web page; anybody who has a browser can navigate the web; anybody who wants to run a web server needs basically nothing more than the server program (and the previously existing Internet infrastructure).
In principle it works because of the lack of central control over its structure and functioning.
It works because of the separation of form from content, among other clever separations.
It is so successful, and so much under our noses, that apparently very few people think about applying the ideas of the WWW to other parts of the culture.
Separation of form from content means that you have to acknowledge that meaning is not what rules the world. Semantics has only a local, very fragile existence; you can't go too far if you build on semantics.
Leave the meaning to the user: let the web client build his own meaning from the web pages he can access via his browser. He can access and get the information precisely because the meaning has been separated from the form.
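As a concrete illustration of this separation, here is a minimal sketch (the file names and the content are invented for the example): the HTML file carries only the content, while the form is delegated to a separate stylesheet, so the same content survives any change of presentation on the user's side.

```html
<!-- content.html: content only, no presentation -->
<!DOCTYPE html>
<html>
<head>
  <title>Form vs content</title>
  <!-- the form lives elsewhere; swap the stylesheet and the content is untouched -->
  <link rel="stylesheet" href="form.css">
</head>
<body>
  <h1>Separation of form from content</h1>
  <p>The same page renders differently under different stylesheets.</p>
</body>
</html>
```

The hypothetical `form.css` would hold only presentation rules, for instance `h1 { color: maroon; }`; replacing it restyles the page without touching `content.html`.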
How about another Net service, like the WWW, but which does something different, which goes to the roots of computation?
It would need:
- artificial molecules instead of web pages; these are files written in a fictional language called “Mol”
- a GUI for the chemlambda artificial chemistry, instead of a web browser; one should think of it as a Mol compiler & GUI,
- a chemical server which makes chemical soups, or broths, leaving the reduction algorithm to the users.
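To make the idea of an artificial molecule as a file slightly more concrete, here is a hypothetical sketch of what a Mol file could look like, in the spirit of the mol notation used in the chemlambda demos: each line names a node type followed by its port names, and two ports sharing a name are connected. The exact node names and port orderings below are my assumptions, not a specification. The sketch is meant to encode the identity λx.x applied to a free input:

```
L 1 1 2
A 2 3 4
FRIN 3
FROUT 4
```

Reading it line by line: the `L` (lambda) node has its body port and variable port glued together (both named `1`), which is λx.x, with the term root on port `2`; the `A` (application) node applies that term to port `3`; `FRIN` and `FROUT` mark the free input and the free output. A reduction algorithm run on this soup would, via a beta-like graph rewrite, end up connecting the free input to the free output.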
This Mol language is an idea which holds some potential, but which needs a lot of pondering, because the "language" idea has bad effects on computation.