From time to time, people complain about the state of the Internet or of the World Wide Web. Sometimes they are parts of governments charged with mitigating crime, sometimes privacy advocates, sometimes local governments or retailers lamenting lost tax revenue, sometimes social crusaders charging it with fostering personal isolation, bullying, vice, and other communal maladies.
Certain people have made pointing out the Web's ills their principal theme. Jaron Lanier has long done so, and has written several books on the matter. Cathy O'Neil is a more recent critic, not only of the Web but of businesses which employ it and other data-collecting mechanisms to mine, and abuse, intrinsically imperfect pictures of their users for profit.
Others have underscored the effects of what is predominantly sampling bias. This should be no surprise. What is a surprise is that the companies involved don't see and respond to it as the statistical problem it is. How representative a sample actually is of a population of interest is perhaps the key question in any statistical study. That these companies settle for samples of convenience rather than validated ones shows they are practicing very weak data methods, no matter how many people with doctorates are associated with these projects.
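The difference between a sample of convenience and a validated random sample is easy to demonstrate. The following toy sketch (the population, the age cutoff, and the channel are all invented for illustration) models a convenience sample as one drawn through a single channel that reaches only part of the population:

```python
import random

random.seed(42)

# Hypothetical population: 100,000 ages drawn from a normal distribution
# centered at 45. "True" quantities here exist only for this illustration.
population = [random.gauss(45, 15) for _ in range(100_000)]
true_mean = sum(population) / len(population)

# A "sample of convenience": only people reachable through one channel,
# modeled here as those under 35 (say, users of a particular app).
convenience = [x for x in population if x < 35][:1000]
convenience_mean = sum(convenience) / len(convenience)

# A validated simple random sample of the same size.
random_sample = random.sample(population, 1000)
random_mean = sum(random_sample) / len(random_sample)

print(f"true mean:        {true_mean:.1f}")
print(f"convenience mean: {convenience_mean:.1f}")  # biased well below 45
print(f"random mean:      {random_mean:.1f}")       # close to the truth
```

Both samples have the same size, and no amount of added volume repairs the convenience sample; only changing how it is drawn does.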
There is also the criticism from Professor Lawrence Lessig who understood early the social and legal ramifications of how the Internet and Web are built, particularly in such incisive books as Code and Other Laws of Cyberspace, Code: Version 2.0, and Remix.
In Code: Version 2.0, Lessig continued and reemphasized the warnings issued in Code and Other Laws of Cyberspace: given the way the Internet and Web were technically structured and funded, the idea of a free market of ideas was slipping away. The network was becoming more regulable, more subject to influence by a few big groups, and more prone to the user-as-product problem with which Powazek, among others, has taken issue.
Perhaps Lessig has come closest to it, but what the audience of these critics should understand is that the shortcomings they articulate are inherent in the technical design of the Internet and Web as they are. At the experiential level, the Internet and Web are constructed on the Big Ball of Mud design anti-pattern. Accordingly, like any ball of wet mud subjected to sufficiently large outside forces, they deform into arbitrary shapes. Given their present size, however, such deformation has big social implications, whether China's aggressive censorship, foreign influence on United States elections, probably from Russia, or sales of private user data, witting or not, by large Internet presences.
The thing of it is, there were people who thought carefully and long about how such all-connecting networks should operate, and who devised specific and careful design principles for them. While there were several (see history), one of the least known, particularly today, is Theodor Holm Nelson. "Ted" Nelson conceived of a non-anonymous network of producers and consumers of content whereby, through a technical device he termed transclusion (coined in Literary Machines), readers would make micropayments to read or otherwise access content produced by others, with the bulk of these going as compensation to producers and some used for managing the network apparatus. This was termed Xanadu, and Nelson and colleagues made several attempts to realize it and several technically related and useful ideas.
This is a difficult problem. Such a structure, if it is not to be defeated or subjugated, needs these mechanisms built into its technical fabric, along with strong authentication (public-key cryptography?) to both prevent theft and identify the parties sending and accepting payments and content. The Internet and Web grew up, and continue to grow, through a combination of deliberate, careful crafting and haphazard, business-driven choices. Just study how the companies operating their innards are paid, and how it got that way. Imposing a rigorous design would have made growth expensive, slow, and difficult, demanding a critical mass of producers and readers before there was anything to read. Accordingly, Xanadu not only didn't happen, it couldn't happen.
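The core of such a scheme, identified parties, signed payments, and a network fee, can be sketched in a few lines. This is a toy illustration only: every name here (`Ledger`, `pay_for_transclusion`, the fee amounts) is invented, not from any actual Xanadu design, and HMAC with per-user secret keys stands in for real public-key signatures because Python's standard library has no asymmetric-crypto primitives:

```python
import hmac
import hashlib
from dataclasses import dataclass, field

@dataclass
class User:
    name: str
    secret: bytes          # stand-in for a private signing key
    balance: int = 100     # balance in hypothetical micro-cents

    def sign(self, message: bytes) -> str:
        return hmac.new(self.secret, message, hashlib.sha256).hexdigest()

@dataclass
class Ledger:
    network_fee: int = 1   # fixed cut kept for running the network
    network_pool: int = 0
    log: list = field(default_factory=list)

    def pay_for_transclusion(self, reader: User, author: User,
                             doc_id: str, price: int) -> bool:
        msg = f"{reader.name}:{author.name}:{doc_id}:{price}".encode()
        signature = reader.sign(msg)
        # Verify the signature before moving money: both parties are
        # identified, so payments cannot be anonymous or repudiated.
        if not hmac.compare_digest(signature, reader.sign(msg)):
            return False
        if reader.balance < price:
            return False
        reader.balance -= price
        author.balance += price - self.network_fee
        self.network_pool += self.network_fee
        self.log.append((reader.name, author.name, doc_id, price, signature))
        return True

alice = User("alice", b"alice-key")
ted = User("ted", b"ted-key")
ledger = Ledger()
ok = ledger.pay_for_transclusion(alice, ted, "doc-42", price=5)
print(ok, alice.balance, ted.balance, ledger.network_pool)  # True 95 104 1
```

The point of the sketch is the part that is hard to retrofit: identity and payment live in the protocol itself, not in an advertising layer bolted on afterward.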
However, look where the Internet and Web are now. Spam, malicious attacks, election interference, theft of credit card information, identity theft, viruses, cryptocurrency-driven waste of electrical energy, tracking of individuals by their phones, and targeted advertising are some of the places we've gone.
What’s intriguing to me is the possibility that Ted Nelson was right all along, and the kind of careful design he had in mind for Xanadu may one day become necessary if the Internet and Web are to survive, and not just splinter into a hundred subsidiary networks each controlled by the biggest Local Thug, whether that is a government or telecommunications giant. Nelson himself believes we can still learn many things from Xanadu.
So, in many senses, the Internet and Web did not have to be the way they are. There were other, better ideas. In fact, considering that, and considering what we’re doing to Earth’s climate through our unmitigated worship of Carbon and growth, if humanity ever needs an epitaph, I think it ought to be: