The Massachusetts Technology Leadership Council announces it is awarding Professor Sir Tim Berners-Lee the 2017 Commonwealth Award.
This man has so many awards and honours that there is an entire Wikipedia page devoted to the list. As if a name prefixed with “Professor” and “Sir” weren’t enough, his name is suffixed with the letters “OM KBE FRS FREng FRSA FBCS”. His accolades include just about everything except a Nobel Prize.
He’s a Fellow of the Royal Society (along with Newton and Darwin, etc), he won the A.M. Turing Award, which is basically the Nobel Prize for computing, and he was appointed to the Order of Merit, which is personally chosen by The Queen (unlike the knighthood, which involves input from Downing Street) and is limited to 24 living members.
He has won all of those awards, and 46 or so others, because he invented the World Wide Web, which involved inventing a universal system for identifying the location of web pages (Uniform Resource Locator, URL), a language for writing web pages (Hypertext Markup Language, HTML), and a protocol for serving up web pages on request (Hypertext Transfer Protocol, HTTP). All of which, you’ll probably have spotted, are still in use today; although he later apologised for the “//” with the eternal reasoning “it seemed like a good idea at the time.”
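Those three inventions still fit together the same way today. Here is a minimal sketch in Python: the URL below is the real address of the first web page, while the HTML snippet and the raw HTTP request are toy illustrations of the formats, not anything Berners-Lee wrote.

```python
from urllib.parse import urlparse

# 1. A URL names where a resource lives: scheme://host/path
url = urlparse("http://info.cern.ch/hypertext/WWW/TheProject.html")
print(url.scheme)   # "http"
print(url.netloc)   # "info.cern.ch"
print(url.path)     # "/hypertext/WWW/TheProject.html"

# 2. HTML marks up the document itself -- a toy page with one hyperlink.
page = "<html><body><a href='TheProject.html'>The project</a></body></html>"

# 3. HTTP is the plain-text request a browser sends to fetch that page.
request = (
    f"GET {url.path} HTTP/1.0\r\n"
    f"Host: {url.netloc}\r\n"
    "\r\n"
)
print(request.splitlines()[0])  # the request line the server sees
```

The division of labour is the point: the URL says where, HTML says what, and HTTP says how to get it from here to there.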
Berners-Lee graduated from Oxford with a degree in physics and, after a brief spell at a telecommunications company, went to CERN (Conseil Européen pour la Recherche Nucléaire) in Switzerland as a contractor. He invented all of that stuff we still use, including the slashes, while he was there – the internet was already in regular use, but strictly speaking “the internet” refers only to the underlying network that lets one computer communicate with another. The World Wide Web is the layer of servers, web pages and links built on top of it; it’s what we’re actually talking about when we say “the internet” in common parlance.
The problem he came across was that CERN hosted well over 10,000 researchers and scientists from around Europe, all trying to collaborate on different projects. People had different systems, hardware, file types and sizes. It was a nightmare; Berners-Lee later said his creation was “really an act of desperation”.
“He’s a splendid and very intelligent man,” said John Poole, who employed Berners-Lee at his company, Image Computer Systems, in the late 1970s, in a profile of Berners-Lee in The Guardian. He is reportedly quiet and determined, and his enthusiasm is delivered in a rapid-fire conversational style. Marc Andreessen, who worked with Berners-Lee before creating the Netscape browser, also told The Guardian: “Only smart people could use the internet, was the theory, so we needed to keep it hard to use. We fundamentally disagreed with that: we thought it should be easy to use.”
Berners-Lee created an initial hypertext program in the early 80s which could be used to allow for easier collaboration. When he returned to CERN in 1984 he noticed the system wasn’t up to scratch: most of the effort of administering it went into simply keeping it up to date. Why, he thought, can’t we have a system where everyone can access and edit the same information?
The key change by 1984 was that CERN had introduced a series of updates to its systems which made it the biggest internet node in Europe. This set the stage, and Berners-Lee worked on his new proposal, which he submitted in 1989. It didn’t receive much attention, but his boss urged him on. By 1990 he had created the first web server (http://info.cern.ch/) and the first web page (http://info.cern.ch/hypertext/WWW/TheProject.html) – both of which have changed since, but retain their HTML-only charm.
In January 1991 the whole thing was switched on. By January 1993 there were fifty web servers across the world and by October 1993 there were five hundred.
One of the other awards Berners-Lee supposedly received this year was the Obedience Award, “recognizing his work to help wealthy corporations add DRM (Digital Restrictions Management) to official Web standards.” One suspects his endless fight with corporations, politicians and others involves a distasteful level of realpolitik, and thus some compromises. It would probably be unfair to read it any other way, given that the general trend of his life has run against such stances.
By all other accounts Berners-Lee wasn’t just the mind who saw the technical possibilities, he was the mind who saw the social possibilities. “The web is more a social creation than a technical one. I designed it for a social effect — to help people work together — and not as a technical toy,” he said in Weaving the Web.
Many consider his real genius to lie in seeing this potential. Instead of setting up his own browser company and potentially starting a war between protocols he chose not to patent any part of what he had created – he released it all free, without royalties or restrictions.
When he was listed in TIME’s 100 most influential people of the 20th century they wrote, “The World Wide Web is Berners-Lee’s alone. He designed it. He loosed it on the world. And he more than anyone else has fought to keep it open, nonproprietary and free.”
Off the back of this, he’s deeply concerned about the way the internet is evolving and is a key proponent of net neutrality. Only a few months ago he published what he thinks are the biggest threats to his original idea. Data privacy is one; the others are false information and political advertising, both of which spring from the hold a handful of social media companies have on the market.
You can see his point when you look at how democratised Wikipedia is versus how monetised social media is… which would you trust more as a source of information?
He’s still deeply involved in the whole project today, running the World Wide Web Consortium (W3C). “He can be a bit of an autocrat – if Tim doesn’t like something, it doesn’t get in – but he’s passionate about openness and freedom,” Professor Wendy Hall, head of the school of electronics and computer science at Southampton University, of which Sir Tim is chairman, said in The Guardian’s profile.
And the name he gave it, which we brush over because it’s so common, includes the phrase World Wide. His vision for the internet was summed up by his appearance during the 2012 Olympic opening ceremony. Seated at one of the original computers he used to build the World Wide Web, he tweeted live: “This is for everyone.”
Even right at the beginning, when Berners-Lee wrote his original program in the early 1980s which formed the basis of his final invention – and despite the fact said invention was initially to be used by a bunch of nuclear physicists – he called it ENQUIRE, a name he took from the 1856 book Enquire Within Upon Everything. The book contains the following introduction from its editor:
Whether You Wish to Model a Flower in Wax;
to Study the Rules of Etiquette;
to Serve a Relish for Breakfast or Supper;
to Plan a Dinner for a Large Party or a Small One;
to Cure a Headache;
to Make a Will;
to Get Married;
to Bury a Relative;
Whatever You May Wish to Do, Make, or to Enjoy,
Provided Your Desire has Relation to the Necessities of Domestic Life,
I Hope You will not Fail to ‘Enquire Within.’