The evolution of the web over the past decade mirrors the development of the American economy. All the key indicators move up and to the right, a steady stream of fundamental breakthroughs provides a sense of "progress", but in reality the usability of the technology, and its impact on people, stagnates or even regresses.
This crisis is affecting platforms, creators and consumers.
I will try to analyze and diagnose the situation a little. If you just want my plain, non-expert pitch for restarting the web, you can skip this part. The idea: we could adopt a new lightweight markup format (Markdown) to replace HTML and CSS, split the web into web documents and web applications, and win back the speed, accessibility, and fun of the web.
This post uses a pedantic definition of "web". I have already written several times about attempts to reinvent the "Internet". Projects such as dat, IPFS, and arweave were conceived to reinvent the Internet, or at least its transport and data layers. The web is what sits on top of those layers: HTML, CSS, URLs, JavaScript, browsers.
The crisis of platforms
Last week there was an important change among the platforms: Mozilla laid off 250 employees and said this would affect Firefox development. Firefox was not even the second most popular browser - that's Safari, mainly thanks to its "captive" audience of iPhone and iPad owners. But Firefox was the most popular browser that people actually chose.
Chart from statcounter
The real winner is not Chrome itself, but Chrome's engine: a single KHTML lineage that became WebKit (Safari) and was then forked into Blink (Chrome, Microsoft Edge, Opera, and others).
In practice, this is what a textbook "monoculture" looks like. On one hand it is a win for collaboration: nobody "wastes time" on competing implementations, and web developers see the same features and the same bugs in every browser. But on a deeper level, it threatens the way the web evolves.
Specifications and implementations
The web has evolved through a combination of specifications and implementations. Organizations such as the WHATWG, the W3C, and the IETF gave independent developers, corporations, and academics a place to collaborate and discuss what new web capabilities could look like, and multiple independent browser implementations put those ideas to the test.
This was an interesting part of the structure: it meant we were all building the web together, and that one of the goals was for the web to accept contributions from many contributors. It annoyed us when empty cells appeared on caniuse, but the general idea was that even if different browsers were better at different things, sooner or later they would catch up with each other. Chrome was not always the first browser to ship new features and optimizations.
Working in collaboration is slower than working alone, but it brings benefits that we have now lost. Chrome has evolved extremely quickly, adding new specifications and ideas at an astonishing rate, and has become one of the most difficult pieces of software to recreate.
It seems to me that Mike Healy put it best:
Don't you think the web is practically "monopolized" in terms of complexity, if only one or two organizations are capable of building rendering engines for it?
Today it is nearly impossible to build a new browser from scratch, and even if you did, the constant race to implement new standards would require a whole team of specialists just to keep up. You can read more about this in Drew DeVault's article Web browsers need to stop; I also recommend his other writing.
The crisis for creators
It has become much more difficult to design for the web.
For 25 years the web has only grown; it has had very few chances to shrink, and today it is shaped by an extremely short-sighted culture of economic and career growth with no long-term plan. There are many ways to build something, and some of the most popular ways to build applications on the web are, in my opinion, wildly overpowered for the task.
The best way to get into web development in 2020 is to pick a niche like Vue.js or React and hope to have a CSS expert on the team.
For those who just want to make a web page, rather than break into the industry, there is a bewildering array of technologies to choose from, and the simplest ones, which are probably also the best, are stigmatized. People are more likely to build a site with React and GraphQL than to type HTML into Notepad.
The crisis for consumers
We would hope that all this innovation serves the user, but that is often not the case. Today's websites appear to be the largest, slowest, and buggiest in the history of the web. Our computers are barely getting faster and our internet connections are stagnating (don't even start about 5G), while page sizes keep growing faster than everything else.
Because of all this, I no longer expect pages to be fast even with uBlock installed in Firefox and a good local fiber provider.
But I don't want to put all the blame on web developers. I can share a rather funny story from a former job. We collected data on how users interacted with our sites, to answer simple questions like "do people click the button to upload files, or do they use drag & drop?" For that we used Segment, a tool that lets you add data collection pipelines with a simple script. The problem was that Segment also had a huge page with hundreds of toggles for data providers and ad-tech companies. And, of course, the business folks at the company started clicking all of those buttons.
That, you see, is the problem with advertising and data tracking: it is all technically possible, and who is going to turn it down? (In our case, I opted out and added a CSP that blocks new advertisers at the page level.)
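For the curious, the policy looked roughly like the sketch below. This is a minimal illustration, and the Segment endpoints shown are assumptions rather than the exact policy I deployed; the point is that anything not whitelisted in script-src or connect-src simply fails to load, so a newly toggled ad vendor cannot inject its script:

```
Content-Security-Policy: default-src 'self'; script-src 'self' https://cdn.segment.com; connect-src 'self' https://api.segment.io
```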
A return to simplicity
It is impossible to arrive at a simple system by adding simplicity to a complex system. - Richard O'Keefe
Where do we go from here? Smart people have suggested that we should version the web.
How do we make the web interesting, collaborative, and good?
At first, I thought of it as two separate webs:
Web documents
There are "web documents": blogs, news, Wikipedia, Twitter, Facebook. As far as I understand, in fact, this is the web as it was originally seen (I was two years old then). CSS, which we now see as a tool that designers can use to create brand uniqueness and add pixel-accurate detail, was originally seen as a way to make documents readable without formatting, allowing readers of those documents to customize their appearance. In fact, this attribute was saved as custom stylesheets in Chrome for a while and still works in Firefox . However, on the web today, this will be a daunting task, as he has effectively abandoned the idea of semantic HTML .
Web "applications"
And then there are "web applications". It started out as server-side applications built on top of something like Django and Ruby on Rails . Before them, there were many technologies that will now live forever in corporations, such as Java servlets .
Backbone.js demonstrated that many of these applications could be ported to the browser, after which React and many of its SPA competitors created a new world order for the web - client applications with a high degree of interactivity and complexity.
The two webs at war
I argue that it is this dual nature that creates the magic of the web. But it is also a destructive force.
The magic is that a simple blog can be a creative medium, a great interactive way to express yourself. My site isn't like that, but I'm just saying it's possible.
The problem is that "web documents" often suffer from application characteristics — it’s JavaScript and animation, the complexity of which make the average news website a disaster. When document websites adopt application patterns, they often accidentally sacrifice accessibility, speed, and machine readability.
And "web applications" suffer from document characteristics - interactive applications go to great lengths to avoid most of the fundamental characteristics of HTML and CSS, and only use them as raw materials - completely avoiding writing HTML directly, avoiding writing CSS , avoiding standard animation functions , replacing paginate to something that looks similar but works completely differently . Web applications use JSX over HTML and prefer to deal with it in the browser itself, or use Svelte over JavaScript and also prefer it.
When I read blog posts by "traditional web developers" who are upset that HTML and CSS are no longer enough and everything has become so complex, I think it is mainly because, in many places, the application stack has replaced the document stack for building websites. Where we once would have used Jekyll or server-side rendering, we now reach for React or Vue.js. There are advantages to that approach, but for many sites with minimal interactivity it means abandoning decades of accumulated knowledge in exchange for speed benefits that may not even matter.
The appeal of social media
Part of the appeal of social media is that it lets us create documents without thinking about web technology at all, while guaranteeing a level of speed, accessibility, and polish that would take a lot of our own time to achieve otherwise. We don't have to worry about whether a Facebook post will load quickly on a friend's phone, or whether a photo posted to Instagram will display correctly: all of that is taken care of for us.
To some extent, none of this actually requires social media: standards like RSS and services like Instapaper show that nice formatting and distribution can be handled at the platform level, built on top of plain, existing websites.
But there is no clear division
, - : ! , , -, , , («» JavaScript -), , . , , . : , , , .
Web documents 2.0
Of course, it would be great to arrive at a unified theory of a new web, one with enough application characteristics and enough document characteristics to support all of the hybrid interactive documents we work with today. But the path to a split web is clearer, and it is what I thought about first, so let's talk about that a little.
- Rule #1 - don't build a subset. If the replacement for the web offers only the features that Firefox 10 had a decade ago, nobody will want that version.
- №2 — . , , , - , .
- №3 — . , : , , , , , , .
So, let's say we are creating a new web document.
First, we need a minimal, standardized markup language for exchanging documents. We would probably want to start with a lightweight markup language designed for generating HTML. A strict flavor of Markdown called CommonMark seems like a pretty good choice. It is the language I have written all my posts in, it is the most popular of its family, and there are plenty of great Markdown parsers and a large ecosystem of tools around it.
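As a rough illustration of how small that core could be, here is a sketch using the pulldown-cmark crate (one of the Rust Markdown implementations); the input string is just an example:

```rust
// Sketch: turning CommonMark into HTML with the pulldown-cmark crate.
use pulldown_cmark::{html, Options, Parser};

fn main() {
    let markdown = "# A lightweight document\n\nJust *text*, links and images.";

    // Plain CommonMark, no extensions enabled.
    let parser = Parser::new_ext(markdown, Options::empty());

    let mut output = String::new();
    html::push_html(&mut output, parser);
    println!("{}", output);
}
```

A document format whose entire reference pipeline fits in a dozen lines is a very different beast from HTML plus CSS plus JavaScript.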
Next, we need a browser. For a long time Mozilla had been working on a brand-new browser engine, Servo... whose team was laid off last week, which is sad. That project has produced independent Rust crates for things like font rendering, and there is also a high-quality Markdown implementation in Rust and an ever-growing set of great application frameworks. Could we build a pure Markdown browser that uses this pipeline directly? Maybe?
I believe such a combination would let us recover much of the lost speed. We could get a page onto the screen in a fraction of the time the modern web takes. Memory consumption could be tiny. The system would be incredibly accessible by default. We could ship good-looking default stylesheets and share alternative user stylesheets. And because the surface area is so much smaller, the system could be ported to all kinds of devices.
What would the editing tools look like (probably the most important question)? They could be much simpler.
What would aggregation look like? If web pages were more like documents than applications, we wouldn't need RSS: a website could expose an index pointing at its documents, and a "reader" could aggregate the pages themselves by default.
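To make that concrete, here is a purely hypothetical sketch of such an index and how a reader might consume it; the field names and JSON shape are my own invention, not a proposed standard:

```rust
// Hypothetical per-site document index that a "reader" could fetch
// instead of an RSS feed. The format is invented for illustration.
// deps: serde = { version = "1", features = ["derive"] }, serde_json = "1"
use serde::Deserialize;

#[derive(Deserialize, Debug)]
struct IndexEntry {
    title: String,
    url: String, // would point at a plain Markdown document
    date: String,
}

#[derive(Deserialize, Debug)]
struct SiteIndex {
    site: String,
    documents: Vec<IndexEntry>,
}

fn main() -> Result<(), serde_json::Error> {
    let raw = r#"{
        "site": "example.com",
        "documents": [
            { "title": "Hello", "url": "https://example.com/hello.md", "date": "2020-08-22" }
        ]
    }"#;

    let index: SiteIndex = serde_json::from_str(raw)?;
    for doc in &index.documents {
        println!("{} - {} ({})", doc.date, doc.title, doc.url);
    }
    Ok(())
}
```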
We could bridge the two webs using something like a well-known file, the way the dat protocol does, or we could use the Accept header to build a browser that understands HTML but prefers lightweight pages.
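Content negotiation via the Accept header already exists in HTTP, so the second idea is easy to sketch; the URL and the text/markdown preference below are illustrative assumptions, not something servers honor today:

```rust
// Sketch: asking a server for a lightweight rendition via the Accept header,
// using the reqwest crate (with its "blocking" feature enabled).
use reqwest::blocking::Client;
use reqwest::header::{ACCEPT, CONTENT_TYPE};

fn main() -> Result<(), Box<dyn std::error::Error>> {
    let response = Client::new()
        .get("https://example.com/article")
        // Prefer Markdown, fall back to HTML.
        .header(ACCEPT, "text/markdown, text/html;q=0.5")
        .send()?;

    let content_type = response
        .headers()
        .get(CONTENT_TYPE)
        .and_then(|value| value.to_str().ok())
        .unwrap_or("")
        .to_owned();
    let body = response.text()?;

    if content_type.starts_with("text/markdown") {
        // Hand the body to a Markdown renderer.
        println!("got {} bytes of Markdown", body.len());
    } else {
        // Fall back to a full HTML engine.
        println!("got {} bytes of {}", body.len(), content_type);
    }
    Ok(())
}
```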
Web Applications 2.0
I have a feeling that whatever web problem I bring up, someone will automatically answer that WebAssembly can fix it. Can it?
I don't know. WebAssembly really is a great thing, but should web apps just render to a canvas, with each app shipping its own GUI toolkit? Do we really want web applications that differ in how they implement anti-aliasing? Fully containerized applications do exist, look at Qubes, but that isn't really something users should be aiming for. Anyone who has used Blender or Inkscape on a Mac has a rough idea of what it would feel like.
Or could WebAssembly become the new "core", while the UI is still rendered in HTML? Or... we could create a shared, linked UI library that WebAssembly applications use. It would work somewhat like SwiftUI, providing application-friendly primitives like constraints rather than document concepts like line-height and floats.
The problem with pinning down the concept of a web application is that the category keeps growing.
The worse the Mac App Store, the Windows Store, the iOS App Store, and the Play Store become, the more those monopolies demand, and the more it costs to be a Mac or Windows developer, the more applications move to the web. Certainly some apps are better on the web. But many end up there simply because it is the last remaining place where a product can be distributed or sold easily, cheaply, and freely.
Once upon a time, we installed applications, explicitly consenting to let them run on our computers and use our hardware. That era is coming to an end: web pages today have quite sophisticated ways of getting at everything from webcams, files, game controllers, and audio synthesis to cryptography, all things that were once the exclusive domain of .exe and .app files. Of course this brings new power, but it is a rather strange situation.
Who is working on this?
- Beaker Browser — a peer-to-peer browser built around the dat protocol.
- Project Gemini — a deliberately minimal alternative protocol to the web.
- taizen — a MediaWiki (Wikipedia) browser for the terminal, written in Rust.
Is it worth it?
There are many ways to look at this problem and many ways to solve it. I do believe it is a real problem (for everyone except Google). The idea of a web browser as something we can actually understand, and of web pages as something more people can create, seems wonderful to me.
The Markdown approach seems very realistic to me. I think the strongest argument against it is that it "sucks all the fun out of the web", and that is partly true. But the early web wasn't fun in our current sense either: you couldn't make art with it or use it for much beyond exchanging documents. And yet it was incredibly fun, because sharing information is fun, and it could be done in simple, universal ways. So the most important thing is to find the elements that unlock that kind of potential, if they exist at all. Or to find another plan that is "fun enough".
Social media tends to be more restrictive than web pages, but also more engaging, for many important reasons, the most important of which is that far more people can participate. What if the rest of the web had that kind of simplicity and immediacy, without being so centralized? What if we could start over?