My disappointment in software


The essence of software development
 
"We need to make 500 holes in the wall, so I designed an automatic drill." It uses elegant precision gears for continuous speed and torque control as needed.
 
- Excellent, it has the ideal weight. Load 500 of these drills into the cannon and fire them into the wall.

 
 
I've been programming for 15 years now. But lately, thinking about efficiency, simplicity and excellence while developing software seems to have gone out of fashion - to the point where I feel sad both for my own career and for the IT industry as a whole.
 
 
For example, modern cars operate at, say, 98% of what the current engine design physically allows. Modern buildings use a precisely calculated amount of material to fulfill their function and remain safe under the given conditions. All planes have converged on the optimal size, shape and load capacity, and basically look the same.
 
 
Only in software is it considered normal for a program to run at 1% or even 0.01% of the possible performance. No one seems to object. @tveastman: A Python program I run every day takes 1.5 seconds. I spent six hours rewriting it in Rust, and now it runs in 0.06 seconds. That speedup means my time will pay off in 41 years, 24 days :-)
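
For what it's worth, the arithmetic in that tweet checks out. A quick back-of-the-envelope sketch in Python, assuming one run per day:

```python
# Back-of-the-envelope check of the tweet's math, assuming one run per day.
time_invested = 6 * 60 * 60            # 6 hours spent rewriting, in seconds
saved_per_run = 1.5 - 0.06             # seconds saved on each daily run

days_to_break_even = time_invested / saved_per_run
print(days_to_break_even)              # 15000.0 days
print(days_to_break_even / 365.25)     # ~41 years
```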
 
You've probably heard the mantra: "Programmer time is more expensive than computer time." What it means in practice is that we waste computer time on an unprecedented scale. Would you buy a car that consumed 100 liters per 100 kilometers? How about 1000 liters? With computers, we do this all the time.
 
 

Everything is unbearably slow


 
Look around: our portable computers are thousands of times more powerful than the ones that put a man on the moon. Yet every other website can't scroll a page smoothly at 60 FPS on the latest top-of-the-line MacBook Pro. I can comfortably play games and watch 4K video, but not scroll web pages! How is that normal?
 
 
Google Inbox, a mail application running in Chrome, a browser from the same Google, takes 13 seconds to open a medium-sized email:
 
 
This is how long it takes Google, in real time. Not the shortest email, but still - it's just text and pictures! Go web stack, go! pic.twitter.com/CvqsFiIUuc
- Nikita (@nikitonsky) February 2018

 
It also animates empty white boxes instead of showing their contents, because that's the only way to animate anything on a web page with decent performance. Not 60 FPS, mind you, but rather "as fast as this page can manage." I'm genuinely curious what the web community will come up with when 120 Hz displays become mainstream. It can barely cope with 60 Hz as it is.
 
 
A Windows 10 update takes 30 minutes. What could possibly take that long? That's enough time to fully format my SSD, download a fresh build and install it about 5 times in a row.
 
 
 
 
Pavel Fatin: Typing in an editor is a relatively simple process, so even a 286 could provide a fairly smooth typing experience.
 
Modern text editors have higher typing latency than 42-year-old Emacs. Text editors! What could be simpler? For each keystroke, all you have to do is update a tiny rectangular area of the screen, and modern text editors can't do it in 16 ms. That is a lot of time. A LOT. A 3D game fills the entire screen with hundreds of thousands (!!!) of polygons in those same 16 ms, and also processes input, recalculates the world and dynamically loads/unloads resources. How come?
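
To get a feel for how much 16 ms is, here's a toy sketch (assuming NumPy is available) that repaints one glyph-sized cell of an in-memory 1080p "framebuffer" and compares it to the frame budget:

```python
import time
import numpy as np   # assumption: NumPy is installed; any array library would do

fb = np.zeros((1080, 1920, 3), dtype=np.uint8)   # a pretend 1080p framebuffer

start = time.perf_counter()
fb[100:116, 200:209] = 255                        # repaint one 9x16 px character cell
elapsed_ms = (time.perf_counter() - start) * 1000

print(f"tiny repaint: {elapsed_ms:.4f} ms out of a {1000/60:.1f} ms budget at 60 Hz")
print(f"at 120 Hz the budget shrinks to {1000/120:.1f} ms")
```

Even in interpreted Python, the repaint itself is a rounding error against the budget; the latency comes from everything layered on top of it.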
 
 
As a general trend, software isn't getting faster or more capable at all. We get faster hardware, on which software with the same functionality runs slower than before. Everything works far below the possible speed. Ever wonder why your phone takes 30 to 60 seconds to boot? Why can't it boot in, say, one second? There is no physical limitation. Personally, I would love that. I want developers to push to the limit, squeezing every bit of performance out of the hardware.
 
 

Everything is HUGE


 
And then there's bloat. Web applications can open up to ten times faster if you simply block ads. Google is begging everyone to stop being slow with its AMP initiative - a technical solution to a problem that doesn't need any technology, just a little common sense. Remove the bloat and the web runs at crazy speed. Is that really so hard to grasp?
 
 
The Android system with no applications installed takes up almost 6 GB. Just think for a second how obscene that number is. What's in there, HD movies? I assume it's mostly code: the kernel, the drivers. Some resources too, of course, but they can't be that big. How many drivers does a phone need?
 
 
 
 
Windows 95 was 30 MB. Today we have web pages heavier than that entire OS! Windows 10 is 4 GB, which is 133 times bigger. But is it 133 times better? I mean, functionally they're practically the same. Yes, we have Cortana, but I doubt it weighs 3,970 MB. And whatever Windows 10 is, does Android really have to be one and a half times bigger?
 
 
The Google Keyboard app casually eats 150 MB. Is a program that draws 30 keys on the screen really five times more complex than all of Windows 95? The Google app, essentially just a wrapper around Google Web Search, takes 350 MB! Google Play Services, which I don't use (I don't buy books, music or videos) - 300 MB that just sit there and can't be deleted.
 
 
 
 
After installing the essential apps (social networks, chats, maps, taxi, banking, etc.), my phone had only 1 GB left for photos. And that's with no games and no music at all! Remember when an OS, applications and all your data fit on a floppy disk?
 
 
Your note-taking app is most likely written in Electron and thus ships with a driver for the Xbox 360 controller, can render 3D graphics, play audio and take pictures with your webcam.
 
 
 
 
A simple text chat used to be famous for its speed and low memory footprint. Slack, by contrast, is an example of an extremely resource-hungry application. Chat and text editing are about the most basic tasks there are; they should be the ones consuming the fewest resources. Welcome to 2018.
 
 
You could say that at least it all works. But growing size is not improvement. It means that someone has lost control. It means we no longer know what's going on. Growing size means growing complexity, falling performance and falling reliability. This is not normal and should not be treated as the norm. Bloated size should immediately set off alarms - and make you stay away.
 
 

Everything is rotting


 
A 16 GB Android phone was great three years ago. Today, under Android 8.1, it barely works, because every app has at least doubled in size for no apparent reason. There are no new features. They're not faster, and they don't look any different. They just... swelled up?
 
 
The iPhone 4s shipped with iOS 5, but can barely run iOS 9. And that's not because iOS 9 is that much better - fundamentally, the system hasn't changed. But the new hardware is faster, so they made the software slower. Don't worry - you get exciting new features, like... running the same applications at the same speed! I don't know.
 
 
iOS 11 dropped support for 32-bit applications. That means that if a developer isn't willing to go back and update their app, chances are you will never see that great program again.
 
 
@jckarter: A DOS program can be made to run unmodified on almost any computer made since the 80s. A JavaScript application may break with tomorrow's Chrome update.
 
Today's web pages will not work in any browser 10 years from now (probably sooner).
 
 
"We need to run from the bottom of our legs to just stay in the same place." But the point? I can constantly buy new phones and laptops, like everyone else, but only for the sake of having to be able to run all the same applications that have only become slower?
 
 
I think we can and must do better. Right now everyone writes software for today, occasionally for tomorrow. But it would be nice to build things that last a little longer.
 
 

Worse is better


 
Nobody understands anything anymore. And nobody wants to. We just ship half-baked stuff, hope for the best and call it "startup common sense".
 
 
Web pages just ask you to refresh if something goes wrong. Who has time to figure out the cause of the problem?
 
 
 
 
Any web application produces a constant stream of "random" JS errors, even on compatible browsers.
 
 
The entire web page / SQL database architecture is built on the premise (the hope, even) that nobody will change the data while you're looking at the rendered page.
 
 
Most collaborative applications are "best effort", with plenty of common scenarios in which they lose data. You've seen the "Which version should I keep?" dialog. The bar today is so low that users are happy even to be asked that question.
 
 
 
 
And no, in my world an application that says "I'm going to destroy part of your work, you just get to choose which part" is not normal.
 
 
Linux kills random processes by design when it runs out of memory. And yet it's the most popular server OS.
 
 
Every device I own fails regularly in one way or another. My Dell monitor needs a hard reboot from time to time, because there's software inside it. AirDrop? You're lucky if it finds your device; otherwise, what do you do? Bluetooth? The spec is so complex that devices won't talk to each other, and periodic reboots are the best way to go.
 
 
 
 
And I'm not even going to mention the Internet of Things. It's so far beyond reasonable that there's nothing left to add.
 
 
I want to take pride in my work. I want to make working, stable things. To do that, we need to understand what we're building, inside and out, and that's impossible in bloated, overly complicated systems.
 
 

Programming is the same chaos


 
It seems like nobody is interested in quality, fast, efficient, lasting, well-thought-out solutions anymore. Even where efficient solutions have been known for ages, we still struggle with the same problems: package management, build systems, compilers, language design, IDEs.
 
 
Build systems are inherently unreliable and periodically require a full clean, even though all the information needed for invalidation is right there. Nothing prevents us from making the build process reliable, predictable and 100% reproducible. It's just that nobody thinks it's important. NPM has been stuck in the "sometimes works" state for years.
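
As a minimal illustration of the point (not how any particular build tool works), a build step can be made skippable and deterministic just by fingerprinting its inputs. The file names and compile command below are hypothetical:

```python
import hashlib
import json
import pathlib
import subprocess

CACHE = pathlib.Path(".build_cache.json")

def fingerprint(sources, command):
    """Hash of everything the output depends on: the inputs and the command itself."""
    h = hashlib.sha256()
    h.update(" ".join(command).encode())
    for src in sorted(sources):
        h.update(pathlib.Path(src).read_bytes())
    return h.hexdigest()

def build(sources, command):
    cache = json.loads(CACHE.read_text()) if CACHE.exists() else {}
    key = fingerprint(sources, command)
    if cache.get("fingerprint") == key:
        print("up to date, nothing to do")       # same inputs, same command: skip the work
        return
    subprocess.run(command, check=True)           # actually build
    CACHE.write_text(json.dumps({"fingerprint": key}))

# Hypothetical usage: recompile only when the source or the command changes.
build(["main.c"], ["cc", "-O2", "-o", "app", "main.c"])
```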
 
 
@przemyslawdabek: It seems that rm -rf node_modules is an integral part of the workflow in Node.js/JavaScript projects.
 
And build times? Nobody considers it a problem that the compiler runs for minutes or even hours. What happened to "programmer time is more expensive"? Almost all compilers, pre- and post-processors add significant, sometimes disastrous, amounts of build time without providing proportionally significant benefits.
 
 

 
 
You would expect programmers to make mostly rational decisions, yet sometimes they do the exact opposite. For example, choosing Hadoop even when it's slower than running the same task on a single desktop computer.
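
In the spirit of that comparison, the "one desktop computer" baseline is often just a few lines. A rough sketch (the log file name is made up):

```python
from collections import Counter

def word_count(path):
    """Single-machine baseline: stream the file once, count words in memory."""
    counts = Counter()
    with open(path, encoding="utf-8", errors="replace") as f:
        for line in f:
            counts.update(line.split())
    return counts

# Hypothetical usage on a multi-gigabyte log file.
print(word_count("access.log").most_common(10))
```

It's worth timing something like this before reaching for a cluster; quite often the laptop wins.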
 
 
Machine learning and "AI" have shifted software toward guesswork at a time when most computers aren't even reliable enough to begin with.
 
 
@rakhim: When an application or service is described as "AI-powered" or "ML-based", I read it as "unreliable, unpredictable behavior that cannot be explained." I try to avoid "AI" because I want the opposite from computers: reliability, predictability and reason.
 
We shoved virtual machines inside Linux, and then shoved Docker inside the virtual machines, simply because nobody could clean up the mess that most programs, languages and their environments produce. We cover the shit with blankets just so we don't have to deal with it. A "single binary", for example, is still touted as a HUGE advantage of Go. No mess == success.
 
 

 

My Python environment got so polluted that my laptop was declared an environmental disaster zone.

Note: the Environmental Protection Agency wanted to fill it with cement and put up a sign at the entrance - a warning to future civilizations about the dangers of using sudo to install random packages.

 
 
And dependencies? People thoughtlessly pull in bloated "full-featured" packages to solve the simplest problems, without considering the consequences. And those dependencies bring in dependencies of their own. In the end, you get a tree that is somewhere between a horror story (huge and full of conflicts) and a comedy (there's no reason these packages are here, and yet here they are):
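
If you want to see what actually ended up in your tree, Python's standard library can at least list the declared requirements. A rough sketch (requirement strings are parsed naively, and "requests" is just an example package):

```python
import importlib.metadata as md
import re

def direct_deps(dist_name):
    """Packages a distribution declares as requirements (optional extras skipped)."""
    try:
        reqs = md.distribution(dist_name).requires or []
    except md.PackageNotFoundError:
        return []
    names = []
    for req in reqs:
        if "extra ==" in req:                     # skip optional dependencies
            continue
        names.append(re.split(r"[\s;<>=!~\[(]", req)[0])
    return names

def print_tree(dist_name, depth=0, seen=None):
    seen = set() if seen is None else seen
    print("  " * depth + dist_name)
    if dist_name.lower() in seen:                 # don't recurse into cycles
        return
    seen.add(dist_name.lower())
    for dep in direct_deps(dist_name):
        print_tree(dep, depth + 1, seen)

print_tree("requests")   # substitute any installed package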
 
 

 
 
Programs can no longer run for years without a reboot. Sometimes even a few days is too much to ask. Random glitches happen, and nobody knows why.
 
 
Worse, nobody has time to stop and figure out what happened. Why bother, when you can always work around it. Spin up a new AWS instance. Restart the process. Drop and restore the database. Write a script that restarts your broken application every 20 minutes. Include the same resources several times, slap it together and push it to production. Move fast, don't waste time fixing mistakes.
 
 
This is not engineering. It's just lazy programming. Engineering means a deep understanding of the performance, the structure and the limits of what you build. Hacking something together out of low-quality material is the exact opposite of that. To make progress, we have to understand what we are doing and why.
 
 

We are stuck with it


 
So all of this is just layers of barely working code piled on top of previously written barely working code. It keeps growing in size and complexity, diminishing any chance of change.
 
 
To have a healthy ecosystem, you need to go back and revisit things. You sometimes need to throw out the trash and replace it with better alternatives.
 
 
 
 
But who has time for that? When did a new OS kernel last come out - 25 years ago? Kernels are now simply too complex to just rewrite. Browsers have accumulated so many edge cases and historical precedents that nobody dares to write an engine from scratch.
 
 
Today's definition of progress is either throwing more fuel onto the fire:
 
 
@sahrizv: 2014 - We need to adopt microservices to solve the problems with monoliths.

2016 - We need to adopt Docker to solve the problems with microservices.

2018 - We need to adopt Kubernetes to solve the problems with Docker.
 
or reinventing the wheel:
 
 
@dr_c0d3: 2000: Write 100 lines of XML to "declaratively" configure your servlets and EJBs.

2018: Write 100 lines of YAML to "declaratively" configure your microservices.

At least XML had schemas...
 
We are stuck, and no one will save us.
 
 

Business doesn't care


 
Neither do users. They have learned to accept whatever we give them. We (the engineers) say that every Android app takes 350 MB? Fine, they'll live with it. We say we can't provide smooth scrolling? Fine, they'll put up with a phone that stutters. We say, "If it doesn't work, reboot it"? They'll reboot. After all, they have no choice.
 
 
There's no competition either. Everyone builds the same slow, bloated, unreliable products. An occasional leap forward in quality brings a competitive advantage (iPhone/iOS vs. other smartphones, Chrome vs. other browsers) and forces everyone to regroup, but never for long.
 
 
Our mission as engineers is to show the world the tremendous capabilities of modern computers in terms of performance, reliability, quality and ease of use. If we care, people will learn to expect it. And nobody but us can show them that it's possible. But only if we care.
 
 

It's not all bad


 
Occasionally, rays of hope break through the cloudy sky.
 
 
The work of Martin Thompson (LMAX Disruptor, SBE, Aeron) is impressive, refreshingly simple and efficient.
 
 
The Xi editor by Raph Levien seems to be built on the right principles.
 
 
Jonathan Blow, for his game, developed a compiled language that compiles 500,000 lines per second on a laptop. And that's a cold compile: no intermediate caching, no incremental builds.
 
 
You don't have to be a genius to write fast programs. There's no magic trick. The only thing required is not building your software on top of the huge pile of crap that modern tooling supplies.
 
 

A manifesto for a better world


 
I want to see progress. I want change. I want modern software to improve, not stand still. I don't want us to reinvent the same thing over and over, each time shipping a slower and more bloated product. I want something to believe in - a worthy end goal, a future better than what we have today, and a community of engineers who share that vision.
 
 
What we have today is not progress. We barely meet business goals with these poor tools. We're stuck in a local optimum, and nobody wants to move out of it. It's not even a nice place; it's bloated and inefficient. We've just somehow gotten used to it.
 
 
So I want to state it plainly: the current situation is total shit. As engineers, we can do better, we should do better, and we will do better. We can have better tools; we can build better applications that are faster, more predictable, more reliable, and use fewer resources (orders of magnitude fewer!). We need to deeply understand what we are doing and why. We need to deliver reliably, predictably, at the highest quality. We can, and should, take pride in our work. Not just "given what we had to work with..." - no reservations!
 
 
I hope I'm not alone in this. I hope there are people who want the same thing. I'll be glad if we at least start talking about how absurd the current situation in the software industry is. And then, maybe, we'll figure out how to get out of it.