Everything is amazing in software, but everyone is unhappy



There is a typical position you can find on Habr and elsewhere: "although hardware gets better year after year, humanity has negated the gains by writing ever worse software."

Like, processors have more cores than ever, but everything lags more than ever. Electron and Slack are creatures of darkness sent to rob us of happiness and memory. Mobile applications have become hungrier than entire old operating systems. And the operating systems themselves show no real progress, yet for some reason keep swelling in size. Not like the good old days, when people knew how to fit an OS on a floppy disk!

I'll say it directly: when I see such statements, they infuriate me. In my opinion, they miss a number of important factors. In the end, the situation resembles Louis C.K.'s classic bit "Everything's amazing and nobody's happy": everything is surprisingly good, and people sit around complaining.

So I decided to describe these missing factors. Since the text turned out harsh, I want to emphasize that the opinion is purely my own (although I do refer to conferences run by the company I work for).

Selective memory


There is a well-known nostalgia effect: "the grass used to be greener and the music better." But, in my opinion, for many people this effect takes on frightening proportions, with perception diverging dramatically from reality.

I'm talking about statements like this: "Almost everything on a computer feels slower than it did in 1983." Judging by the thousands of likes, this is not an isolated opinion but a widespread one.


My first reaction: "Well, how fast was 4K video streaming in 1983?" For starters, let's remember that most of today's computer applications were simply impossible before, not least because of speed. Back then a movie (not even in 4K, but in 1080p) would have taken months to download, and the computer couldn't have played it at 24 FPS anyway. So how do you compare today's speed with one so prohibitively low that the task didn't exist at all?

Second reaction: yes, some things that moved from the command line to the GUI may take more time there. I'm ready to believe that Word in 2020 starts more slowly than the console editor vi did in 1983 (I can't compare personally: I first sat down at a computer in the 90s). But if this matters so much to you, in 2020 you can still use the command line for many of those "tasks from the 80s". I'm writing this text right now in vim, a great editor whose plugin ecosystem helps it keep up with the times. It looks great on a modern Retina iMac (the letters have become razor-sharp) and still works instantly. So what's the problem?

But the most important thing is not even these two points. More important is this: in my opinion, we have begun to forget how slow computers used to be. When something starts lagging more, we notice it immediately, but when something becomes faster or easier, we simply take it for granted and forget the past.

To all the people writing "everything used to be faster", I have a question: do you remember, for example, this?


I'll admit honestly: I don't particularly remember it myself. When I turn off my computer today, I don't think about the fact that twenty years ago I sat and waited between clicking the shutdown button in the OS and pressing the power button on the case. I simply took for granted that this is no longer necessary and forgot that it was ever different.

Boot times have changed a lot too. Now I sit down at my iMac, press one key, and a second later it's ready to work (thanks to the SSD and sleep mode). Back when I was a kid waiting for Windows to boot from an HDD, I would hardly have believed I'd live to see that.

"When a letter contains hundreds of pictures, it opens terribly slowly, sometimes even ten seconds!" I understand this can be annoying, but listen: in 2000, to check my mail, I first enjoyed a minute of modem trills, then waited while the mail service's main page slowly loaded, and then the letter itself also didn't load instantly, and it had no pictures at all, just text. Today all of that would open without delay. Maybe you just shouldn't cram a thousand pictures into a medium invented for something else? And before declaring "everything has gone bad", let's think: how long would a letter with that many pictures have taken to open in 2000?

Or here is another memory lost in time, like tears in rain. A joke was popular in the 2000s: "A slow computer is when you know the names of all the Photoshop developers." For those who don't get it: pirated Photoshop was very popular in Russia back then, and at every launch Russians had to stare at this for a long time:


Oh, and I just realized that I know one of these people: Sean Parent spoke at our C++ Russia conference.

Now compare. Today the joke goes: "It once took two kilobytes of memory to launch a man to the Moon, and now you need two gigabytes to launch Slack." Sounds bad, but do you notice the difference? However much RAM Slack consumes, users no longer face the situation of "you can go make tea while the program starts". Everything got better, and we didn't notice.

Or here's a revealing historical artifact: the Masyanya episode "Download" (2002). The characters are terribly upset about a disconnect that forces them to download a file all over again.


Please note: the file they are downloading weighs 591 kilobytes. The characters are distraught about having to re-download half a megabyte. That was the reality of 2002.
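The arithmetic behind their anguish is easy to check. Here is a rough sketch; the throughput figures are my own illustrative assumptions (a 56k modem realistically gave a few kilobytes per second), not numbers from the series:

```python
# Back-of-the-envelope: how long a 591 KB file took to download in 2002
# versus today. Throughput values are illustrative assumptions.
FILE_KB = 591

DIALUP_KBPS = 4        # KB/s, typical real-world speed of a 56k modem
MODERN_KBPS = 12_500   # KB/s, roughly a 100 Mbit/s broadband line

dialup_seconds = FILE_KB / DIALUP_KBPS   # ~148 s, about 2.5 minutes
modern_seconds = FILE_KB / MODERN_KBPS   # well under a tenth of a second

print(f"2002 dial-up: ~{dialup_seconds / 60:.1f} min")
print(f"today:        ~{modern_seconds:.3f} s")
```

A couple of minutes of anxious waiting then, an imperceptible blink now; and that is before counting the redials after each disconnect.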

For comparison, a fresh example from life. I had a small technical problem on my Mac, and on Stack Overflow I found the advice "install Xcode and accept its terms of use". My reaction: it's odd to download an 8-gigabyte IDE for the sake of a single click, but if it helps, why not.

Yes, programs have grown in size over the years (in Masyanya's day, an "8-gigabyte IDE" would have made people's hair stand on end), but downloading them has become far easier, and our lives have become far better.

Well, the last example in this part is about "bloated mobile applications". Outraged posts are written about this, like "App sizes are out of control: since when does LinkedIn take up 275 megabytes on a phone?!" And in 2018 you could read the complaint "after installing my applications, I only had a gigabyte left for photos".

I won't lie, those 275 megabytes of LinkedIn raise questions for me too. But I recall how in 2010 Habr reported that Alfa-Bank had released a mobile application. It weighed 30 megabytes; today such a size would raise no questions at all. And back then people wrote in the comments:



Do you know why? In those days a Russian user might be carrying, for example, an HTC Hero. Look at its specs: "storage: 512 MB, 165 available for applications". 165 megabytes in total for all installed applications! In such circumstances you constantly had to choose which apps were more important and which you could live without. And to install a 30-megabyte one, you would have had to delete several others at once. It was painful.

And if we could return to 2010, approach the people living through that pain, and say the 2018 phrase "after installing my applications, I only had a gigabyte left for photos", I think we would be beaten. Those words would sound not like a complaint but like mockery and bragging.

And even since 2018, when that complaint appeared, the situation has improved: the budget Xiaomi Mi A3 now ships with 64 gigabytes in its basic version, so after installing your applications you will obviously have far more than one gigabyte free.

Yes, applications have grown many times over in 10 years. But storage capacity has grown hundreds of times over the same period. On balance, life has become roughly ten times better.
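The proportion is easy to verify with the figures already mentioned above; this is just illustrative arithmetic, nothing more:

```python
# Illustrative arithmetic using the figures from the text above:
# a 30 MB app on an HTC Hero with 165 MB for apps in 2010, versus
# a 275 MB app on a 64 GB Xiaomi Mi A3 ten years later.
app_2010_mb, storage_2010_mb = 30, 165
app_2020_mb, storage_2020_mb = 275, 64_000

app_growth = app_2020_mb / app_2010_mb              # ~9x
storage_growth = storage_2020_mb / storage_2010_mb  # ~388x

# How many apps of that size fit if nothing else is installed:
fits_2010 = storage_2010_mb // app_2010_mb   # 5 apps
fits_2020 = storage_2020_mb // app_2020_mb   # 232 apps

print(f"apps grew ~{app_growth:.0f}x, storage grew ~{storage_growth:.0f}x")
print(f"apps that fit: {fits_2010} then, {fits_2020} now")
```

Five "heavy" apps then versus a couple of hundred now: the ratio, not the absolute megabytes, is what determines the pain.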

And do you remember what it was like to use a smartphone in 2010, and how many flavors of "everything is slow" pain there were?

  • A small amount of RAM meant that applications constantly had to start from scratch instead of instantly switching to an already running instance.
  • And the low-powered processors launched them from scratch very slowly.
  • I don't even want to remember the speed of mobile Internet.
  • And transferring files to a computer over USB 2.0 was no fun either.

Using a smartphone amounted to constant waiting: whatever you wanted to do, it included periods of standing and staring dumbly at the screen. If you're on the street and need a map, you stand still and wait first for the application to deign to start, then for the map inside it to deign to load. And you pay handsomely for the traffic consumed along the way.

Compared to that, we now all live in heavenly conditions, yet we actively complain. It looks like "first world problems": something like "I want to read Twitter lying down, holding the phone over my face, but then I might accidentally drop it on myself." Well, I sympathize, but saying "everything has become worse" in such a situation looks like disrespect for both the past and the present.




Ignorance of the "underwater part of the iceberg"


There is another typical situation. People say "software is becoming ever more bloated and slow, although almost nothing changes for the better for the user", while not fully understanding what is actually happening inside.

A small illustrative example. In 2018, Nikita Prokopov published a widely discussed text about "spoiled software", which among other things contained the words: "Google Play Services, which I do not use (I don't buy books, music or videos there): 300 MB that just sit here and cannot be removed."

Reading this, I want to be filled with righteous anger, yes. But there is a caveat: in fact, Google Play Services is not about "buying books". It includes many different APIs, and, as Wikipedia reports, "all major Android services are controlled by Google Play Services". In all likelihood, Nikita actively uses this software without knowing it; Google simply gave it a confusing name.

My goal here is not to criticize a specific post but to show a general pattern: we often say "the applications bloated and slowed down for no good reason" without fully understanding those reasons. Something was added to an application; its developers have detailed information about what exactly and why, but we don't. Yet for some reason we consider ourselves more competent than they are in this matter.

But inside there is a lot that is not obvious from the outside. Take text editors as an example: it would seem that people have been writing them since time immemorial on the weakest hardware, everything about them has long been figured out, and there is nothing to be slow about; "what could be simpler". From the outside it therefore looks like this: if something doesn't happen instantly, these clumsy developers must have forgotten skills humanity mastered long ago. But if you dive into the topic (we once published a Habr post about this), a host of non-obvious nuances suddenly emerges, and the words "what could be simpler" stop sounding convincing.

Or one more thing. News about mobile OSes often provokes this reaction: "Over the past eight years, nothing useful has been done there. Every year, with great fanfare, they roll out a new version that weighs more than the previous one, but the only difference I see is new emojis. To hell with those emojis; bring back the old version, it was better."



Listen, all IT specialists seem to know this from experience: you can spend enormous effort on a justified refactoring of the backend and the user won't even notice, or you can change a couple of interface icons in half an hour and spark heated discussions among users. That is, we know that from the outside only some changes are visible, and often not the most important ones. Why, then, don't we realize that if all we notice in a new OS version is emojis, that says more about us than about the OS?

Take, for example, Project Treble from Android 8.0. Android is known to have a big problem: while iPhones are easily updated to new OS versions, Android phones usually stay forever on the version they shipped with, because their manufacturers can't be bothered. So Google began a large-scale rework of the entire architecture to make manufacturers' lives easier and stimulate updates. And although Treble did not solve the problem entirely, statistics show a noticeable improvement. In other words, to deal with a pressing problem, Google invested a mountain of work (refactoring a colossus like that is no small feat), and the situation partially improved. In my opinion, this is exactly what the company should be doing: tackling the sore spots. They did everything right.

Now tell me: if you are not a mobile developer, have you even heard about this? When you first picked up a phone with Android 8.0+, did it affect your opinion of the new version in any way? Unlikely, because in the first months of using the phone it is invisible to the user. You can notice it only later, when the next version of Android comes out, and only if your phone's manufacturer is among those that Treble has nudged into updating their devices. And even then the user may not realize the update happened thanks to Google, and will still say "nothing useful has been done in eight years".

And this situation is typical: the underwater part of the iceberg is generally huge. When Android introduced Adaptive Battery ("smart" detection of which applications may drain the battery in the background), did you measure whether your phone's power consumption changed? When Google Play Protect quietly takes care of your security in the background, do you remember that it didn't exist before? When support for the AV1 video codec was added, did you consider that a future with this codec is quite likely and such support is useful? Or do you just pick up the phone, and the emojis catch your eye while AV1 support does not?

After this post was published, a wonderful thread appeared about how software swells largely because of handling rare situations. Here's what happens: for the 1% of cases where those situations arise, everything became much better, but the 99% of people who never encounter them cannot properly appreciate the changes.



It follows that when we collectively curse all the "bloated and sluggish applications", in many cases there were objective reasons behind them. Somewhere, on company time, people thoughtfully weighed all the pros and cons, discussed them with each other, and concluded that the pros outweighed. And then we come along, with no idea what was on the scales and no intention of spending time on careful study; we see only the changed size, and we confidently conclude "everything has gone bad".

The general conclusion: some of the claims about "senselessly bloated software" are caused by misunderstanding. And making such statements without a thorough study of the causes is not the most sensible strategy.



Attention to performance


Statements about "spoiled software" sound as if developers used to economize on computing power but now couldn't care less about optimization. As if all that matters is that it works, and if the hardware requirements are high, Moore's law will sort it out somehow. If the code runs slowly, devours memory and takes up a lot of space, so be it; nobody will worry about it or improve anything.

Listen, this is simply not true. One standalone example: Facebook Messenger was recently rewritten, and reportedly its performance doubled while its size shrank fourfold. As for the claim above that the LinkedIn application weighs 275 megabytes: I just checked and saw "195.4 MB" in the App Store, so it seems they too managed to trim it by nearly a third. That is, both companies clearly thought about resource consumption and allocated to reducing it a lot of labor that could instead have gone into cranking out features.

And since I work on conferences for developers and cross paths with many of them (across different stacks), I can see which topics they care about: what talks they attend, what they discuss, what they write posts about.

And performance is exactly one of those topics. From talk titles like "Optimizing the launch time of iOS applications" you can see that developers are willing to spend 60 minutes listening about milliseconds.

Performance is discussed everywhere: in the JavaScript world ("JavaScript performance through the spyglass"), in mobile development ("How to fit a million stars into an iPhone"). But it comes up most of all on the backend. I think this is because users buy smartphones for themselves, while on the backend the company pays for the computing power, so there is a powerful financial motivation to optimize.

I recently ran a simple experiment: I opened the program of our Java conference JPoint and counted how many talk descriptions mentioned performance. (Because of the coronavirus we have since postponed the conference to June, so the program may change, but the results are indicative either way.) Here they are:

  • “This talk will be about Producer performance tuning.”
  • “Consider file I/O optimization methods”
  • “Maybe for one of the modules you want more performance than you can ever squeeze out of Java?”
  • “What safepoint-related optimizations does the HotSpot JVM do? What should developers remember to avoid unwanted pauses?”
  • “Valhalla project, inline types and everything around them, from software model to performance”
  • "Optimize query performance, throughput and memory consumption"
  • “The report is devoted to a detailed analysis of how the process of writing to the Apache Cassandra database occurs in terms of performance.”

Listen, if I had run this experiment as a drinking game, I would have been drunk by the end. You can see with the naked eye: Java developers care about making their code not just correct but fast.

Moreover, it often preoccupies them even too much! From people deeply involved in performance work, I have repeatedly heard that it's easy to overdo it. For example, for the sake of a slight performance gain people resort to dirty hacks that end up creating more problems than they solve. Alexey Shipilev's excellent keynote at JPoint 2017 was about exactly this; we published a transcript for Habr.



To summarize: while the Internet laments that "developers got lazy and completely stopped thinking about slowness", in reality many developers think about it more than they need to.



Meaningfulness and expediency


And now, in my opinion, the most important thesis.

Yes, applications require more and more resources over time, sometimes by orders of magnitude. Yes, we have started pulling far more into our projects, even things that aren't strictly necessary, instead of spending time isolating the bare minimum. That's true.

But I believe that the growth of applications does not make today's developers inept fools. On the contrary:

Developers would be fools if applications had NOT grown.

Imagine a world in which hardware develops rapidly while the approach to software development stays the same. Programs would still fit on a floppy disk, although nobody would use the floppy disk itself anymore. In the name of performance, just about everything would be written in C++. Developers would constantly drop down to low levels and know every place where a couple of bytes could be squeezed out. No "let's just pull in five libraries for this task"; only solutions carefully tailored to the specific project, without a single extra line. In short, a festival of optimization and careful resource stewardship; everything now mourned as a "lost skill".

Do you know what impression this picture makes on me? A large family was forced to live in one small room, so it learned to place things with millimeter precision and mastered the secret art of building five-tier bunk beds. Then it moved to a spacious multi-room apartment, but out of habit occupied only one room and left all the others empty. The people are still sitting on each other's heads.

Question: don't you think that's crazy? If everyone can be given plenty of space, why skimp on it? Who benefits from the ancient craft of five-tier bunk beds if there's nowhere to invite guests? Isn't it time to master the new skill of choosing a king-size bed?

The same goes for software. If, in an era when even budget smartphones ship with 64 gigabytes, we agonize over "the Kotlin runtime will add a couple of megabytes to our application", we will turn into that family. Relax: space exists so that something can occupy it. If it sits empty, it does nobody any good. If there is plenty of room and for some reason you need a grand piano, you can set one up without fretting over square centimeters.

Slack is that grand piano. Fine, it eats a lot of memory, but have you heard non-IT people complain about it the way they used to complain about Photoshop? By my impression, ordinary users already have enough space that this "piano" doesn't keep them from "walking around the room". Yes, you could swap the grand piano for an upright and save space. But if you live in a palace, and the palace grows larger every year, why bother?

In my view, the people suffering because of Slack are mostly hardcore IT folks who need the RAM for other heavy tasks. But IT people are a special case: to extend the apartment metaphor, they are people who have turned their apartment into a workshop. No matter what palace you give them, every room will be occupied by something very important, and then the grand piano really is in the way, because a lathe could stand in its place. But most people in the world live differently; they don't need lathes at home, and the resources already suffice for their lifestyle.

In such a situation, frantically avoiding everything that takes up memory when there is already plenty of it is like the story of mobile task managers, where users swipe away running applications to "free up RAM". The creators of mobile platforms have said many times: "Stop, you madmen. The system will kill applications itself if memory runs short; why kill them ahead of time? What's the point of buying a smartphone with a lot of RAM and then keeping it empty, not using what you paid for?"


It looks optimized. Do you want to live here?

And I also remember the story of the Y2K bug. One IT specialist from the middle of the twentieth century recalled in old age: "We were among those who gave birth to it. Back then we saved every byte. And when we hit on the idea that a year could be stored as two digits instead of four, we felt very smart. We saved two bytes in a heap of places! Only toward the end of the century did the consequences become clear."

Yes, at the time, writing the year in two digits probably looked like a reasonable decision. And it embodies exactly what advocates of the "careful" approach preach: here people truly went all the way in their care for resources.

But then, many years later, other people had to spend vast resources untangling the consequences of that decision and heading off the problems.

And now, when the two saved bytes no longer help anyone and the negative consequences are well known, storing the year that way would be very strange.
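The failure mode is simple to demonstrate. A minimal sketch (the helper function is hypothetical, invented for illustration, not taken from any real Y2K-era system):

```python
# Why two-digit years backfire: subtraction breaks at the century boundary.
def age_naive(birth_yy: int, current_yy: int) -> int:
    """Age computed from two-digit years, the pre-Y2K way."""
    return current_yy - birth_yy

# A person born in 1970, checked in 1999: works fine.
print(age_naive(70, 99))  # 29

# The same person checked in 2000, stored as "00": suddenly -70 years old.
print(age_naive(70, 0))   # -70
```

The two saved bytes bought decades of correct behavior, then every such calculation in the world needed auditing at once.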

So: much of what people today want to praise as "sensible resource conservation" will in the future look like the same penny-pinching that cost more than it saved. Not least because developer time is also a most valuable resource, and it is often left out of the calculation. If you fiddled with data alignment and your application became a little smaller and faster, but it cost you a week that could have gone into a useful feature, did you make things better or worse for the user?
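For readers who haven't run into data alignment: a small illustration of the kind of byte-squeezing in question. This is a sketch; the struct names are made up, and the sizes shown assume a typical 64-bit platform ABI:

```python
import ctypes

# Field order affects struct size because the compiler inserts padding
# to keep each field aligned. Sizes assume a typical 64-bit ABI.
class CarelessOrder(ctypes.Structure):
    _fields_ = [
        ("flag1", ctypes.c_char),    # 1 byte + 7 bytes padding
        ("value", ctypes.c_double),  # 8 bytes, must start at offset 8
        ("flag2", ctypes.c_char),    # 1 byte + 7 bytes trailing padding
    ]

class CarefulOrder(ctypes.Structure):
    _fields_ = [
        ("value", ctypes.c_double),  # 8 bytes, already aligned at offset 0
        ("flag1", ctypes.c_char),    # 1 byte
        ("flag2", ctypes.c_char),    # 1 byte + 6 bytes trailing padding
    ]

print(ctypes.sizeof(CarelessOrder))  # 24 on most 64-bit platforms
print(ctypes.sizeof(CarefulOrder))   # 16
```

Reordering fields here saves a third of the memory per instance, which genuinely matters for millions of hot objects and matters not at all for a handful; whether the week of work pays off depends entirely on which case you are in.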





I anticipate the objection in the comments: "But task X really does run slower for me than 13 years ago; give me back my 2007."

Don't get me wrong: I'm not saying such tasks don't exist. They do. Dependency overuse does happen. Bloatware does happen. When I hear that Photoshop now includes video editing, I feel the same "but why" that you do. When I read that creating any application with create-react-app immediately means 4304 directories containing 28678 files, I also wonder whether we have taken a wrong turn somewhere. There are many real problems worth discussing.

My only claim is that around these problems a kind of radical sect has formed, one that believes in the imminent end of the world due to software bloat. In this sect they rewrite history ("it used to work faster!"), misunderstand what is happening ("Windows has grown over the years, but nothing really changed in it"), say undeserved things about developers ("they don't want to optimize anything") and go to extremes ("let's save every byte, even when it makes things worse for users"). Let's not do that.

My objections to the radical position have, of course, also turned out somewhat radical (just in the opposite direction). You can pick holes in them too, and they can also be called a sect. But the purpose of this post is not to make my every word the ultimate truth. Its purpose is to make people less drawn to sects.

The main conclusion is boring and banal, but no less correct for that: everything depends on the specific situation. Optimizations are neither absolute evil nor absolute good. They can both help and harm. There are situations where they are clearly worth it, situations where they are clearly not, and intermediate situations where different people will judge "worth it or not" differently, with neither being more right, and that's normal.

And if anyone, including me, tells you otherwise: that's a sectarian, kick him out.
