Gigantomania: why do modern games weigh so much?


The days when video games fit on floppy disks are long gone: today the average size of a AAA title is at least 50 gigabytes, and some have come close to the 200 GB mark. This is partly driven by the rapid development of graphics technology, but plain cost-cutting on optimization plays a significant role too. Yet there was a time when developers valiantly fought for literally every kilobyte!

40 kilobytes is enough for everyone


On July 15, 1983, miniature red-and-white boxes labeled Family Computer appeared on store shelves in Japan. Despite the grand name, these devices had nothing to do with real PCs: the Famicom was Nintendo's first full-fledged game console (not counting the Color TV-Game 6, which was a clone of Atari's Pong).


The original Nintendo Family Computer for the Japanese market

As the saying goes, the first pancake came out lumpy: the console was not known for reliability and drew criticism for numerous hardware and software failures. After the shortcomings were fixed in the second revision, however, the Famicom became the most popular game console in Japan: by the end of 1984, over 2.5 million units had been sold, an absolute record at the time. In June 1985, at the Consumer Electronics Show (CES), Nintendo introduced a version of the Famicom for North America: the console was renamed the Nintendo Entertainment System, underwent a major visual redesign, and received a new cartridge-loading mechanism. A year later, the NES reached the European market as well.


Nintendo Entertainment System

In America, the NES was no less popular than in Japan: the plan was initially to sell it only in New York, but the first batch of 50,000 consoles sold out so quickly that three months later the NES went on sale throughout the United States. Although by the late '80s Nintendo had a strong competitor in Sega with its 16-bit Genesis / Mega Drive, the NES remained the world sales leader into the '90s. Even after the release of the Super Nintendo Entertainment System, the NES continued to be produced in limited quantities until October 2003, and its official support ended only in 2007, which, given the pace of the industry, is an absolute record. Officially, about 60.9 million consoles and over 500 million cartridges were sold over its lifetime.

What was the secret of the NES's popularity? You might say, "The great games, of course," and you would be absolutely right. But no single title, however interesting and engaging, will convince the average parent to buy their child a console (remember, a console's success is determined by the mass market, not by demand among enthusiasts) if the device itself costs a fortune. To hit the $200 launch price, Nintendo's engineers had to resort to many tricks and compromises. The lot of game developers, however, was far less enviable: try creating an interesting game, even a two-dimensional one, whose distribution takes up no more than 40 kilobytes! Yet that was all a standard cartridge, whose price also had to be kept within reason, could hold.


A cartridge for the Nintendo Entertainment System from the inside.

Each cartridge carried just two memory chips: one of 8 kilobytes and one of 32. Adding more chips would inevitably have raised the cartridge's cost and produced "games not for everyone": more technically advanced projects (when the budget is counted in kilobytes, even a slight memory expansion would not only improve the graphics but also significantly enrich the gameplay) that would also be far more expensive, which Nintendo did not want.

That is why Super Mario Bros. made such a splash at the time and even entered the Guinness Book of Records as the best-selling video game in history: Nintendo's developers knew all the platform's hardware limitations and managed to squeeze the maximum out of the available 40 KB, creating a platformer with 32 different levels, plenty of secrets, and (compared with most competitors) complex gameplay.


A Super Mario Bros. advertising poster from 1985

And in 2018, the developers at indie studio Morphcat Games decided to repeat that feat. They set themselves an unusual and ambitious task: to create a retro platformer that respected all the hardware limitations of the original NES. Looking ahead, the experiment was a success: Micro Mages was released on May 1, 2019, and anyone can buy it not only digitally but also as a real cartridge for the Nintendo Entertainment System.


The collectible version of Micro Mages, released for the NES in 2019

The game features 26 unique levels across 8 worlds filled with secrets and varied enemies, plenty of gameplay mechanics, and multiplayer for up to four players. To fit all this splendor into 40 KB, the developers had to work hard. Here are a few of the tricks they used.

The available cartridge memory was split as follows: 32 kilobytes were allocated to game logic and data, and the remaining 8 KB were reserved for graphics. Of those, 4 kilobytes went to sprites and 4 more to static background tiles. The surprises began at the stage of creating the protagonist.

The console's architecture imposes the following restrictions: each sprite is 8 × 8 pixels and can use 3 colors. The console can use only four sprite palettes at a time and display no more than 8 sprites on each scanline. Keep in mind that in multiplayer there are four characters that must be distinguishable from one another. Two problems arise at once. The first: if an enemy appears in the frame, there is no palette left to color it, and the antagonist would remain monochrome.
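The scanline limit is easy to model. Below is a rough Python sketch (illustrative only, not actual NES code) that counts how many 8 × 8 sprites overlap each scanline and flags the lines where a ninth sprite would simply not be drawn:

```python
# Rough model of the NES sprite-per-scanline limit (illustrative Python,
# not actual NES code). Each hardware sprite is 8x8 pixels, and the PPU
# shows at most 8 sprites on any single scanline.

SPRITE_HEIGHT = 8
MAX_SPRITES_PER_SCANLINE = 8

def overflowing_scanlines(sprite_y_positions):
    """Return the scanlines where more than 8 sprites overlap."""
    counts = {}
    for y in sprite_y_positions:
        for line in range(y, y + SPRITE_HEIGHT):
            counts[line] = counts.get(line, 0) + 1
    return sorted(line for line, n in counts.items()
                  if n > MAX_SPRITES_PER_SCANLINE)

# Nine 8x8 sprites on the same row is one too many: every scanline
# they occupy overflows, and the ninth sprite simply is not drawn.
print(overflowing_scanlines([100] * 9))
```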


However, this is solved quite simply: the same palette can be reused. Since the enemies and the heroes have different designs, the compromise is not noticeable.


But since the console can draw no more than 8 sprites per scanline, if four players join the game at once, there is simply no room left for the enemies.


In the old days this was handled with flicker: sprites were drawn in alternation at high speed, so many moving objects could share one scanline. On CRT TVs the approach was justified, but if you play on an LCD through an NES emulator, the flicker is too noticeable. To avoid it, the developers sacrificed detail: each character and each regular enemy is now rendered with a single sprite.
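That classic flicker workaround can be sketched in a few lines of Python (a simplified illustration, not how a real NES game is written): by rotating the draw order every frame, each sprite is dropped only occasionally instead of one sprite vanishing permanently.

```python
# Sketch of the classic "sprite flicker" workaround (illustrative only):
# when more than 8 sprites share a scanline, rotate the drawing order
# every frame so each sprite disappears only now and then, rather than
# the same trailing sprite being dropped on every frame.

MAX_PER_LINE = 8

def visible_this_frame(sprites, frame):
    """Rotate the sprite list by the frame number, then keep the first
    MAX_PER_LINE entries, as the hardware would."""
    shift = frame % len(sprites)
    rotated = sprites[shift:] + sprites[:shift]
    return rotated[:MAX_PER_LINE]

sprites = list("ABCDEFGHIJ")   # 10 sprites fighting for one scanline
# Over several frames every sprite gets screen time; on a CRT the
# alternation blends together, on an LCD it reads as flicker.
for f in range(3):
    print(f, "".join(visible_this_frame(sprites, f)))
```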


This approach saved a significant amount of memory and made room for many different animations for the protagonists.


But the most powerful monsters simply have to be big, scary, and well animated. With the modern approach, where each animation frame is a full metasprite (a composite image covering the whole boss), there would barely be enough memory for even one such monster.


However, a careful look at the ghost's design reveals that each frame reuses the same repeating elements. For example, the top of the ghost's head contains only 2 unique sprites; all the rest are repeats.


In the first frame, its central part can even be obtained by mirroring the original sprite about the vertical axis. The same can be done with the outer segments of the head, and on every frame.


As a result, the top of the head can be drawn with 4 sprites instead of 16.
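The saving from mirroring is easy to demonstrate. Here is an illustrative Python sketch (using tiny 2 × 2 stand-ins instead of real 8 × 8 CHR tiles) that counts how many tiles actually need to be stored once horizontal and vertical flips come for free:

```python
# Sketch of sprite deduplication via mirroring (illustrative): the NES
# PPU can flip a sprite horizontally or vertically at no memory cost,
# so two tiles that are mirror images of each other need only one
# stored copy in CHR memory.

def canonical(tile):
    """Pick one representative among a tile and its three flipped variants."""
    h = tuple(row[::-1] for row in tile)           # horizontal flip
    v = tuple(tile[::-1])                          # vertical flip
    hv = tuple(row[::-1] for row in tile[::-1])    # both flips
    return min(tuple(tile), h, v, hv)

def unique_tiles(tiles):
    """Number of tiles that must actually be stored, flips being free."""
    return len({canonical(t) for t in tiles})

left  = (("0", "1"), ("1", "1"))   # tiny 2x2 stand-in for an 8x8 tile
right = (("1", "0"), ("1", "1"))   # mirror image of `left`
print(unique_tiles([left, right]))  # one stored tile serves both
```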


The rendering of the eyes and mouth can be optimized in the same way.


The boss now takes just 21 sprites, leaving plenty of room for other monsters and dynamic objects. Savings written all over its face, both literally and figuratively.


A similar approach was used for the environment. By combining a set of tiles, rotating them and mirroring them about the horizontal and vertical axes, the developers assembled unique metatiles, and from those, in turn, meta-metatiles, producing stone blocks with a complex texture.


The tileset became much smaller, and 60 bytes now suffice to store the data for one scene (screen). Not much at first glance, but given the NES's limited memory, even that was too much. With all of the above in mind, the logical solution is to mirror the level about the vertical axis and save 50% of the memory. But then the level design suffers.


The symmetry is glaring, and what's more, the lower hall is walled off, so the player could never leave such a dungeon. Here the developers found an elegant way out. Each block's index occupies 1 byte, and there are 96 such blocks in total. In programming, counting starts from zero, so the index of the last block is 95, which in binary is 1011111. That number has only 7 digits (bits), while a byte holds 8.


The one free bit can be used to shift a row horizontally relative to all the others.


In our example, to get an asymmetric scene with the desired layout, it is enough to shift just two rows: the first and the third from the top.
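The trick can be sketched in Python (an illustration of the idea only, not the actual Micro Mages data format): the block index occupies the low 7 bits of each byte, and the otherwise wasted top bit flags whether the row is shifted.

```python
# Sketch of the free-bit trick (illustrative, not the real Micro Mages
# format): 96 block types need indices 0..95, which fit in 7 bits, so
# the 8th bit of each level byte is free to flag "shift this row
# horizontally", breaking the mirror symmetry at zero storage cost.

SHIFT_FLAG = 0x80   # top bit of the byte

def pack(block_index, shifted):
    """Pack a 7-bit block index and a 1-bit shift flag into one byte."""
    assert 0 <= block_index <= 95
    return block_index | (SHIFT_FLAG if shifted else 0)

def unpack(byte):
    """Recover (block_index, shifted) from a packed byte."""
    return byte & 0x7F, bool(byte & SHIFT_FLAG)

b = pack(95, True)          # last block, row shifted
print(hex(b), unpack(b))    # -> 0xdf (95, True)
```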


And if we want to add a new difficulty level for connoisseurs of the genre, we can take the original dungeons as a basis, change the color scheme, and add several new traps. That variety is more than enough to make replaying genuinely interesting, all without exceeding 40 kilobytes for assets and code.


It was ingenious tricks like these that helped developers create captivating video games several decades ago.

Optimization? Never heard of it


Time passed and the power of game consoles and personal computers grew, but developers still had to show miracles of ingenuity. Take Legacy of Kain: Soul Reaver. Its protagonist, Raziel, is not a man but a reaver of souls: a creature that can move freely between the material and spectral worlds. The spectral realm, though a reflection of the world of the living, differs from it noticeably (an important part of the gameplay, since an obstacle in the material realm can be bypassed by shifting planes at the right moment, and vice versa) and is inhabited by unique monsters.


Official art of Legacy of Kain: Soul Reaver

To make the transition between the material and spectral realms seamless (the game world morphs literally before the player's eyes), two versions of the same location had to sit in the console's memory at once. If the game had been confined to small areas or split into separate levels, this would have posed no problem, but the PlayStation could not hold two copies of a seamless world: it carried only 2 MB of RAM and 1 MB of VRAM. So the developers resorted to background streaming. Although the player never notices it, Soul Reaver has no true open world: the vast expanses of Nosgoth are assembled from many separate locations loaded during play, for example, while Raziel walks hidden corridors or makes his way through mountain caves.


While Raziel passes through the cave in the background, the current location is unloaded from the console's memory and a new one begins to load.

This technique was frequently used in one form or another in other games, both before and since: recall the famous doors in Resident Evil or the elevators in the first Mass Effect. Sometimes the need for optimization even gave rise to original artistic devices, as with Silent Hill: the ghost town was foggy precisely because the developers needed to reduce the draw distance in open areas.


The fog of Silent Hill hid not only monsters but also objects being loaded on the fly.

But in any case, there was no longer any scrupulous counting of every bit. Today developers pay even less attention to conserving hardware resources than before, and the more expensive the project, the worse it is often optimized.

Veteran gamers surely remember the lasting impression Crysis made. This sci-fi shooter looks good even by modern standards.


Even 13 years later, Crysis has hardly aged from a technical point of view

Now admit it, honestly: has any game since Crysis given you the same wow effect? Not through depth of gameplay, not the plot, not individual features like the facial animation in L.A. Noire or the fire physics in Alone in the Dark, but purely through outstanding visuals? Almost certainly not. Of course, if you compare Crytek's creation to any modern AAA project, the crisper textures, more accurate lighting, higher polygon counts, and longer draw distance will catch your eye. But do we really need these visual improvements if comfortably playing a AAA release at a stable frame rate now requires a top-end machine whose power would be excessive even for many professional tasks? Everyone will answer that for themselves. The only thing that can be stated with full confidence is that the big game publishers need it.

The saying "clothes make the first impression" applies fully to video games: the prettier the graphics, the easier it is to hook the mass consumer. If the target audience fails to notice the quality of the fur simulation on the protagonist's dog or the real-time cigarette smoke that eats up 70% of a PC's computing power, you can always tell them about it in the advertising campaign: it is no accident that a good half of a AAA game's budget goes to marketing. The red and green logos on splash screens also appear for a reason: partnership with a GPU maker obliges studios to adopt the latest graphics technologies, which in turn nudge gamers to buy yet another Titanium 100500.

This often produces curious situations. Take Metro: Exodus, which uses ray tracing. So that owners of video cards without ray-tracing support could also enjoy the game, the developers had to implement good old rasterization alongside the newfangled technology. Let's see what came of it.


The difference is not that obvious: looking closely, you can see that with RTX enabled the characters in the center of the frame are lit less brightly than the soldiers standing nearer the windows. That is how it would be in real life, so the ray tracing genuinely works. The catch is that even on an RTX 2080 Ti (a top-end video card, mind you) enabling the option causes the frame rate to dip to 45 FPS against an average of 72. And in most other scenes the difference is not visible at all.


4A Games has already hastened to announce that it will abandon the old techniques, and the next game in the franchise will use ray tracing exclusively for lighting, because that greatly simplifies development. Rasterization renders objects one after another with no connection between them, so artists and designers must work hard to make lighting, reflections, and shadows look natural. With ray tracing, all objects in a scene are linked from the start: a ray of light bounces off one object, hits another, picks up its color, bounces again, and so on. This not only makes the lighting more realistic but also significantly simplifies work on each scene.

Of course, progress does not stand still, and it is quite possible that by the next installment's release the developers will achieve truly photorealistic images, and even mid-range video cards will hold a stable frame rate with RTX enabled. For now, though, given the minimal difference between "before" and "after" and the need for a top-end card to chew through the "magic rays," the whole situation reads roughly as: "You, dear players, pay more so that we can work less." Though back in 2013, Sylvain Trottier, producer of Assassin's Creed 4: Black Flag, which had serious technical problems at release, put it this way:



"Because of hardware limitations, console games require optimization. On PC, if a gamer lacks performance, he can simply buy a new video card."

And a new motherboard, processor, more RAM, and a faster SSD to go with it. SSDs, by the way, once the preserve of enthusiasts, long ago acquired must-have status, because games' demands on drive performance have grown enormously. And with them, the demands on drive capacity.

How many gigabytes would you like?


Over the past few years, the size of game installs has grown to incredible values. Where even the most elaborate open-world, content-packed projects once weighed a few tens of gigabytes at most, today a linear shooter can calmly eat up 150-200 GB of disk space without blinking. As of this writing, the top 10 heaviest games look like this.

  1. Call Of Duty: Modern Warfare - 175 GB
  2. Final Fantasy XV - 148 GB
  3. Gears of War 4 - 136 GB
  4. Call of Duty: Black Ops 3 - 113 GB
  5. Red Dead Redemption 2 - 112 GB
  6. Middle-earth: Shadow of War - 105 GB
  7. Call of Duty: Infinite Warfare - 101 GB
  8. Quantum Break - 76.5 GB
  9. Grand Theft Auto V - 76 GB
  10. Gears 5 - 66 GB

The numbers are very telling: it is immediately obvious who learns from their mistakes and at least occasionally thinks of gamers, and who does not. Gears 5 is the largest and most technologically advanced game in its series, yet it weighs half as much as the fourth installment while delivering an excellent picture. It turns out that if you work on optimization, you can achieve impressive results.


Red Dead Redemption 2 can hardly be faulted: a huge world, excellent graphics, and masses of content. Given Rockstar's trademark attention to detail, all of that naturally demands considerable disk space.


Meanwhile, the Call of Duty games keep growing, although the series has no particular graphical frills and no open world. How could a linear shooter balloon to 175 GB? Given the pace at which new installments ship, the answer is obvious: Activision simply leaves the developers no time to optimize.


Captain Price's whiskers cost gamers more and more every year.

These numbers make the need for a capacious drive more than obvious. But what about performance? Could an ordinary HDD still suffice? Unfortunately, no.

The table below shows RAM and VRAM consumption for some of the hungriest games of recent years (measured at ultra settings in 4K resolution).

Game                       | RAM, GB | VRAM, GB
---------------------------|---------|---------
Assassin's Creed: Odyssey  | 7.5     | 7.1
Battlefield V              | 12.8    | 7.7
Hitman 2                   | 9.5     | 8.1
Shadow of the Tomb Raider  | 8.9     | 8.8
The Witcher 3              | 9.6     | 4.3
Metro: Exodus              | 8.5     | 5.1
Far Cry 5                  | 9.8     | 9.9
Star Wars Battlefront II   | 10.5    | 8.3

How does drive speed relate to all this? Directly. For the processor and GPU to work with game files, those files must first be loaded into RAM and VRAM, respectively. The fastest gaming HDDs deliver about 227 MB/s. Install Battlefield V, which consumes a total of about 20 gigabytes of RAM and video memory, on such a hard drive, and fully loading all resources takes 20,000 / 227 ≈ 88 seconds, a full minute and a half! In practice everything works in a far more complicated way, of course, but even this rough calculation gives an idea of how long loading the next map for a virtual battle can take.
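The same back-of-the-envelope arithmetic in Python (a deliberate simplification: real loading is never one long sequential read, and the 2400 MB/s figure below is the SN550's sequential-read spec from the table further down):

```python
# Back-of-the-envelope load-time estimate from the article's numbers.
# Simplification: assumes one uninterrupted sequential read, which real
# game loading never is.

def load_time_seconds(data_mb, drive_mb_per_s):
    """Seconds to stream `data_mb` megabytes at a given drive speed."""
    return data_mb / drive_mb_per_s

data = 20_000   # ~20 GB of combined RAM + VRAM assets (Battlefield V)
print(round(load_time_seconds(data, 227)))    # fast gaming HDD
print(round(load_time_seconds(data, 2400)))   # PCIe x4 NVMe SSD
```

Even this crude model shows an order-of-magnitude gap: roughly a minute and a half on the HDD versus well under ten seconds on the NVMe drive.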

Moreover, while in confined, level-based projects drive speed affects only the wait for a location to load, in open-world games insufficient read speed hurts the minimum FPS. Even the most sophisticated computer cannot keep everything needed to render the 256 km² map of Assassin's Creed: Odyssey in RAM and VRAM: files are streamed in during play. And until all the necessary resources reach RAM and video memory, the processor and video card cannot prepare new frames, which threatens frame-rate dips in complex scenes.

In light of the above, a high-performance SSD in a gaming PC is no longer a luxury but a vital necessity, and the drive must be not only fast but also capacious, because games' appetites will only grow. Fortunately, even high-speed NVMe SSDs are quite affordable today.

Consider, for example, the 1-terabyte WD Blue SN550. At the time of writing, its price on Yandex.Market ranges from 10,500 to 11,000 rubles, only slightly more than a quality SATA SSD of the same capacity.


And unlike a number of its peers in the mid-budget segment, this drive uses four PCI Express 3.0 lanes instead of two, which makes it 60% faster: a stable 2400 MB/s in sequential reads versus the 1500 MB/s ceiling of NVMe SSDs limited to PCIe ×2. Against SATA drives, the WD Blue SN550 looks even better, outperforming them by more than 4 times.

Model                                  | WD Blue SN550 NVMe SSD | Noname PCIe ×2 NVMe SSD | WD Blue 3D NAND SATA SSD
---------------------------------------|------------------------|-------------------------|-------------------------
Capacity                               | 1 TB                   | 960 GB                  | 1 TB
Sequential read speed, MB/s            | 2400                   | 1500                    | 560
Sequential write speed, MB/s           | 1950                   | 1000                    | 530
Max random read, 4KB QD32 (IOPS)       | 410,000                | 120,000                 | 95,000
Max random write, 4KB QD32 (IOPS)      | 405,000                | 100,000                 | 84,000

Such a large gap is due to interface limitations: unlike NVMe, the serial SATA interface cannot fully unlock the speed potential of flash memory. The theoretical ceiling for SATA drives is a transfer rate of 768 MB/s, and in reality the best achievable is about 560 MB/s, whereas a single PCI Express 3.0 lane alone offers 985 MB/s of throughput. Four such lanes let the controller parallelize data streams across the chips and achieve far higher performance (which, incidentally, is why the highest-capacity model in the SN550 line is also the fastest).
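The lane arithmetic behind those ceilings can be checked in a couple of lines of Python (using the per-lane figure quoted above):

```python
# Interface-ceiling arithmetic from the figures above: a PCIe 3.0 lane
# carries roughly 985 MB/s after encoding overhead, so the link limit
# scales with the number of lanes a drive is wired to.

PCIE3_LANE_MB_S = 985

def pcie_ceiling(lanes):
    """Approximate PCIe 3.0 link bandwidth for a given lane count, MB/s."""
    return lanes * PCIE3_LANE_MB_S

print(pcie_ceiling(2))   # x2 link: enough for ~1500 MB/s budget drives
print(pcie_ceiling(4))   # x4 link: headroom for the SN550's 2400 MB/s
```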

Also important: the WD Blue SN550 uses 4th-generation BiCS 3D NAND flash. Its 96-layer dies have helped not only increase drive capacity but also improve reliability: the flagship model is rated for 600 TB written, versus 400 TB for a SATA SSD of similar capacity. In light of all of the above, the WD Blue SN550 makes an excellent choice for building a modern gaming PC, ahead of its competitors in price, performance, and reliability.

If you want to build the fastest machine possible and money is no object, look closely at the WD_BLACK SN750 NVMe SSD series aimed at hardcore gamers. WD's in-house eight-channel controller and a number of firmware optimizations brought the drives close to the interface's theoretical bandwidth ceiling: the fastest drive in the black line, the 1 TB model, boasts transfer speeds of up to 3470 MB/s in sequential reads and up to 3000 MB/s in sequential writes. The 2-terabyte flagship came out slightly slower (3400 MB/s reading and 2900 MB/s writing), which is explained by its use of higher-capacity multilayer chips of 512 GB each. But such a drive will easily hold the ten heaviest AAA games with no trouble.


The SN750, however, is interesting not only for its excellent speed but also for two important features. First, the NVMe SSD received a fairly massive heatsink which, combined with a revised PCB layout, efficiently dissipates heat from both the memory chips and the controller, allowing stable performance even under heavy load.

Second, the proprietary SSD Dashboard utility, which offers all the tools needed to monitor and service the drive, gained an important Gaming Mode switch.


When it is activated, the APST (Autonomous Power State Transition) power-saving features, which a desktop PC largely does not need, are completely disabled, minimizing latency on initial data access. All this makes the WD_BLACK SN750 an ideal choice for a high-performance gaming PC geared toward modern AAA titles, delivering near-instant loading and a stable frame rate even in very poorly optimized games.
