23

I understand that the game as a whole is around 80 GB, so at first glance, it makes sense that an update could be 15 GB. But isn't the vast majority of that game size graphic assets, like textures?

I'm sure the actual code is only maybe 500 MB or something, right? Even if they tweak something that affects the code of every quest in the game, why would it be 15 GB to download the patch on Steam? They can't be tweaking all the game textures every time there's a minor patch, can they?

5
  • 5
    Someone asked a similar question just the other day, see gaming.stackexchange.com/questions/402168/….
    – Timmy Jim
    Commented Jun 30 at 23:43
  • Maybe the new world is indeed significant!
    – m4n0
    Commented Jul 1 at 10:36
  • Perhaps the game uses a simple update mechanism that cannot update portions of files, but can only replace whole files. So if a 5 GB file has 10 MB of changes, the update size is 5 GB. Commented Jul 1 at 17:33
  • I think the real answer is some combination of the accepted answer below and the answer to the other question (gaming.stackexchange.com/questions/402168/…)
    – audiodude
    Commented Jul 2 at 1:07
  • 1
    @JohnGordon Steam's content system, SteamPipe, is pretty smart about updates. It's designed to only download the parts of files that have actually changed, even for large compressed files. This means you usually won't have to redownload an entire huge file if only a small part was modified. Most game developers structure their files to work well with this system, minimizing unnecessary downloads. Commented Jul 2 at 2:20

3 Answers

34

The size of game updates, even for minor patches, can indeed be surprisingly large. Developers from Fatshark, the studio behind Warhammer: Vermintide 2 and Warhammer 40,000: Darktide, have provided some insights into why this happens. The explanation revolves around the trade-offs game developers must manage between download size, installed game size, and loading times.

One key factor is resource duplication. As Fatshark_Hans explains:

In order to make load times faster, we duplicate resources in our resource bundles, so that you don't have to make many, many individual reads from disk. So for instance, if there's a torch that exist on most levels, that resource will exist in multiple level bundles.

This approach significantly improves loading times, especially for players with slower storage devices. However, it also increases the overall game size.

Another Fatshark developer, Fatshark_tazar, further elaborates on this point:

Basically it comes down to what is called bundling. Mechanical drives are really slow at seeking to new sections on the disks so to reduce the load times we basically have to take each "package", let's say a level to simplify it a bit, and take everything within that level such as models, textures, sounds, animations and much much more and create a blob of data that can be read sequentially without having to seek across the physical disks on a mechanical drive to find the file we need.

Fatshark_tazar notes that this method "reduces load times by at least a factor of five for mechanical drives." The downside, again, is duplicated data:

The downside though is that since the data has to be put sequentially on disk which means that we have to duplicate A LOT of data across all of our packages.

This duplication strategy helps to ensure smooth gameplay across various hardware configurations. As Fatshark_tazar points out:

Even if you have a super fast SSD and don't benefit from the bundling you still have to wait for the person with the slowest drive to load into a mission to have everyone synced.
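
To put rough numbers on the duplication cost, here is a minimal sketch. The asset names and sizes are invented for illustration (this is not Fatshark's actual data); only the arithmetic matters.

```python
# Hypothetical illustration of why per-level bundles inflate the install.
# Asset names and sizes are invented; only the duplication arithmetic matters.

shared = {"torch.tex": 120, "rubble.mesh": 200, "rat.anim": 180}   # MB, used on every level
levels = {"level_%02d" % i: 600 for i in range(1, 11)}             # MB of level-unique data

shared_total = sum(shared.values())          # 500 MB of assets common to all levels

# Stored once (no bundling): each asset exists exactly one time on disk.
stored_once = shared_total + sum(levels.values())                  # 6500 MB

# Bundled: every level's bundle embeds its own copy of the shared assets so the
# whole level can be streamed in with sequential reads.
bundled = sum(size + shared_total for size in levels.values())     # 11000 MB

print(f"stored once: {stored_once} MB   bundled: {bundled} MB")
# Patching any one of the shared assets now touches all ten bundles, which is one
# reason a small content change can translate into a multi-GB download.
```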

Fatshark_Hans mentions a solution to reduce the installed game size:

What we can do, is to release and update which is a 'remaster', meaning that you replace all your incrementally stored updates on disk, with the new 'master'. This would reduce the size of the install dramatically

However, this approach has its own drawback:

But then again, the download for that remaster update would be huge, because it would mean basically downloading the entire game again. Which isn't really feasible for a lot our players.

Fatshark implemented this approach for an update to Vermintide 2, as mentioned in their FAQ:

Q: Why is the download so large?
A: With the release of 2.0 (the patch that accompanies Winds of Magic) we made the decision to remaster the game. Whilst this requires one hefty download upon updating to the latest version, it comes with it some benefits which include both faster load times when playing, as well as a smaller overall file size on your computer.

The large size of minor updates is often a result of developers trying to balance the needs of players with various storage capabilities. As Fatshark_Hans succinctly puts it:

Making games is managing trade-offs. In this case it's a trade-off between the size of downloads, the size of the game on disk, and the time it takes to load resources off of the disk.

6
  • 25
    There can also be more technical reasons that nobody deems worth improving: imagine a file being compressed and then changed just a tiny bit at the beginning; it is conceivable that this shifts things around so that the resulting compressed file ends up totally different. There can be many more similar situations that add up and amplify each other.
    – PlasmaHH
    Commented Jul 1 at 7:20
  • 4
    Yes, this is the main reason why some games get huge patches. Many games put all of their non-code resources into one big compressed resource file. One tiny change to it becomes a huge patch out of necessity. This should be added to this answer. Commented Jul 1 at 18:19
  • 5
    @PlasmaHH @ChthonicOne You make a good point, but Steam actually has a solution for this. Their SteamPipe content system is designed to handle exactly this kind of problem: it only downloads the parts of files that have actually changed, even for big compressed files. Most developers structure their game files to work well with this system, so updates are typically much smaller than they would be otherwise. Commented Jul 2 at 2:15
  • As PlasmaHH said though, in a huge binary file, removing or adding one byte early on can cause a huge shift in all the bytes after it. The algorithm to detect changes like this is expensive, prohibitively so on very large files, and Steam does not use it. So, as a result, Steam will patch everything after the byte in question, resulting in a huge patch for a very small change. Add to that the fact that the file is compressed, so things aren't just shifted but also run through a compression algorithm, and it's even more complicated. Commented Jul 2 at 18:28
  • 1
    @ChthonicOne I believe most, if not all, game developers on Steam don't bundle resources into one massive file in such a way that a small change will cause users to redownload the whole file. Steam's guidelines actually recommend against this practice to make updates more efficient. Developers are advised to verify that update sizes match the actual changes made, and to investigate if there are unexpected differences spread throughout files. This means it's unlikely that a tiny change would trigger a massive download. Commented Jul 3 at 1:00
3

In addition to the issues mentioned by galactic ninja, and speaking as a developer:

It's often far easier to just reissue all of the files for a new build of a game than it is to effectively generate a patch-set that can be used to modify your existing installation. This is especially true for any binary, compressed, or encrypted files as these would essentially have to be deserialised/decompressed/decrypted, patched, and then reencoded.
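
To illustrate why compressed files in particular resist in-place patching, here is a rough, self-contained sketch using Python's zlib (not any specific engine's packing format): a tiny edit near the start of the input leaves very little of the compressed output byte-identical.

```python
import zlib

# A rough illustration: compress a large blob, flip a few bytes near the start,
# recompress, and see how little of the compressed output survives unchanged.
original = b"forest_level " * 200_000          # stand-in for a big resource bundle
modified = b"FOREST_level " + original[13:]    # tiny edit near the beginning

a = zlib.compress(original)
b = zlib.compress(modified)

# Length of the identical prefix before the two compressed streams first differ.
same_prefix = next((i for i, (x, y) in enumerate(zip(a, b)) if x != y),
                   min(len(a), len(b)))
print(f"compressed sizes: {len(a)} vs {len(b)} bytes")
print(f"identical prefix before first difference: {same_prefix} bytes")
# Past the first divergence the streams rarely realign, so a byte-level patcher
# ends up shipping most of the file even though the source change was tiny.
```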

In 2024, when most people have high-speed broadband, a lot of developers just don't consider it worthwhile to put in the engineering effort to make this work effectively, especially considering the other trade-offs.

1

Short answer: Reading lots of small files through the file system is too slow for game loading, so all the data is stored in a few large files. Performing and managing patches to existing large files is complicated; it's much easier to just have the user download the file again.

Long answer: For game loading you want as few read instructions as possible, and you want the read data to be sequential. Therefore most game files are large packs containing elements appended to each other sequentially. You then have an index (giving the starting position and length of each element) that is loaded into RAM. This lets you fetch any element in a single sequential read.
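
As a concrete, deliberately simplified sketch of that idea, here is a toy pack format (invented for illustration, not any real engine's) where an in-memory index maps each element name to an (offset, length) pair:

```python
# Toy pack format: elements are appended back to back, and a small index maps
# each name to (offset, length) so any element is one seek + one sequential read.

def write_pack(path, elements):
    index = {}
    with open(path, "wb") as f:
        for name, data in elements.items():
            index[name] = (f.tell(), len(data))
            f.write(data)
    return index  # in a real engine this would be serialized and loaded into RAM

def read_element(path, index, name):
    offset, length = index[name]
    with open(path, "rb") as f:
        f.seek(offset)
        return f.read(length)  # single sequential read

index = write_pack("level01.pack", {
    "torch.tex": b"\x00" * 64,     # placeholder asset bytes
    "door.mesh": b"\x01" * 128,
})
assert read_element("level01.pack", index, "door.mesh") == b"\x01" * 128
```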

You can also sequence elements, even if this means repeating elements, so that one sequential read will load all elements for one 'level' of your game.

Then an update occurs... In the best-case scenario the elements being updated are exactly the same size, so you can simply overwrite each occurrence of that element. But if many elements change size, you now have to write an entire new version of the pack file by merging the unchanged data and the patched data into a new sequential file.
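
Continuing the toy format from the sketch above (still invented for illustration): a same-size patch can be written in place, while a resized element shifts the offset of everything stored after it and forces a full rebuild.

```python
# Same-size replacement is a cheap in-place overwrite; a resized element
# invalidates every offset after it, so the whole pack has to be rebuilt.

def patch_in_place(path, index, name, new_data):
    offset, length = index[name]
    if len(new_data) != length:
        raise ValueError("size changed -- the whole pack file must be rebuilt")
    with open(path, "r+b") as f:
        f.seek(offset)
        f.write(new_data)

# Using the pack and index built in the earlier sketch:
# patch_in_place("level01.pack", index, "torch.tex", b"\xff" * 64)   # fine: same size
# patch_in_place("level01.pack", index, "torch.tex", b"\xff" * 65)   # ValueError: rebuild
```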

The problem with doing that is if you patch version 1 to 2, then 2 to 3, you now have to account for someone who wants to go from 1 to 3. Do you make them update twice? Ideally you have them download the updates for 1->2 and 2->3 and apply both. Now imagine how complicated this can get over the years. It may end up being slower for the user to download all the patches and write the new file than to just download the entire file again.
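
A minimal sketch of that chaining problem, with placeholder delta contents (the delta format here is just a stand-in, not a real binary-diff tool): a player several versions behind downloads and applies every intermediate delta in order.

```python
# Each delta only knows how to turn version N into N+1, so a player who skipped
# releases must fetch and apply the whole chain.

deltas = {(1, 2): b"|delta_1_to_2|", (2, 3): b"|delta_2_to_3|"}

def apply_delta(pack: bytes, delta: bytes) -> bytes:
    return pack + delta          # placeholder for a real delta format (bsdiff, xdelta, ...)

def update(pack: bytes, have: int, want: int) -> bytes:
    downloaded = 0
    for v in range(have, want):
        delta = deltas[(v, v + 1)]
        downloaded += len(delta)
        pack = apply_delta(pack, delta)
    print(f"downloaded {downloaded} bytes of deltas to go from v{have} to v{want}")
    return pack

update(b"pack_v1", 1, 3)
# After enough releases, the chain of deltas can approach -- or exceed -- the size
# of simply shipping the rebuilt pack file once.
```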

Or... you can just not bother with patching and have them download the entire game again.

Often a combination of the above is used. Assets are patched when they can be. But when a larger change occurs, or a change to the patch system or how you index or sequence data, you just have the users download the entire game all over.
