Official BF Editor Forums

Dices Home Made Troubles

Recommended Posts

I wanted to post this quote from Mikael Kalms (the DICE dev working on patching BFBC2 - seemingly alone!) on the EA Forums. To me it's an interesting read and gives an idea of the fucked-up way DICE is working at the moment. Those of us who have been looking at the fbrb files in detail may be particularly interested.

Developing a game is developing a software suite, and a dataset that will go along with the software. Users will use the software (= run the game executable) to manipulate the dataset (= use mouse/keyboard/joystick to control the game).

A good game requires that the capabilities of the software match the dataset that is being created. The software is usually refined in parallel with the dataset. In other words, the game engine and the game worlds are tailored to each other, to some extent.

The programming side of making a game corresponds fairly much to developing other kinds of software. You usually use mature languages, standardized tools, and techniques which were pioneered back in the 1960s to create a set of source code, which when built creates a single game executable.

Creating the content is less straight-forward. Sometimes there are tools that do the job well (Maya, Photoshop, SoundForge, Cubase etc). For other kinds of content, there are no good tools, so the game developers develop their own.

Raw source format is not a good format for distributing a game engine to end users. One could ship the code as-is, but that would require people to have the right compilers and software SDKs available on their machines. Distributing the code in the form of a game executable is more practical.

Raw source format is not a good format for distributing the game content either. It is convenient for editing, but the game engine will usually want the content in a different format before using it. The raw source format is often bulky, and the conversion step is lengthy. Therefore, game developers usually create custom tools which "cook" the data -- convert it from source format to something that suits the game engine better.

Cooking is good, and bad.

No cooking gives you the advantage that you can change a source file, restart the game, and the effect happens immediately in the game. It is usually easy to figure out which files on-disk contribute to what in-game.

With no cooking -- or just too little cooking -- you get very long loading times. You usually get more memory fragmentation. You also lack mechanisms to validate the integrity of the data; if you want to see that it's consistent, you have to play through the full game and exercise all aspects of the game.

Cooking gives you the advantage that you can do lots of sanity checks on the data before even starting the game. You can improve loading times a lot (especially off media with slow seek times such as DVD), and you get less memory fragmentation.

You can also create extra datastructures, which improve runtime performance. (A good example of this is the BSP & PVS trees that were used in FPses back in the 90s.)

With too much cooking, you find that it is difficult to change anything when data already has been cooked. If you want to edit just one tiny little detail, you have to re-cook ALL the data. It is difficult to tell which files on-disk contribute to what in-game.
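The cook-and-pack step he describes can be pictured as a batch conversion followed by an archive pass. Here is a minimal sketch - the file formats, header layout, and names are entirely hypothetical (Frostbite's real formats are not public):

```python
import hashlib
import json
import struct

def cook_asset(source_bytes: bytes) -> bytes:
    """Convert a 'source' asset into a hypothetical runtime format:
    a tiny header (magic + payload length) followed by the payload.
    A real cooker would also swizzle textures, strip editor-only
    data, triangulate meshes, and so on."""
    return struct.pack("<4sI", b"COOK", len(source_bytes)) + source_bytes

def pack_archive(assets: dict) -> bytes:
    """Pack cooked assets into one blob behind a JSON table of contents,
    so the engine can seek straight to each file. Fewer disk seeks and
    fewer small allocations mean faster loads and less fragmentation."""
    toc, body = {}, b""
    for name, data in sorted(assets.items()):
        cooked = cook_asset(data)
        toc[name] = {"offset": len(body), "size": len(cooked),
                     "md5": hashlib.md5(cooked).hexdigest()}
        body += cooked
    toc_bytes = json.dumps(toc).encode()
    return struct.pack("<I", len(toc_bytes)) + toc_bytes + body

archive = pack_archive({"mesh/tank": b"vertices...", "tex/tank": b"pixels..."})
```

The checksums stored in the table of contents are also what make "sanity checks before even starting the game" cheap - the downside being, as described below, that editing one file means rebuilding the whole blob.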

Now let us consider the Frostbite engine and where it comes from.

It was initially used to create BFBC1. This game was released only for consoles. This means that the team which developed BC1 had to do a certain amount of cooking - mainly to avoid excessive load times and memory fragmentation. The hard memory and performance constraints of running on a console also made it more important to pre-compute some things, and to package data into suitable archives.

With this foundation, we essentially had a game engine which solved a lot of time-consuming problems for us when we began on BFBC2 PC. Loading times would be under control, and it would be easy to figure out which files go into which archives, and which files/archives belong to which level. These are things which are often overlooked when not using a game engine that has already been used to ship games on DVD.

We also wanted a way to automatically patch the game. Making an auto-patcher that works properly under Windows XP, Vista & Win7, knows about limited users & UAC, and can handle restarting mid-way through a patch at all takes a huge amount of time.

Therefore we took the auto-patcher from BF Heroes and modified it slightly. Voila, the game could now pull down updates from the internet and apply them to its own datafiles. We were all set.

Or so we thought.

Some complex systems seem simple on the surface; it is only when you look under the hood that they turn out to be tricky.

The tools which "cook" the game's datafiles take in a set of original files, roughly 80GB in size. Most of the files here are organized by function ("this is a mesh, this is a texture, this is an animation, ...") rather than by location (which level it is used on). The tools will process the dataset once per level, extract the subset that applies to the current level, and convert it to a format suitable for the game engine.

This allows the tools to do some per-level precomputations easily; for instance, since all the pixel/vertex shader combinations that will be used throughout the entire level are known, the tools pre-generate all these combinations and store them in a "shader database". (BF2142 generated & compiled the shader combinations during first load - that's one reason why the first launch of a level was very slow there.)
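The per-level shader pre-generation could be sketched like this - a toy enumeration of permutations, with made-up material/light/fog axes standing in for whatever dimensions Frostbite actually varies over, and compilation stubbed out:

```python
from itertools import product

def build_shader_db(materials, light_modes, fog_modes):
    """Enumerate every shader permutation the level could request and
    'compile' each one ahead of time (compilation is stubbed out here).
    Doing this offline is why first load is fast - and why the result
    is an opaque, all-or-nothing blob that is hard to patch piecemeal."""
    db = {}
    for combo in product(materials, light_modes, fog_modes):
        key = "/".join(combo)
        db[key] = ("compiled(" + key + ")").encode()  # stand-in for bytecode
    return db

db = build_shader_db(["metal", "foliage"], ["sun", "night"], ["fog", "clear"])
```

Note how the permutation count multiplies: two options on each of three axes already gives eight compiled shaders, which is why these databases grow large and why adding one object can touch many entries.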

After this is done for all levels, there are a bunch of archives for each level. This is ideal for quick loading times and no memory fragmentation, but it wastes diskspace unnecessarily. The result is probably about 15GB in size.

In order to make a better tradeoff between diskspace and load times, an extra processing step has been added; all the level-archives are compared, and any datafiles which appear inside many of these level-archives are moved to a level-common archive. So when loading a level, everything inside the level's own archives and the level-common archive has to be processed. This reduced the total data size down to about 6GB.
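The level-common extraction step he describes is essentially a frequency count over the per-level archives. A rough sketch, assuming files are matched by name plus content hash (the real tooling is unknown):

```python
from collections import Counter

def split_common(level_archives: dict, min_levels: int = 2):
    """Given {level: {file_name: content_hash}}, move any file that
    appears with identical content in at least `min_levels` levels
    into a shared 'level-common' archive, and drop it from the
    per-level archives. Trades a little extra load-time work for a
    large disk-space saving."""
    counts = Counter()
    for files in level_archives.values():
        for entry in files.items():     # (name, hash) must match exactly
            counts[entry] += 1
    common = {name: h for (name, h), n in counts.items() if n >= min_levels}
    slimmed = {
        level: {name: h for name, h in files.items()
                if common.get(name) != h}
        for level, files in level_archives.items()
    }
    return common, slimmed

levels = {
    "harvest_day": {"tex/grass": "aaa", "mesh/silo": "bbb"},
    "valparaiso":  {"tex/grass": "aaa", "mesh/boat": "ccc"},
}
common, slimmed = split_common(levels)
```

Loading a level then means mounting both its own archives and the level-common archive, exactly as the post describes.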

This technique allowed BFBC2 to fit on a DVD, both for the console and the PC versions. It is not perfect by any stretch, but dealing with these large amounts of data is time consuming, and therefore you don't try every idea under the sun - rather, try what seems most likely to work first, and then keep on until the end result is good enough.

So this is all awesome when shipping the game. Where do the problems begin?

When you begin creating patches.

First off, it turns out that the tools that "cook" the data don't produce binary-identical data all the time. The result after cooking is always functionally identical, but the bit-content may differ slightly. Items in unordered lists change order, uninitialized fields contain random data, that sort of stuff.

Why didn't this get caught sooner? Because you only notice problems of this kind when re-cooking the same data several times, from scratch. And re-cooking all BC2 data takes over 48 hours on a high-end computer. And locating & correcting all the places where cooking isn't deterministic down to the bit level would take a lot of time (both calendar time and effective development time). Perhaps that time is better spent elsewhere?

So. If different "cooking" runs produce slightly different results, it is suddenly difficult to look at version A and version B of the data and answer the question, "what are the differences between these two datasets?". It's easy when looking at the source data, but when looking at the cooked data there are a lot of changes which have no effect on the final game experience.

There are about 40,000 source files, and these result in well over 100,000 cooked files. Going through those by hand is not an option. Writing a perfect filter, which knows which differences are benign and which are real, would take as much time and effort as making the cooking 100% deterministic. That is not an option either.

So you make a filter which does something in between; be smart, create rules for the file types you know about, and when in doubt - assume that the change is for real. Err on the side of caution.
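Such an "in between" filter might look like this: per-type normalization rules for differences known to be benign, with a byte-for-byte fallback for anything unrecognised. The file types and rules here are invented for illustration:

```python
def normalize(file_type: str, data: bytes) -> bytes:
    """Apply per-type rules for differences known to be benign; anything
    we don't recognise is compared byte-for-byte, so an unknown change
    is always treated as real (err on the side of caution)."""
    if file_type == "list":    # unordered list: reordering is benign
        return b"\n".join(sorted(data.split(b"\n")))
    if file_type == "padded":  # trailing pad bytes are uninitialized
        return data.rstrip(b"\x00")
    return data                # unknown type: every difference counts

def really_changed(file_type: str, old: bytes, new: bytes) -> bool:
    """True if two cooked files differ after benign noise is removed."""
    return normalize(file_type, old) != normalize(file_type, new)

assert not really_changed("list", b"b\na", b"a\nb")      # reordered: benign
assert really_changed("mesh", b"\x01\x02", b"\x02\x01")  # unknown: real
```

The cost of this compromise is exactly the one described later in the thread: any file type the filter doesn't understand gets shipped in the patch whether it truly changed or not.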

Then you realize that those shader databases were never designed to be extendable. What happens when a new object is added to a level in a patch? Its mesh & textures are included, no sweat, but what about the shader combinations? How does one add something to the shader database, when the shader database is an opaque binary block whose entire contents may change when just one object is added to the level?

(One shader database is about 5MB. There are three shader databases per level - one each for DX9, DX10 and DX11.)

And finally, the patch system itself. Yes, it can replace portions of files on-disk. But due to its heritage (from BF Heroes), it is not able to open BFBC2's archive files and apply differences to individual files within the archives.

The only straightforward way is to make all patched-in content arrive in archives on the side of the original archives.
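Shipping content "on the side" typically works by mount order: archives mounted later shadow the originals, so patched files win without the shipped archives ever being rewritten. A toy sketch of the idea (not DICE's actual implementation; the paths are made up):

```python
class VirtualFileSystem:
    """Later-mounted archives shadow earlier ones, so a patch archive
    dropped alongside the originals overrides shipped content without
    the patcher ever opening the original .fbrb archives."""

    def __init__(self):
        self.mounts = []                 # searched newest-first

    def mount(self, archive: dict):
        self.mounts.insert(0, archive)   # newest mount wins lookups

    def read(self, path: str) -> bytes:
        for archive in self.mounts:
            if path in archive:
                return archive[path]
        raise FileNotFoundError(path)

vfs = VirtualFileSystem()
vfs.mount({"levels/port/terrain": b"shipped"})   # original DVD archive
vfs.mount({"levels/port/terrain": b"patched"})   # side archive from a patch
assert vfs.read("levels/port/terrain") == b"patched"
```

The drawback follows directly: every patch adds whole new archives next to the old ones, so the install grows and each patch carries everything that drifted since the DVD - which is the ballooning described below.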

Given the above scenario, we end up with the situation that we have today.

Each patch gets larger than the previous, because the game drifts ever further away from what was shipped on the DVD. Changes that require shader database updates make the patch balloon in size. And we have to be careful and clever when selecting which file types to include and which to ignore when creating the patch.

And that's where we finally ran into real problems. It was too difficult for one person to identify which changes were required and which were not, and how to update the patch generation process to accommodate the latest set of changes. Most of the delay of Client R8 was because there are very few people at DICE who have the in-depth knowledge of the far-spanning corners of the game engine *and* the cooking tools *and* the patch generation process, to work out what is going wrong, why, and how to fix it.

The new content did work a long while ago - but back then the patch was approximately 7GB. The patch had to get down to less than 1GB, or else some customers in Australia and South Africa would not be able to download it due to bandwidth caps.

[As an aside - how about distributing the patch as an installer, for those who prefer that option? We would love to do so, but creating an installer that does anything beyond the absolutely most ordinary installer-y things requires ridiculous amounts of development time.]

I think that we have a proper Client R8 patch ready for release, but the approach we have been using thus far has been taken as far as it can go. (A bit too far, considering the delays.) We want to continue supporting the game, but if we want to change anything else than the game executable itself, we will need to spend time on figuring out how to do so with a less error-prone patch generation procedure ... and preferably without generating as large patches.

This stuff keeps me awake at night. What are your demons made of?

Aside from the fact that he basically says there won't be another BFBC2 patch anytime soon, their choice to just reuse the Heroes updater without even adding the ability to open fbrb files (what, 1 or 2 days' work max?) is truly astounding. To me that attitude and approach really doesn't bode well for BFBC2, BFBC2: Vietnam or BF3.

Edited by PiratePlunder

dunno what to say about this :blink:

welcome to the future of game development :/

I really wonder about DICE's/EA's marketing strategy over the past 2 years... all that strange stuff with BF1943, Heroes etc...

to me it looks like a group of pretty drugged modders coming together, hanging around in a huge room for years.

probably a room of drugged modders would actually be a better solution than the weird stuff DICE produced during the past couple of years :lol:

they promised so much to the community, and not even half of it turned out like they told us it would.

but well, it's summer here - time to get some tasty ice tea and another actinocutitis B)

Edited by stevenB187

That means I have to spend a day downloading 900MB with my ultra-slow connection for some minor tweaks they made, because their patcher can't touch a single file inside the game's archives?

Online games depend strongly on regular updates and patches - see BF2 - so why did they create an engine that is so hard to update? Quite funny if you keep in mind that DICE has never released, and probably never will release, a Battlefield game that works without patches from the day it comes out.

Most of the delay of Client R8 was because there are very few people at DICE who have the in-depth knowledge of the far-spanning corners of the game engine *and* the cooking tools *and* the patch generation process, to work out what is going wrong, why, and how to fix it.

This passage concerned me the most. I thought DICE developed the Frostbite engine themselves, and now they tell us they don't have people with in-depth knowledge of it?


That means I have to spend a day downloading 900MB with my ultra-slow connection for some minor tweaks they made, because their patcher can't touch a single file inside the game's archives?

that's the same system they used for BF2 1.5 - just replacing the archives outright, because they couldn't be buggered trying to update only the few files that needed changing (which is what RTPatch did for previous BF2 patches)

i figured that even with the included booster content, BF2 1.5 should have come in at somewhere around 500-600MB max if it had been handled more efficiently, instead of the 2GB monster we ended up with.

i updated BC2 yesterday and it took friggin' AGES, and i'm still not sure what the benefits were for singleplayer. needless to say i won't be updating it again without good reason...


wow!... what a freakin' mess... go back to your roots DICE, you know you want to :lol:


all the guys who know how to tweak the Frostbite engine are probably too busy working on MOH.


it's becoming more and more disturbing that DICE keeps making new games without finishing the games they have already released.


It's interesting that he doesn't mention one of the main reasons for 'cooking' the data - to make it as inaccessible as possible to modders :P

The 'update' problems were solved a long time ago for other games (like the RTPatch mentioned by clivewil). But if they want to 'cook' everything on their own, it's their own fault for not testing the system in depth before releasing their semi-finished files. Simply relying on fool-proof software was obviously not an option for our Swedish friends.

Still glad i didn't spend money on BFBC2.


Msch - exactly. Sure, there's no doubt a lot to be gained by 'cooking' some of the files. But the DDS textures - there's almost no additional compression there, as remdul so perfectly showed within 24hrs of the beta release :lol: Clearly done just for secrecy, and as was proved, it achieved bugger all.

Not adding the functionality to update the fbrb files is just shockingly lazy. Seriously, it's not difficult - the only difficulty is reverse engineering some of the bizarre compression/no-compression rules DICE use, but they'd know those before they start.

Also, how difficult would it have been to include some kind of simple file check, so that if your 900MB update fails halfway through you don't have to start again from the beginning? I'm on update attempt number three and counting!
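The kind of resumable check being asked for is straightforward in principle: publish per-chunk checksums alongside the patch, verify what survived the failed download, and refetch only the bad chunks. A toy sketch (chunk size shrunk to a few bytes for the example; a real patcher might use 1MB chunks):

```python
import hashlib

CHUNK = 4  # tiny for demonstration purposes only

def manifest(data: bytes) -> list:
    """Checksum every fixed-size chunk of the patch, so a restarted
    download can verify what it already has on disk."""
    return [hashlib.sha256(data[i:i + CHUNK]).hexdigest()
            for i in range(0, len(data), CHUNK)]

def chunks_to_refetch(partial: bytes, full_manifest: list) -> list:
    """Return the indices of chunks that are missing or corrupt in the
    partially downloaded file."""
    bad = []
    for idx, digest in enumerate(full_manifest):
        piece = partial[idx * CHUNK:(idx + 1) * CHUNK]
        if hashlib.sha256(piece).hexdigest() != digest:
            bad.append(idx)
    return bad

patch = b"abcdefgh"        # stand-in for the 900MB patch payload
m = manifest(patch)
half = patch[:4]           # the download died halfway through
assert chunks_to_refetch(half, m) == [1]   # only the 2nd chunk is refetched
```

Only the manifest (a few KB of hashes) needs to be downloaded up front, so a failed 900MB transfer costs at most one partial chunk.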

It's a shame 'cos I'm actually really enjoying BC2. I definitely prefer it to BF2 - different game modes, more intense fighting, no jets, fairly good balance (helis and armour are quite vulnerable thanks to the tracer dart) etc. Of course some will think those aren't plus points. To me, if it had mod support it would blow BF2 away.

Ultimately it's this lazy attitude that worries me the most about DICE at the moment - which is also making me more and more sure that BF3 won't have any mod support either.


And then that brings us to the flipside. While Bad Company 2 might be a mess that we can only wade through and make minimal progress with, it sounds to me like they might be rethinking the extent to which they cook the data for future DICE games on the Frostbite engine. DICE will probably do something like make a "patch" folder that gets loaded afterwards, and modify the engine accordingly. That way patches would be easy to manage.

I don't think the .res files were done for secrecy so much; I'm sure it's because DDS compression is easily implemented, gives good file sizes, and looks pretty good. So the modified header may work better to provide the engine with information it wants... maybe. I'm sure they modified the format slightly though, because a lot of games do that with their DDS clones (like swapping channels or doing something special with normal maps). I don't think it's laziness - I think it's that they over-optimized and no longer recognise their own creation. The beast has escaped!


You may be right that it's not so much about secrecy - although if not, it's probably a console thing. I still see no reason why you would want to change the DDS textures on the PC platform - you can swap channels and use them for whatever you want, whether it's a standard DDS format or a bespoke DICE format. All the information is there.

As for the patching thing, I'm not so sure how much will change. Reading that post from DICE, there seem to be three issues.

1) Asset archives are split by level, rather than like BF2, where they were split by the type/nature of the asset.

2) Shader archives are precompiled, rather than compiled on first map load like BF2.

3) The lazy auto-update app can't modify fbrb archives.

To change (1) would be fairly simple, but it would surely carry a significant performance hit - surely that's why they did it in the first place. To change (2) would be a little more complicated but would simplify things (at least on PC) - however, at just ~5MB per shader database is it really necessary? (3) is simple to change and should never have been like that.

There is then that comment about file sizes changing each time they cook the same object. Now that doesn't directly affect patching, other than that it must be a pain in the ass for whoever is building the archives. Interesting to think about what is happening there.

Remember at the very beginning we were finding files of the same type, some with headers, some without, and some with different header sizes - those very first Bink videos were a good example. Surely that can't be as simple as different operators using different cooking settings, or older versions of the cooking routines? Otherwise it's hard to see how a compiler (a bad word for it, as the body of the files remained exactly the same - the files were either exactly the same size and just had to be renamed, or were slightly bigger due to the RES header being added, which again makes you wonder why they were RES files at all) could take exactly the same file type and create different outputs.


Exactly - the file size issue is at the cooking stage, nothing to do with patching. Unless they fairly frequently cook everything at once [I imagine this is the case] rather than just a file at a time, I don't really see the issue. A pain in the ass when comparing file sizes as a way of spotting updated files, I guess.

Either way, it's got to be sloppy code at some point. That seems to be par for the course for DICE at the moment.


that's the same system they used for BF2 1.5 - just replacing the archives outright, because they couldn't be buggered trying to update only the few files that needed changing (which is what RTPatch did for previous BF2 patches)

i figured that even with the included booster content, BF2 1.5 should have come in at somewhere around 500-600MB max if it had been handled more efficiently, instead of the 2GB monster we ended up with.

So we basically re-downloaded the whole core game with that patch?

Also, is this the reason why BF games have patches that create more bugs than they fix, or is it just the latest trend with the Bad Company series?


I think people just kick up so much of a fuss on the forums that DICE rushes to fix the bugs without thoroughly testing the implications of their fix.

The 1.5 patch was funny; I love DICE's use of the placebo effect. They said they'd made the J-10 worse and the F-35 better, and most players seemed to think they had, but in reality the only thing that changed was the "modified by" date - the files were still byte-for-byte the same. So I suppose it's just psychological balancing.


Hmm, this was based on what I saw in the 1.5 Public Beta, so I suppose it's possible they made the change after the beta.

But yeah, since the J-10 was my favourite jet, I crawled through the Beta patch byte by byte and found that neither the F-35 nor the J-10 had been modified in it.

Edited by UberWazuSoldier

they made the change after the beta

This makes things worse. It's those kinds of patches that require DICE to always 'fix after the patch', and we end up with - uhm, wait - 4 or 5 patches for BF2 1.0 :o I wonder why they beta test and then tamper with the files again afterwards.

Anyone remember the BF1942 patch orgy? We had about the same number of patches there.

Edited by mschoeldgen[Xww2]

Battlefield 3 won't have modding abilities - so said a guy from DICE. True

yeah i heard that too.

no bots, no modding, no reason to buy it. ah well, at least i'll save $100.

it still looks to me like ArmA2 is the only contender left as the next moddable Battlefield-style game.


at least i'll save $100.
Are you saying they really plan to charge that much? - and probably 'download only'?

That really does it - goodbye DICE, I'm going back to modding BF1942 - besides BF:V, the best game to mod in the whole BF series :P If only it supported more than one CPU core :(

