It's weird, though, because a lot of musicians from way back still have their masters, mixes, etc. I'm not sure why code in gaming isn't kept the same way.
This is the part of the show where I get to say (in all kindness), like in an American commercial:
Three things were common back in the 90s that caused a lot of problems: tape backups, really expensive version control systems, and studio instability.
Tape backups are often more volatile than other methods such as optical media, HD backups, and especially modern data centers. Expect a certain amount of corruption over time even in good storage conditions. Also expect that the purpose of tape backup was never for posterity, but for emergencies during and just after development.
Versioning is in really good shape right now, but systems from that decade, if you had one at all (CVS back then, SVN a little later), were a much bigger pain to work with. Code integration and merging could really suck if your dev practices weren't tuned to perfection (and they often weren't in old skool studio environments). We have excellent practices now, but that's a more recent development than people realize. The gold standard for versioning, git, only came out in 2005, and many other systems have iterated on the standard git set.
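For contrast, here's roughly what that once-painful branch-and-merge dance looks like with git today. This is just a sketch; the repo, file, and branch names are made up for illustration:

```shell
# Hypothetical demo: branch off, do work in isolation, merge back.
mkdir demo-repo && cd demo-repo
git init -q
git config user.email "dev@example.com"
git config user.name "Dev"

echo "core loop" > engine.txt
git add engine.txt
git commit -q -m "initial engine code"

# Remember whatever the default branch is called (master or main,
# depending on git version/config).
main_branch=$(git symbolic-ref --short HEAD)

# Work happens on an isolated branch...
git checkout -q -b feature-save-system
echo "save system" >> engine.txt
git commit -q -am "add save system"

# ...and merging it back is a single command.
git checkout -q "$main_branch"
git merge -q feature-save-system

cat engine.txt
# → core loop
#   save system
```

In the CVS era, every file was versioned independently and branching was expensive enough that teams often just avoided it, which is part of why integration hurt so much.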
Third, studio volatility is something that's still with us. It still affects source code now, but with advances in backup and versioning, it's not nearly as bad. Publishers and studios have also realized that it's in their best interests to have best practices. Picture the chaos going on in Interplay as multiple groups start getting laid off, transitioning to new studios, or leaving the industry entirely. Motivations and personal interests start diverging, and it's very possible for things like code backups to get lost in the mix *even though the code is essentially the primary product they sell*.
Anyhow, your analogy is understandable but not really correct. Music studio work comes from a different set of traditions than code development; there, the art is at the forefront. Code development extends from electronics engineering, and because code was originally understood as "the instructions to make the computer run", what it could do was devalued. It's grown to take over society since then and glommed on a bunch of very human practices (like readability, the modular chunking we call "object orientation", variable naming, and comment conventions) that form a kind of business management, sociology, math, and linguistics hybrid. Art came late to the scene.
What we see in game development, except for the single indie auteur like ConcernedApe (Stardew Valley), is honestly more like movie making: an absolute mess of conflicting interests that swirl together like a tornado under the guidance of a director and producers. Music studios do this too, but remember that there's only one product at the end: the music. Movies and games bring together multiple artistic and non-artistic (cue gasps from programmers who do view their work as art) disciplines. A lot has to happen to get these things out the door, and it naturally involves the particular interests of publishers with money, who often don't think much of historical disciplines like archival work and preservation, focusing instead on things like not letting code and trade secrets loose on the world. Some, like Richard Stallman, hate this, but I'm sympathetic.
Incidentally, the movie industry has nearly as bad a preservation problem; people like Martin Scorsese have devoted much of the latter half of their lives to recovering old, decaying films.
In the end, things are better now, but not perfect. Expect that a brand new art form that originally required big bucks and investors would have trouble with this sort of thing: partly from technology issues, partly because people back then just didn't know any better and didn't have formal guiding theories for how dev should properly be done, and partly because the nature of the beast involved money, which changes incentives pretty quickly.