The Relentless Pace of Technology

Game making is a creative art form that competes with other media such as novels, television, movies, and music. While technology has had dramatic effects on how music is recorded, how film is shot, how television is delivered, and even how a novel is typed, none of these other art forms has to keep up with technology at anywhere near the pace game making does.

Movies are probably the closest art form in scope, cost, and high-level production methods. That said, camera technology stays stable for 20 years at a stretch, lights are lights, and microphones are microphones. Right now the movie industry is looking at moving to digital film, but even that relies on digital technology that has been in regular use elsewhere for 20 or more years.

In the past 25 years that electronic games have been a consumer entertainment medium, they have gone through nearly countless technological evolutions: text adventures, 2D graphic adventures, turn-based strategy games, 3D action games, smooth-scrolling 3D action games, ray-casting engines, binary space partitioning (BSP) engines, and I could go on and on listing the different game engines that have been created.

Each new game must develop its own tools first and then create its content. Add-on and expansion packs will often use the same engine, and in some cases a sequel will use a modified version of the prior game's engine. In the last five years it has become increasingly common to license a whole game engine such as Quake or Unreal as the foundation on which to build a game. A game requires not only a solid design but also a completed engine and tool path before entering the implementation or production phase; otherwise the inevitable result is redoing work over and over, which is demoralizing, expensive, and a waste of time.

This shifting engine technology is not seen in any other consumer software product. No consumer operating system, word processor, or spreadsheet has required the computing power of the last five or ten years of Intel's advances to the x86 line of chips. It is games that drive our voracious appetite for more RAM to hold our textures, gigabytes of hard drive space to hold our gigabyte installs, and the fastest CPU on the planet to simulate our fantasy worlds.

The dark side of this technological advance on the PC side of the game business is that we do not know what hardware consumers will have when they install and run our software. We do not know whether they have 64 MB of RAM, 128 MB, or just 32 MB of main memory. We do not know whether they have a 3D accelerator card with 8, 32, or 64 MB of RAM, or no 3D card at all! We do not know whether they will have enough space to install our game in its full glory, so we provide multiple install options. We do not know whether their graphics card chipset will support the subset of features we want for our game. We do not even know how fast the target CPU is.

In fact, we do not even know what operating system they will be running our game on. Sure, it will be a Windows variant, but there must be big differences between Windows 95, Windows 98, Windows NT, Windows 2000, Windows ME, and Windows XP, or Microsoft would not have put thousands of man-years into these operating systems. These operating systems have major differences in critical low-level functionality such as how memory is accessed and protected, how timers are created and what their resolution is, and how efficiently data is stored on and retrieved from the hard drive.

There are people out there playing Starfleet Command 1 with the graphics options turned low on laptops with only a Pentium 90 MHz and no 3D card, and there are also folks out there with a Pentium 4 at 1.7 GHz and a GeForce 3 card with 64 MB of memory just on the card. Depending on which metric you use, the Pentium 4 at 1.7 GHz is nearly twenty times more powerful than the Pentium 90. This is Moore's Law at work: the observation that computing power roughly doubles every 18 months. The roughly seven years between those two chips allows for four to five doublings, or a twenty- to thirty-fold increase in power.
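To make these unknowns concrete, here is a minimal sketch, assuming a Win32 target of the era, of probing the player's machine at startup. Note that GlobalMemoryStatusEx requires Windows 2000 or later; on Windows 95/98/ME a game would fall back to the older GlobalMemoryStatus call. The output formatting is purely illustrative.

    #include <windows.h>
    #include <cstdio>

    int main()
    {
        // How much physical RAM does the player actually have?
        // (Requires Windows 2000+; older systems need GlobalMemoryStatus.)
        MEMORYSTATUSEX mem = { sizeof(mem) };
        GlobalMemoryStatusEx(&mem);

        // Which Windows variant is this? (9x/ME vs. NT/2000/XP)
        OSVERSIONINFOA os = { sizeof(os) };
        GetVersionExA(&os);

        printf("Physical RAM: %lu MB\n",
               (unsigned long)(mem.ullTotalPhys / (1024 * 1024)));
        printf("Windows version %lu.%lu (platform %lu)\n",
               os.dwMajorVersion, os.dwMinorVersion, os.dwPlatformId);
        return 0;
    }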

With all of these unknowns, we need to create a game that will run well and deliver substantially the same play experience on the greatest number of machines out there. This is where minimum requirements and clever use of scalability in performance-intensive features such as graphics and artificial intelligence come into play. Hardcore games typically have the most aggressive schedule for culling older machines from the minimum requirements. This, however, cuts into sales for mass-market games, and a delicate balance exists between pushing the edge of the performance bar to gain exposure and adoption among hardcore players, and planning for broad sales by supporting as many older systems as possible. Strong examples are The Sims, StarCraft, and Baldur's Gate I and II, all of which work on quite low-end systems; much of their mass-market success may stem from the fact that people with lower-end systems can still play them.

The final challenge of the fast pace of technological change is that your requirements will often change mid-project or very late in your project. With less than six weeks to go on Starfleet Command 1, I was informed that Interplay had signed a ten-product agreement to support AMD's 3DNow! instruction set. With little time left before code freeze, we were forced to optimize just a handful of low-level vector and matrix routines to take advantage of the 3DNow! feature set.
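The scalability decision itself can be as simple as bucketing the probed capabilities into detail tiers, as in the sketch below. The tier names and thresholds here are hypothetical, for illustration only; a real game would tune them against its own minimum requirements.

    // Capabilities gathered at startup by the probing code shown earlier.
    struct MachineCaps {
        unsigned ramMB;        // total physical RAM in megabytes
        unsigned cpuMHz;       // estimated CPU clock speed
        unsigned videoRamMB;   // 0 if no 3D accelerator was found
    };

    enum DetailLevel { DETAIL_LOW, DETAIL_MEDIUM, DETAIL_HIGH };

    DetailLevel ChooseDetailLevel(const MachineCaps& caps)
    {
        // No 3D card or very little memory: fall back to the software
        // path with everything turned down (the "Pentium 90 laptop" case).
        if (caps.videoRamMB == 0 || caps.ramMB < 64 || caps.cpuMHz < 200)
            return DETAIL_LOW;

        // A strong machine gets the full experience (the "GeForce 3" case).
        if (caps.videoRamMB >= 32 && caps.ramMB >= 128 && caps.cpuMHz >= 800)
            return DETAIL_HIGH;

        return DETAIL_MEDIUM;
    }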

The console market is considerably different. When you make a game for the PlayStation 2, you know exactly how fast it will be, how much video RAM you will have, and every other detail of the console at the time you produce the game. (The exception is when a developer is working on a game for a console that has not yet been released to the public. In the case of Taldren, we are working on an Xbox game, and I get packages from Microsoft every so often with a revision to the software running the box; at larger intervals the hardware itself changes.) This factor, combined with much more stringent QA from the console manufacturers themselves, makes console games practically bug-free in comparison to PC games.

Console developers have a strategic advantage in that their platform is known and immutable, but also a disadvantage in that their platform may be supplanted by new consoles such as the recently released GameCube and Xbox, which are technologically far superior to the PS2. Console developers must then go through an awkward stage of proving to publishers that they are capable of developing on the new console platform.

The only way to deal with these technological changes is to plan for them. You need to build profiling and diagnostic tools straight into your game so that you can understand how it is performing under various game conditions. You need to allow time in your schedule to support the odd piece of software or hardware that is strategically important to your publisher. You will also need to settle on your minimum requirements as early in your schedule as possible: the sooner you set the goal of meeting a specific minimum requirement, the more likely you are to actually achieve it.
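As one example of building profiling straight into the game, here is a minimal sketch of a scoped timer. std::chrono is used here for portability; an engine of this era would more likely wrap QueryPerformanceCounter, and a shipping game would report to an in-game console or overlay rather than stdout. All names are illustrative.

    #include <chrono>
    #include <cstdio>

    // Times the enclosing scope and reports the result on destruction.
    class ScopedTimer {
    public:
        explicit ScopedTimer(const char* label)
            : label_(label), start_(std::chrono::steady_clock::now()) {}

        ~ScopedTimer() {
            auto end = std::chrono::steady_clock::now();
            double ms =
                std::chrono::duration<double, std::milli>(end - start_).count();
            printf("%s: %.3f ms\n", label_, ms);
        }

    private:
        const char* label_;
        std::chrono::steady_clock::time_point start_;
    };

    void UpdateAI()    { ScopedTimer t("AI update"); /* ... game AI ... */ }
    void RenderFrame() { ScopedTimer t("Render");    /* ... drawing ... */ }

    int main()
    {
        // Instrumenting each subsystem per frame shows where the time
        // goes under different game conditions.
        UpdateAI();
        RenderFrame();
        return 0;
    }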

