The Root of All Evil
by Dave Rickey
The moon belongs to ev'ryone
The best things in life are free
A recurrent subject in discussions of online games is the risk-aversion of the major projects and companies, their unwillingness to go outside proven formulas and bring totally new things to the market. This is both true and false: true, because with budgets having reached eight figures, there's a limit to how much risk responsible project management can accept; false, because online games have been far more innovative in recent years than single-player games, and because the hold established games maintain on the market makes simply retreading established formulas even more problematic and expensive than breaking new ground.
The fact is, there have been very few EverQuest clones to actually reach the market (about three), and even fewer that have been successful (exactly one). The near future will see a couple more, with budgets that exceed the original's by a factor of five. It's not hard to clone EQ; there's only one little problem: the original refuses to go away and turn loose its customers. In single-player games, a clone (or sequel) can always hope to sell to some subset of the market that liked the original game and wants a new episode with similar gameplay and setting. In some cases, clones have reached beyond that and eclipsed the progenitor as the defining game of their niche.
That can happen to an extent with online games: EQ was, in the eyes of the market, a very similar product to Ultima Online. After all, they were both fantasy-based OLRPG's with dragons, magic spells, and the other traditional D&D-inspired trappings; to an outsider, the only significant difference was EQ's first-person viewpoint vs. UO's isometric overhead view. But while UO established the MMO market, EQ defined it, growing to twice the size of UO before significant competition appeared. The most telling observation is that the first games to appear and seriously challenge EQ were both different in ways that even those outside the established playerbase could appreciate: one (Anarchy Online) had a sci-fi setting, and the other (Dark Age of Camelot) had an emphasis on PvP. This established what I believe will be an invariant rule of MMOG's, one that stands in stark contrast to single-player game development: differentiate, or don't bother. (Anarchy Online's technical issues crippled it after launch, but its initial sales were quite good.)
Single-player "market builders" come out of nowhere, turn into huge hits, and then get out of the way; even their sequels are just clones with name recognition and a leg up. Online games don't get out of the way. Once a certain setting, assembly of gameplay, and basic technology get a grip on a segment of the market, they will hold it until a failure to adapt allows someone else to chip away part or all of it. The only advantage the newcomers have is a clean slate, which they can only capitalize on by innovating. Even so, how much innovation to embrace will remain the thorniest problem in MMO design for quite some time to come. A failure to innovate guarantees market irrelevance and failure, while over-reaching and excessive ambition result in death-march projects that never quite manage to reach launch. And with budgets continuing to inflate, the stakes are high.
Offset against the big budgets are equally big potential payoffs: EverQuest is the highest-grossing computer game ever, The Sims franchise is #2, UO #3 (and Myst #4). Some would point out that UO and EQ have required considerably more to develop and operate than The Sims, but once you factor in marketing and development budgets for The Sims and its more than a dozen expansions, the numbers are probably in the same ballpark.
One question I've seen asked in several different places is "How do you manage to spend so much money making these games?" In one word: payroll. Building worlds is a labor-intensive process. You can license your engines, buy your hardware off the shelf, build your architecture around OSS components, but you can't create a new and interesting world and fill it with detail without a lot of artists and world builders. Algorithmic content generation can improve their productivity, but there's no substitute for hand craftsmanship.
Here's a spreadsheet for a budget to make an MMO. It's slightly modified from the one I was using as part of a pitch earlier this year. Around 90% of the projected budget is personnel-related costs: salaries, insurance, office furniture, PC's, it all adds up. By launch day, the burn rate is around half a million dollars a month (and at that, I shorted the Customer Service department; what is in the plan would only cover about 50,000 players). But if you start analyzing it, it becomes pretty clear that the expense comes in four phases:
Phase 1 is server prototyping and architecture development. In my plan, this is a *long* stage, close to two years. In the typical project, it's much shorter, if it exists at all. Usually, the client, server, and content are developed on parallel tracks at the same time, with the ostensible purpose of minimizing development time. In fact, the real motivation comes from the milestone-driven structure of the normal development project. In such a structure, a "playable demo" usually comes pretty early. This seems perfectly logical from the viewpoint of those putting up the money: they want to see something produced for all the money they are putting in. Unstated, but usually implicit in getting milestones signed off and additional funds invested, is that the playable demo also be reasonably pretty; in fact, it's more important for it to be pretty than for it to really be playable.
There are three problems with this. First, once you start producing art and building the world, you have to commit to a technological target: your models, textures, and graphics engine have to be aimed at a certain level of graphics capability, so the clock is ticking, and if you miss your tech window by too much your game is going to look out of date. Second, once you start hiring artists your burn rate goes through the roof, which means the money is going to run out a lot sooner. Third, the area of development with the most potentially serious unpredictable delays is server development; if you hit a roadblock there and pushing off the launch date is not an option (and if your payroll is fully staffed up, it won't be), your only choice is to start cutting features. And because world-building and tools development may wind up delayed in starting by server snafus, this can lead to ripple effects that send the whole project into a tailspin.
This is why I would strongly suggest that the industry as a whole needs to take a page from NCSoft and stop trying to make things pretty so early in the development cycle. Develop your server, develop your tools, do all of this early on in the process when your costs are comparatively low and you can afford to do it right.
Phase 2 of the process is where most people have been starting. The client programmers and the core of the art team are brought on board, and something that actually looks like a game starts to get built. This gets integrated with the server (in the traditional process, the server is still being built at fundamental levels, and frequently the need to support and integrate with the client drives the priorities there). After this is nearly complete, tools are developed for content development (again, in the normal process, tools are lashed together to meet the needs of the playable demo, then minimally improved for use in the production pipeline because the press of time demands it).
Phase 3 is where most of the money gets spent. Lots of artists and world builders are brought on board, the gameplay features are finalized and grossly balanced (fine tuning balance is a never-ending process that will extend well past launch). At this point you are committed to a launch window only a few months wide.
Phase 4 is the final run-up to launch. Servers are purchased and configured, betas are run, and bugs in all content and systems are discovered and fixed. For the typical project, this stage never really happens; instead it is overlaid on Phase 3, with the result that in the crush to finish the game only the most severe bugs get fixed, if you're lucky.
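The phase structure above reduces to a simple burn-rate model. The phase lengths and monthly costs in this sketch are illustrative assumptions loosely fitted to the figures in the text (a roughly two-year first phase, about half a million dollars a month by launch), not numbers from the actual spreadsheet:

```python
# Hypothetical burn-rate model of the four-phase cost structure.
# All phase lengths and monthly costs below are illustrative
# assumptions, not figures from the real budget spreadsheet.

phases = [
    # (phase name, duration in months, monthly burn in dollars)
    ("1: server prototyping/architecture", 22, 80_000),   # small core team
    ("2: client and tools development",    10, 200_000),  # client/art core added
    ("3: content production",              12, 400_000),  # artists, world builders
    ("4: beta and launch run-up",           6, 500_000),  # full staff plus hardware
]

total = 0
for name, months, burn in phases:
    cost = months * burn
    total += cost
    print(f"Phase {name}: {months} mo x ${burn:,}/mo = ${cost:,}")

print(f"Total projected budget: ${total:,}")
```

Even with these made-up numbers, the shape matches the argument: the cheap first phase consumes nearly half the calendar but a small fraction of the money, and the total lands in eight figures.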
For the first 2 years of this hypothetical project, expenses are comparatively low, but there's comparatively little to show to those without the background to evaluate strictly technical progress. Companies and investors that are used to the usual single-player development cycle can either accept this and adjust their plans, or they can continue to try to build MMOG's the way that single player games are built, and they can continue to fail.