Remedy is closely associated with Futuremark (formerly MadOnion), the company behind the ubiquitous 3DMark series of comprehensive benchmarking applications.
Remedy is currently working on the imaginatively titled Max Payne 2, the sequel to Max Payne.
I've interviewed Markus before, but since we hadn't been in touch for a while, I thought it would be a good time to catch up with him on some general questions about game development and 3D graphics — topics he is intimately familiar with, since he is involved with both Remedy (making games) and Futuremark (benchmarking).
Markus, could you please tell our readers your position and role at Remedy? What about your past and present relationship with Futuremark (formerly MadOnion)? You founded both companies, didn't you? What is the relationship company-wise between Remedy and Futuremark?
I’m the project lead for “Max Payne 2”, for Rockstar Games, to be released in 2003, working closely together with Petri Järvilehto, Max 1’s project lead and the lead game designer for Max Payne 2. Basically, he’s more creative and I’m more technical and organized, so it works out really well.
I was involved with Futuremark for a long time, from 1997 to the end of 2001, being in charge of benchmark development, especially 3DMark. 3DMark 99, 99 Max & 2000 were my projects. Patric Ojala (the current project lead for the 3DMark series) started with 2001, although I was also quite heavily involved with it.
Back in the old days Futuremark also did some demos for the IHVs, so I’ve been the lead on a few of those as well.
The relationship between companies is such that Remedy is a shareholder in Futuremark, and although 3DMark03 doesn’t use Remedy’s 3D technology anymore, some small level of technology sharing is still going on. Located side by side, we also share some office resources (they have a Pepsi vending machine and we have a Coke vending machine…)
You started out by making games (with Remedy), then progressed to benchmarking (with MadOnion), and have now returned to making games (something "close to your heart", as you told me before). What did you learn, while working exclusively at MadOnion producing the past 3DMark series, that can be put to good use in Remedy's upcoming games?
One thing I definitely learned a lot about was shipping titles. All in all I was directly responsible for seven shipped titles, and heavily involved with three others, plus had the opportunity of watching from the sidelines as both Death Rally and Max Payne developed and shipped. Naturally the projects, such as 3DMark2000, were not quite as massive as a triple-A game project, but a lot of the same issues still had to be handled in getting them from plan to prototype to alpha, and onward to polished, tested products.
And 3DMark especially taught me the importance of proper quality assurance for PC games.
When it comes to making a game, could you briefly outline for us the steps involved from start to finish, and how time pressures and potential publisher pressures influence the entire process? Evidently, some compromises have to be made - which area(s) of game development suffer the most in order to have a well-made, good-looking game appear within a given deadline?
The really, really short development outline is something like:
1) Concept
2) Prototype
3) Deal with a publisher (hopefully)
4) Pre-production
5) Production
6) Polish & QA
All of course depends on the project. For the first title on an original IP, the process looks quite different than for a sequel or a title based on licensed IP.
Developing “without schedules”, first of all, is something that few developers can afford, and I’d say it’s outright stupid. Now, whether something is “when it’s done” externally is another matter altogether. That way you have the freedom to go in and make things better without having committed to an external (or even worse, public) schedule.
When you have external schedules, it becomes hard to freely experiment and change your decisions a few months down the line. An excellent pre-production is then a necessity (well, it’s important anyway). When you’re trying to do something new and revolutionary, setting tight schedules for it early on in the project is, I’d say, impossible.
Additionally, the last 2-3 months are when most of the polish goes into the game. If you have some flexibility at the end of the project, it will really show in the end result.
What are the current Remedy workstations or development platform specifications? How often does Remedy change them and/or which are the usual system components changes? Why?
We usually upgrade the systems whenever it makes sense, and often the upgrades are partial (faster CPU, more memory, more HDD space, a new graphics card).
The majority of the workstations right now are Intel Pentium 4-based, ranging from 2GHz to 2.8GHz, with 768MB-1GB of memory. We have some Athlon XPs as well, but those are in the minority.
Graphics cards are either Radeon 9700 Pros or GeForce4 Ti4600s. It’s something of a personal choice over here. Usually people who use 3DSMax want the GeForce and programmers want Radeons. If someone has an older card, it’s because his system works well and he doesn’t need a faster one. Everybody runs Windows XP, except for a few older systems that haven’t been upgraded in a while.
How do you determine the minimum required system specifications for a game that Remedy makes? At what stage of the development process is such a minimum system requirement arrived at (at the very start of design, where you already have a target system from which game design limitations are set; during development itself, as discoveries are made; or right at the end)?
It’s not an extremely straightforward or easy decision. It depends on what your existing technology is, what the project duration and timeframe for the planned game are, which platforms it is targeted at, etc. Consoles place a heavy burden on content scalability. Unless you want to sell a million fewer copies, your game had better be somehow portable to the 32MB of RAM on the PS2…
Still, on PCs, I think the three-year technology rule of thumb is pretty valid. That is, if your game is targeted for release in three years, your minimum spec can be the best hardware out right now. But at least with us, both the minimum and maximum specs evolve somewhat during development.
I also have access to things like Futuremark’s database reports, so I can see, for example, how fast the average CPU clock speed and average amount of memory develop, and what the 40 most popular graphics chipsets are, plus their “market share” among 3DMark users. This makes a more scientific decision easier. I think Valve, for example, uses the same kind of research methods.
Having been involved in making "proper" benchmarking applications (the past 3DMark series) as well as games (Max Payne), what are the considerations and differences between a benchmarking application and a game with benchmarking feature?
I’ve written commentaries in a few white papers on this issue over the years. I think it boils down to one key word: experience. Experience to ensure the application does an apples-to-apples comparison. Experience to make the right, justified judgment calls on what technology to use, and, when an IHV puts on the pressure, the integrity to stand behind what you think is right.
With benchmarks like 3DMark, you can take some liberties you don’t have with games. The projects are smaller; the amount of content you need to create isn’t tremendous (compared to games with 10 hours of gameplay), and you don’t have as much pressure to ensure your graphics scale from consoles to high-end graphics cards. In fact, the content shouldn’t scale, only the results should.
I also think games can be “proper” benchmarking applications. Some games are great benchmarks - I think, for example, Epic and id are doing great work in that area. Both companies have strong technical leads, Tim Sweeney and John Carmack, whom the whole game industry respects and the gaming community trusts.
As a hardware site that focuses on video and 3D technology, Beyond3D is constantly looking for applications (games or otherwise) that feature some sort of benchmarking facility. The continuing problem of the small number of games that have proper benchmarking features is frustrating and more or less forces the use of synthetic benchmarking applications. Is there any reason why developers do not include proper benchmarking features in their games, other than the simple and obvious "benchmarking is not a game design priority"?
It’s definitely not a priority. Ensuring the benchmark is reliable and good is time away from making the game play better. Would you not rather have, for example, more eye candy than a benchmark mode?
Of course developers have (or should have) internal profiling tools; Max Payne had extensive profilers built in. This is a must during game development, but it’s still quite a lot of effort to turn that profiling into a reliable, easy-to-use, apples-to-apples benchmark comparison.
Do you think it is acceptable that hardware reviews have resorted to using "game hacks" (like those done by a website to benchmark Max Payne) and third-party utilities (like the well-known FRAPS) as additional means of providing extra information in hardware reviews?
Having never used FRAPS myself, it’s hard to comment on how that particular tool works, but I don’t think any of these methods would change the claims I made in my answers to your previous questions.