Thursday, February 28, 2008

Building a Better Battle: Halo 3 AI

This presentation was given by Damian Isla from Bungie Studios.  It presents the basic architecture and tools used to build the AI system in Halo 3.  This lecture specifically detailed the encounter logic.

Encounters are the "dance": how the system reacts and collapses in interesting ways.  The "dance" is the illusion of strategic intelligence.  Designers choreograph the "dance" to be interesting and to drive the pacing of the story, kinda like a football coach directing his players.

Halo 3 uses a two-stage fallback.  Enemies start off occupying a territory.  The aggressor (the player) then pushes them back to a fallback point.  After this they are pushed to the last-stand location, after which the player will "break" them and finish the battle.  "Spice" is added on top of this by designers to make the encounter play out in a more realistic fashion.

Mission designers handle the encounter tasks, with the AI engineers handling the squad (how the AI behaves autonomously).

Halo 2 used the imperative method to control AI (a finite state machine).  Designers were given access to dictate what happens as various events were triggered (e.g., the enemy starts losing the battle).  The primary problem with this model was the need for explicit transitions (n^2 complexity).

Halo 3 took a different approach, using the declarative method.  This works by defining the end result you are looking for (with respect to the AI).  You enumerate the "tasks" that are available and let the system decide how to perform them.

One of the cool things about the declarative method is the ability to set relative priorities.  An example would be: guard the door, but if you can't do this, then guard the hallway.  It also brings the notion of hierarchical tasks (sub-tasks).  For example, guarding the hallway means guarding both ends and the middle.
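The priority and hierarchy ideas can be sketched roughly like this. This is a hypothetical toy, not Bungie's implementation — every name here (`Task`, `assign`, the capacities and priorities) is invented:

```python
# A minimal sketch of declarative, prioritized, hierarchical tasks.
# All names are hypothetical -- this is not Bungie's actual system.

class Task:
    def __init__(self, name, priority, capacity=1, subtasks=None):
        self.name = name
        self.priority = priority      # higher = more important
        self.capacity = capacity      # how many agents the task can hold
        self.subtasks = subtasks or []

    def leaves(self):
        """Flatten the hierarchy into assignable leaf tasks."""
        if not self.subtasks:
            return [self]
        out = []
        for sub in self.subtasks:
            out.extend(sub.leaves())
        return out

def assign(agents, root):
    """Greedily fill the highest-priority leaf tasks first."""
    assignments = {}
    leaves = sorted(root.leaves(), key=lambda t: t.priority, reverse=True)
    it = iter(agents)
    for task in leaves:
        for _ in range(task.capacity):
            agent = next(it, None)
            if agent is None:
                return assignments
            assignments[agent] = task.name
    return assignments

# "Guard the door; if you can't, guard the hallway (both ends and middle)."
hallway = Task("hallway", 1, subtasks=[
    Task("hall_east", 1), Task("hall_middle", 1), Task("hall_west", 1)])
root = Task("defend", 0, subtasks=[Task("door", 2, capacity=2), hallway])

print(assign(["grunt1", "grunt2", "grunt3"], root))
```

Deactivating a task and re-running the assignment is what gives the "upside-down Plinko" behavior: the same enemies re-settle into the next-best tasks.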

There was a funny comment that the Halo 3 AI works like a Plinko machine: pour tasks into the system, prioritize them, then pour enemies in and let the system place them and control their behavior.  It's also an upside-down Plinko machine ;)  This is because tasks can be activated/deactivated at will, causing the enemies performing them to re-evaluate the situation and "do something else".

The system uses a proprietary scripting language named HaloScript.  This allows designers, who are not programmers, to design and use the system.

Thursday, February 21, 2008

Life on the Bungie Farm: Fun Things to Do with 180 servers and 350 processors

This lecture was given by Luis Villegas and Sean Shypula. It was primarily about the server farm and distributed computing system created by Bungie for automated builds of code and content.

Advantages:
  • Faster iterations -> more polished games
  • Keeps complexity under control

Binary Builds (game and tools)

  • Automated tests are run on tool builds only

Lightmap Rendering

  • Pre-computes lighting in scenes (photon mapping and custom algorithms from Hao and crew)
  • Bakes the level files (output)

Content Builds

  • Compiles assets into monolithic files

Website (bungie.net) Builds

Patches (maintenance items for servers)

Halo 1 -> All assets processed by hand, very few automated tasks

Halo 2 -> More automation (3 servers in farm -> one for each function)

Halo 3 -> Unified systems into single extensible system

The latest iteration, created with Halo 3, did a few new things (rewrite).

  • Unified codebases, implemented single cluster.
  • One farm
  • Updated code to .net (C#), easier to develop/maintain

Stats

  • Over 11,000 builds (exe/dll)
  • Over 9,000 lightmap builds
  • Over 28,000 other types of builds
  • Halo 3 would not have shipped in current form without the farm.

Interface for users (developers)

  • Had to be easy, simple with "one-button" submit operation
  • Even if users are developers they still don't want to know what is going on behind the scenes

Architecture

  • Single system/multiple workflows
  • Plug in based
  • Workflows divided into client / server plugins (isolation from each other)
  • Server schedules jobs (messages clients)
  • Clients start jobs and send status and results back to the server
  • Server manages state of jobs
  • All communications via SQL Server
  • Incremental builds by default
  • Between continuous integration and scheduled (devs run builds ad-hoc and there is a scheduled nightly build)
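The server/client split with all communication going through the database can be sketched as a toy in Python, using sqlite3 in place of the SQL Server the talk describes. The table, columns, and workflow names are all invented:

```python
# Toy sketch of database-mediated job scheduling: the server enqueues jobs,
# clients poll for pending work, claim it, and report results -- all through
# the database, with no direct server<->client messaging.
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("""CREATE TABLE jobs (
    id INTEGER PRIMARY KEY, workflow TEXT,
    status TEXT DEFAULT 'pending', client TEXT)""")

def submit(workflow):               # server side: enqueue a job
    db.execute("INSERT INTO jobs (workflow) VALUES (?)", (workflow,))

def claim_next(client):             # client side: poll for work
    row = db.execute(
        "SELECT id, workflow FROM jobs WHERE status='pending' "
        "ORDER BY id LIMIT 1").fetchone()
    if row is None:
        return None
    db.execute("UPDATE jobs SET status='running', client=? WHERE id=?",
               (client, row[0]))
    return row

def report(job_id, ok):             # client reports the result back
    db.execute("UPDATE jobs SET status=? WHERE id=?",
               ("done" if ok else "failed", job_id))

submit("binary-build")
submit("lightmap")
job = claim_next("client-01")
report(job[0], ok=True)
print(db.execute("SELECT workflow, status FROM jobs ORDER BY id").fetchall())
# → [('binary-build', 'done'), ('lightmap', 'pending')]
```

Using the database as the message bus means the server only manages job state; clients can come and go without any connection handshaking.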

Symbol Server used (Debugging Tools For Windows)

  • Symbols registered on server

Source Stamping

  • Linker setting for source location
  • Set at compile time
  • Engineers can attach a debugger to any client from any machine, as long as Visual Studio is installed.

Lightmapper was written specifically for the farm

  • Chunks job parts to clients
  • Merges results
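The chunk/merge pattern the lightmapper uses can be sketched like this — a hypothetical toy where `render_chunk` stands in for the actual per-client rendering:

```python
# Split a lightmap job into independent chunks, farm them out, merge results.

def chunk(surfaces, n_clients):
    """Deal surfaces out round-robin to n_clients work lists."""
    parts = [[] for _ in range(n_clients)]
    for i, s in enumerate(surfaces):
        parts[i % n_clients].append(s)
    return parts

def render_chunk(part):
    """Stand-in for per-client lightmap rendering."""
    return {s: f"lightmap({s})" for s in part}

def merge(results):
    """Fold per-client results back into one level-wide map."""
    merged = {}
    for r in results:
        merged.update(r)
    return merged

surfaces = ["floor", "wall_a", "wall_b", "ceiling"]
parts = chunk(surfaces, 2)
merged = merge(render_chunk(p) for p in parts)
print(sorted(merged))  # every surface accounted for after the merge
```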

Simple SLB

  • Min / Max configurable
  • More clients are assigned to a workload when the pool is mostly idle
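That min/max policy is small enough to sketch directly (the function and its parameters are invented, not Bungie's code):

```python
# Give a workload more clients when the pool is mostly idle, but always stay
# within its configured [min, max] bounds.

def clients_for(workload_min, workload_max, pool_size, pool_busy):
    idle = pool_size - pool_busy
    # start from the floor, then soak up idle capacity up to the ceiling
    return max(workload_min, min(workload_max, idle))

print(clients_for(2, 10, pool_size=20, pool_busy=18))  # busy pool -> 2 (the minimum)
print(clients_for(2, 10, pool_size=20, pool_busy=4))   # idle pool -> 10 (capped at max)
```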

Cubemap farms

  • Used Xboxes and PCs for rendering and assembly.
  • Pools of Xbox Dev Kits
  • No client code on Xbox
  • Few changes for Xbox Support

Implementation Details

  • All C# (.Net)
  • Objects were serialized to XML at first, but this was switched to binary serialization later (speed and memory benefits)
  • Downsides (memory bottlenecks, forced GCs, should have been more careful with memory)
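The XML-versus-binary trade-off the team hit can be shown with a rough Python analog (the farm itself was C#/.NET; this just contrasts a text encoding of a job record with a binary one):

```python
# Encode the same record two ways: XML (verbose, human-readable) vs. a
# binary serializer (compact, faster to parse) -- the direction the farm
# moved in for speed and memory.
import pickle
import xml.etree.ElementTree as ET

job = {"id": 12345, "workflow": "lightmap", "status": "running"}

root = ET.Element("job")
for k, v in job.items():
    ET.SubElement(root, k).text = str(v)
xml_bytes = ET.tostring(root)       # tag names repeated around every field
bin_bytes = pickle.dumps(job)       # compact binary form of the same data

print(len(xml_bytes), len(bin_bytes))
```

Both encodings round-trip the same record; the binary form avoids re-parsing text on every message, which matters at farm scale.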

Beyond Printf: Debugging Graphics Through Tools

This lecture was given by Dave Aronson from Nvidia and Karen Stevens from Microsoft (graphics tools division).

Main toolsets

  • Windows - PIX
  • Nvidia - FXTools

Use GPU & driver counters for basic performance related issues.

Shader Perf 2.0

  • Tests optimization opportunities
  • Integrated to FX Composer
  • Regression analysis
  • It's in beta (2.0)!
  • SDK Available

FX Composer for authoring and debugging shaders.

Use PIX for:

  • Game Assets (textures, shaders, vertex buffers, index buffers, etc)
  • API (DirectX)

Use Nvidia tools for:

  • Driver related items
  • Hardware specific items

Link dump:

http://developer.nvidia.com/PerfKit

http://developer.nvidia.com/PerfHud

http://developer.nvidia.com/ShaderPerf

http://developer.nvidia.com/FXComposer

GPU Optimization with the Latest NVIDIA Performance Tools

Lecture given by Jeffrey Kiel from Nvidia with an instructor from Full Sail giving a demo.

Optimization Techniques
  • System -> CPU to GPU and multithreading
  • Application -> game code
  • Microcode -> lowest level (tied very closely to hardware)

Optimizations should start at the highest level and work their way down. Microcode optimizations can be beneficial but will only work with certain configurations (good for consoles, not for PCs).

If GPU utilization is > 90%, you should look for a GPU bottleneck first; if it is lower, then the CPU (app code, driver, etc.) should be looked at.

PerfHud (tool from Nvidia).

  • In version 6 of this tool, no special driver is needed (retail drivers have instrumentation hooks already in place).
  • SLI Optimizations can be discovered with this tool
  • API Call Data Mining (both in tool and export to own data analysis offline)
  • Shader Visualization / Texture Visualization
  • Hot key mapping to trigger user defined options (for debugging).

The demo shown was Marble Blast Ultra (PC version). Several optimizations were shown.

Next demo was with Crysis.

  • Programmers should start putting PerfMarkers in their code now. These will help later.
  • API Time Graph is a new feature in beta
  • Perf hints (single and SLI) are given
  • Subtotals in Frame Profiler
  • Break (int 3) on draw calls (new feature)
  • Support for 32bit apps on 64bit OS (was not previously supported natively by driver)

There were some OpenGL tools discussed.

Environment Design in HALO 3

Lecture was given by Mike Zak from Bungie.

Typically there are artists, who manage geometry, and designers, who manage gameplay. Bungie created the "architect" role as the glue between these two. This is the role Mike fills at Bungie (as well as being a finishing artist).

Pre-production model
  • Broad Timeline of level and game
  • Napkin sketch
  • Concept Art
  • Whiteboard

Aspect 1 : The Hook

  • Prominent features in the rooms
  • Don't create mazes for player to get lost in
  • Force orientation without using force :)
  • Should obviously be easy to grasp and navigate
  • Suggest a tactic by geometry, lighting, etc (no force)

Aspect 2 : Scale

  • How large should the level "feel"
  • Engagement ideas/planning

Aspect 3 : Combat Elements

  • Fronts
  • Layers
  • AI Blinds

Aspect 4 : Movement Elements

  • Player shortcuts (make the player feel like he discovered them ;))
  • One-way paths
  • Ninja paths
  • Vehicle flows

Mike then shared some concept art, geometry and finishing art with us. Link to presentation will be posted next week.

Audio Post-Mortem: HALO 3

This lecture was given by pretty much the core audio team at Bungie (Marty O'Donnell, Jay Weinland, C Paul Johnson, and Mike Salvatori).

First off, let me just say that, in my opinion, Marty O'Donnell is one of the best composer/audio directors I have ever seen. He and his team are very passionate about audio. It really inspires others to push as hard as they do for audio perfection.

Jay Weinland started the lecture by talking about how things changed from H2 to H3. The fundamental issue between the Xbox Gen1 and the Xbox 360 was that the DSP audio chip was removed from the 360. Also, with Gen1 the HDD was guaranteed to be there, but with the Xbox 360 the HDD is optional, so the system has to work with both configurations. With an HDD it's easier: stream everything from the HDD and use a small amount of system memory (with Gen1 the audio chip was used, so literally only a few MB of system memory were needed). This meant that more programming was required to handle the audio on the 360. Matt Noguchi, with help from Microsoft, pulled this off with great success.
C Paul Johnson then spoke about LOD and how audio levels were set based on distance, which was new with the 360. The algorithm developed also takes into account whether an HDD is attached, and audio is culled if it's missing, for a "degraded" but functional system. He also demonstrated how the looping works (to save space, but randomized so it does not get boring for the user). This was primarily about ambient sound.
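The distance-based LOD plus HDD-aware culling described there can be sketched like this. The thresholds, falloff curve, and function name are all invented for illustration:

```python
# Distance-based audio LOD: close sounds play at full volume, distant ones
# are attenuated, very distant ones are silent -- and when no HDD is present,
# non-essential sounds are culled entirely ("degraded but functional").

def audio_lod(distance, essential, has_hdd,
              full_range=10.0, cull_range=50.0):
    if distance >= cull_range:
        return 0.0                  # too far away: silent
    if not has_hdd and not essential:
        return 0.0                  # DVD-only system: cull optional audio
    if distance <= full_range:
        return 1.0                  # close by: full volume
    # linear falloff between full_range and cull_range
    return (cull_range - distance) / (cull_range - full_range)

print(audio_lod(5.0,  essential=True,  has_hdd=False))  # 1.0
print(audio_lod(30.0, essential=False, has_hdd=True))   # 0.5
print(audio_lod(30.0, essential=False, has_hdd=False))  # 0.0
```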
Marty then spoke about how he composes and tools used to do his magic. He demonstrated how he puts various clips together with in house created tools (Guerilla) and other tools.

E Pluribus Unum: Matchmaking in HALO 3

Chris Butcher from Bungie gave this presentation, talking about how matchmaking works in H2/H3 and the differences (pros and cons) of both.
This was a pretty technical talk about the internals of the matchmaking process. Chris started by talking about why this is needed: the idea of users being matched into games that will be fun (not too easy or hard), with some way to hold the user's interest and keep them playing.
The problem with H2 matchmaking was that it was based heavily on skill. Cheaters who had figured out ways to break the system created bad experiences for typical users.
With H3, the team worked with Microsoft Research to integrate TrueSkill into their matchmaking and make the process seamless and transparent to users. This is a fairly complex system based on a Bayesian algorithm to compute a player's real skill. This, combined with some fairly complex networking code, pulls off the matchmaking we are now using with H3. Chris presented results showing > 80% of all Halo users playing over 100 multiplayer games. This is a pretty substantial improvement over H2.
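To give a flavor of the Bayesian idea (this is emphatically NOT the real TrueSkill math — just a toy in the same spirit, where each player's skill is a Gaussian and an upset win moves the means more than an expected win):

```python
# Toy Gaussian skill update: each player is (mu, sigma). A win nudges the
# winner's mean up and the loser's down, with bigger steps for surprising
# results, and slowly shrinking uncertainty. Invented for illustration only.
import math

def update(winner, loser, beta=4.0):
    w_mu, w_sig = winner
    l_mu, l_sig = loser
    c = math.sqrt(w_sig**2 + l_sig**2 + 2 * beta**2)
    # probability the result was an upset; bigger surprise => bigger step
    surprise = 1.0 / (1.0 + math.exp((w_mu - l_mu) / c))
    w_mu += (w_sig**2 / c) * surprise
    l_mu -= (l_sig**2 / c) * surprise
    return (w_mu, w_sig * 0.99), (l_mu, l_sig * 0.99)

new_winner, new_loser = update((25.0, 8.3), (25.0, 8.3))
print(new_winner[0] > 25.0, new_loser[0] < 25.0)  # True True
```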
A problem with the new system is that its complexity can confuse users (they don't get to see the under-the-covers workings, nor would most understand them).
I will post the slides when available (Feb 25th).

The profile of a Great Software Tester

This was hosted by Anibal Sousa, a Microsoft XNA manager (and lead). He is responsible for testing Xbox Live and the components contained therein. I am trying to get an electronic copy of the presentation he showed.
Anibal has a very good method of analysis for determining whether a person will make a good software tester. The attachment I will post will show this better. I will post an update when I get it.

Technical Issues in Tools Development (Roundtable Discussion)

John Walker hosted this session.

Topics:
  • Using reflection to help build GUI for tools
  • 3rd Party tools / components used to build tools
  • C# and managed code for tool building

Reflection is a bit harder to pull off with C++ and C (you have to parse header files) than with something like C# or Java. This limits this approach for dynamic generation of GUI-based tools and plugins for IDEs.
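The reflection-driven GUI idea from the roundtable, sketched in Python (which has reflection like C#/Java): introspect an object's fields and generate a property-grid description from them, instead of hand-writing a form per type. The `Light` class and widget names are invented:

```python
# Build a property-grid description from an object's fields via reflection,
# so one generic tool UI works for any asset type.

class Light:
    def __init__(self):
        self.intensity = 1.0
        self.color = (255, 255, 255)
        self.casts_shadows = True

# map field types to hypothetical widget kinds (artists get color pickers,
# not raw number grids)
WIDGETS = {float: "slider", tuple: "color_picker", bool: "checkbox"}

def property_grid(obj):
    rows = []
    for name, value in sorted(vars(obj).items()):   # reflect over fields
        rows.append((name, WIDGETS.get(type(value), "textbox"), value))
    return rows

for row in property_grid(Light()):
    print(row)
```

In C# the same idea uses `Type.GetProperties()`; in C/C++ you end up parsing headers or maintaining a separate schema, which is the limitation the discussion raised.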

Some of the studios still using C++ or C for tools development commented that technical artists hate (yes, I said hate) most of the grids and property setters exposed by tools. Most of the time this is because artists (the creative type) are looking for color pickers and such (nice GUIs), while developers are more analytical (just numeric values will do). The tools typically created by tools developers fit better with developers.

Possible ways to fix this were discussed. One would be to adopt something like the MVC pattern and expose different views based on the user.

With regards to 3rd-party tools, artists again are looking for more WYSIWYG interfaces, instead of the lower-level interfaces they are given. The idea of creating debug versions of data structures as well as executables came up. The problem here is when problems only show in the release build (data structures) and are fine in the debug build.

Bug databases are crucial, and automation to fill these without intervention (or with light intervention) from artists is an absolute requirement. Artists will typically just quit using broken tools, or find workarounds.

On the managed-code side, Sony Online mentioned they are using CruiseControl.NET but do not have it fully working yet (automated builds done, but not functional tests).

Many studios are adopting Agile software development tactics (like daily calls about projects/issues).

Automated logging came up, with log4net pushed as a very good tool.

Maybe 60% of the studios in attendance (including Sony Online, Bioware, Pandemic, Microsoft Games Studios, 2K, EA) are currently using C# and managed code to build tool sets (at some level).

The ones still using C or C++ were doing so for richer data when crashes happen. Also, these studios were happy with what C# could do, but typically they have to use what is in place, with not a lot of time to rewrite everything in a new language.

The idea of standardizing on languages (scripting, tools, engine) was discussed. John pointed out that it's almost absurd to try this. Developers use what they are familiar with to get the job done, whether it's Ruby, Java, .Net, C, C++, Lua, Perl, or whatever else. The one key point was not to write your own scripting engine, which is where some studios went; this becomes more of a computer science project than a functional piece of software. Standardization on a variety of languages (commonly used, non-proprietary) is required. Key point: make all tools dump data in a common format (XML)! Embrace the chaos, don't try to eliminate it!

Technical Consulting Engineer, Intel C++ Compilers (Sponsored by Intel)

The speaker was a no show, so nothing here. :(

Wednesday, February 20, 2008

Running Halo 3 Without a Hard Drive: Presentation by Matt Noguchi


  • Current Next Gen Games are IO bound.

  • DVD drive supports about 12MB/s transfer rate

The key item Matt impressed on us was to minimize seeks. When using an HDD the problem is much less pronounced, but on systems that might not have an HDD, the game must work with just the DVD drive.


Break the levels down to required and optional resources.

  • Required resources -> blocking to load/synchronous

  • Optional resources -> non-blocking to load / asynchronous
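The required/optional split can be sketched with threads standing in for streaming IO (resource names and timings are invented):

```python
# Required resources block level start (synchronous); optional resources
# stream in on background threads (asynchronous) while the level plays.
import threading
import time

loaded = []

def load(resource, cost):
    time.sleep(cost)           # stand-in for disk IO
    loaded.append(resource)

def start_level(required, optional):
    for res in required:       # blocking: the level cannot start without these
        load(res, 0.01)
    workers = [threading.Thread(target=load, args=(res, 0.01))
               for res in optional]
    for w in workers:          # non-blocking: kick off and return immediately
        w.start()
    return workers             # the level is playable from here on

workers = start_level(["geometry", "collision"],
                      ["hi_res_textures", "ai_dialogue"])
print("playable with:", loaded)       # typically just the required assets here
for w in workers:
    w.join()
print("eventually:", sorted(loaded))  # optional assets have streamed in
```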

Sound assets are a huge issue (level 5 in H3 has 566MB of sound assets alone)


  • Cannot stream sound from DVD (only HDD)

  • Solution is to cut out AI dialogue when on DVD only (limited experience)

  • With HDD, stream everything (speed is much less of an issue)

Break up levels into zone sets, with transitional volumes created by designers. Trigger volumes pre-cache assets as needed (and evict non-required ones). Bungie had to limit the objects used in each zone set (to keep under the memory limit of ~334.8MB).
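The zone-set precache/evict cycle can be sketched as a toy. The zones, assets, sizes, and budget are all invented (the real budget mentioned was ~334.8MB):

```python
# Crossing a transition volume requests the next zone set's assets and evicts
# anything the new set doesn't use, keeping the cache under a fixed budget.

BUDGET = 10  # toy stand-in for the real memory limit

ZONE_ASSETS = {
    "courtyard": {"rock": 3, "grunt": 2, "warthog": 4},
    "interior":  {"rock": 3, "crate": 2, "elite": 3},
}

cache = {}  # asset -> size currently resident

def enter_zone_set(zones):
    needed = {}
    for z in zones:
        needed.update(ZONE_ASSETS[z])
    for asset in list(cache):            # evict what the new set doesn't use
        if asset not in needed:
            del cache[asset]
    for asset, size in needed.items():   # pre-cache the rest
        cache[asset] = size
    assert sum(cache.values()) <= BUDGET, "zone set over memory budget"

enter_zone_set(["courtyard"])
enter_zone_set(["interior"])             # 'warthog' and 'grunt' evicted
print(sorted(cache))                     # ['crate', 'elite', 'rock']
```

The designer-facing constraint falls out of the assert: each zone set's assets must fit the budget, which is why object counts per zone set had to be limited.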


The next problem was thread contention (game thread/render thread). The solution was to provide a further abstraction over resources while the streaming IO is loading/evicting them (use another cache).


The last optimization was the layout of items on disk. Global or shared items (used in most levels) should be written to the DVD at the same location to allow sequential IO (much faster). All game assets in H3 totaled 15GB, but after optimization and culling they were under 7GB.
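The shared-region idea can be sketched as a small layout pass: assets used by most levels go into one shared region written once, and the rest is laid out per level. The threshold, asset names, and function are invented for illustration:

```python
# Partition assets into a shared region (written to one disk location, read
# sequentially by every level) and per-level regions.

def layout(levels, shared_threshold=0.75):
    counts = {}
    for assets in levels.values():
        for a in assets:
            counts[a] = counts.get(a, 0) + 1
    cutoff = shared_threshold * len(levels)
    shared = sorted(a for a, c in counts.items() if c >= cutoff)
    per_level = {name: sorted(set(assets) - set(shared))
                 for name, assets in levels.items()}
    return shared, per_level

levels = {
    "level1": ["masterchief", "rifle", "cliffs"],
    "level2": ["masterchief", "rifle", "snow"],
    "level3": ["masterchief", "city"],
}
shared, per_level = layout(levels)
print(shared)  # ['masterchief'] -- stored once, in the same spot for all levels
```

Deduplicating shared assets into one region is also part of how the total shrank: anything in the shared region is stored once instead of once per level.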