5/26/99
In layman's terms, what's the biggest challenge in making AI opponents in a strategy game "act real"? To what extent can a game (or AI opponent) 'learn' from your behaviors and tendencies and make adjustments of its own?
The biggest challenge to "acting real" is that real people are very smart. I prefer to tackle the problem of "acting fun". Fun is difficult to quantify, so we just play the game a lot and try to eliminate the behaviors that are not fun (units getting stuck, not following you, sitting still for no good reason). Another tactic I try to use to achieve fun is to give units lots of simple behaviors that get combined in different ways. When a bunch of units get together in a battle, the combined effect is much more interesting than the sum of the parts.
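As a rough illustrative sketch (not the actual Battlezone code; the behavior names, weights, and vector math below are hypothetical), here is how a couple of trivial per-unit rules can be blended into one steering decision, which is where the "more interesting than the sum of the parts" effect comes from:

// Hypothetical sketch: combine simple per-unit behaviors into one steering
// direction. Names and weights are illustrative, not from the Battlezone code.
#include <cmath>
#include <cstdio>
#include <vector>

struct Vec2 {
    float x = 0.0f, y = 0.0f;
    Vec2 operator+(const Vec2& o) const { return {x + o.x, y + o.y}; }
    Vec2 operator-(const Vec2& o) const { return {x - o.x, y - o.y}; }
    Vec2 operator*(float s) const { return {x * s, y * s}; }
    float length() const { return std::sqrt(x * x + y * y); }
    Vec2 normalized() const { float l = length(); return l > 0.0f ? Vec2{x / l, y / l} : Vec2{}; }
};

// Behavior 1: head toward the current goal.
Vec2 seek(const Vec2& pos, const Vec2& target) {
    return (target - pos).normalized();
}

// Behavior 2: push away from nearby friendly units so the group spreads out.
Vec2 separate(const Vec2& pos, const std::vector<Vec2>& neighbors, float radius) {
    Vec2 push;
    for (const Vec2& n : neighbors) {
        Vec2 away = pos - n;
        float d = away.length();
        if (d > 0.0f && d < radius) push = push + away * (1.0f / d);  // closer => stronger push
    }
    return push.normalized();
}

// Each rule is trivial on its own; the weighted blend is what looks varied
// and "alive" once many units are fighting in the same place.
Vec2 steer(const Vec2& pos, const Vec2& target, const std::vector<Vec2>& neighbors) {
    const float kSeek = 1.0f, kSeparate = 1.5f;  // illustrative weights
    return (seek(pos, target) * kSeek + separate(pos, neighbors, 10.0f) * kSeparate).normalized();
}

int main() {
    std::vector<Vec2> neighbors = {{2.0f, 1.0f}, {-1.0f, 0.5f}};
    Vec2 dir = steer({0.0f, 0.0f}, {50.0f, 20.0f}, neighbors);
    std::printf("steer toward (%.2f, %.2f)\n", dir.x, dir.y);
    return 0;
}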
The learning algorithms that I am aware of try to look for the settings for a couple of variables that optimize some outcome. Like, what should my throttle and steering values be such that I turn this corner without falling too far off the path? Or, what percentage of income should I dedicate to defensive units such that my base doesn't get destroyed in the first 5 minutes of the game? There are a million such questions which can be posed in terms of learning, but searching for solutions takes lots of data and lots of time: much more than a few games against a human opponent would provide. I think it's better to search for these settings during the game development process and put the results directly into the code or into configuration files.
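As a minimal sketch of that offline-tuning idea (everything here is hypothetical: the evaluate() function just stands in for running automated test games, and the parameter name is made up), one value gets swept during development and the winner is baked into a config file rather than learned against a human:

#include <cstdio>

// Placeholder for "run N automated games with this defense budget and return
// an average survival time". In reality this is the expensive part.
double evaluate(double defenseFraction) {
    double d = defenseFraction - 0.35;        // pretend survival peaks near 35%
    return 300.0 - 2000.0 * d * d;
}

int main() {
    double bestFraction = 0.0, bestScore = -1e9;
    for (int i = 0; i <= 20; ++i) {           // coarse 5% steps over [0, 1]
        double f = i * 0.05;
        double score = evaluate(f);
        if (score > bestScore) { bestScore = score; bestFraction = f; }
    }
    // The winning value gets written into the shipped configuration; nothing
    // is searched for at run time against the player.
    std::printf("defense_fraction = %.2f  (score %.1f)\n", bestFraction, bestScore);
    return 0;
}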
What changes/enhancements are being made to BZII's AI and unit behaviors? What's the next big thing for AI in games?
In BZII, pathing stays farther away from cliffs, so units are less likely to get stuck going around corners. Buildings that get built or destroyed during a game are avoided at the pathing level instead of the collision avoidance level, so units are much better at navigating bases. George has made fighting much more interesting with more attack modes, automatic weapon reload and repair, and tactics based on skill level.
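A hedged sketch of the "handle buildings at the pathing level" idea follows; the grid representation and names are assumptions made for illustration, not BZII's actual pathing database:

#include <cstdio>
#include <vector>

// Hypothetical path map: when a building is constructed or destroyed, its
// footprint is stamped into (or cleared from) the map, so the planner routes
// around it from the start instead of leaving it to collision avoidance.
class PathMap {
public:
    PathMap(int w, int h) : width_(w), height_(h), blocked_(w * h, false) {}

    void setFootprint(int x0, int y0, int x1, int y1, bool blocked) {
        for (int y = y0; y <= y1; ++y)
            for (int x = x0; x <= x1; ++x)
                if (inBounds(x, y)) blocked_[y * width_ + x] = blocked;
    }

    bool isBlocked(int x, int y) const {
        return !inBounds(x, y) || blocked_[y * width_ + x];
    }

private:
    bool inBounds(int x, int y) const {
        return x >= 0 && x < width_ && y >= 0 && y < height_;
    }
    int width_, height_;
    std::vector<bool> blocked_;
};

int main() {
    PathMap map(64, 64);
    map.setFootprint(10, 10, 14, 13, true);   // a factory just finished building
    std::printf("cell (12,11) blocked: %d\n", map.isBlocked(12, 11));
    map.setFootprint(10, 10, 14, 13, false);  // the factory was destroyed
    std::printf("cell (12,11) blocked: %d\n", map.isBlocked(12, 11));
    return 0;
}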
I don't know about the next big thing, but I think we will see group tactics become much more elaborate. Along with that, we will see the AI in games giving you more visual and auditory cues as to what it's thinking. So even if it is not any more intelligent, it will appear more intelligent because you can see or hear what it's thinking.
Why do units in strategy games sometimes 'get stuck' or run into things they're not supposed to? Are there some units/objects that are more problematic in this way than others? If yes, which ones and why?
Units can get stuck for many reasons. They might plan a path through a building because the building didn't get added to the pathing database. They might get stuck between two vehicles because the collision avoidance code thinks there is room between them when there really isn't, or because it doesn't know how to plan around more than one obstacle at a time. A big unit might get stuck going around a corner because the path it is trying to follow is too close to a wall. A slow unit might get stuck going up a hill because the path following code doesn't set the throttle high enough to get up the hill.
Big vehicles are the biggest problem. Path planning is done with line segments. I try to keep these segments from being too close to walls, but sometimes vehicles are too big to follow the path without getting stuck on the walls. Another problem is units that can't turn in place. If a unit is facing the wrong direction to follow the path it is given, it must be able to turn towards the desired direction without straying too far from the path.
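To illustrate that clearance problem (the names and geometry here are hypothetical, not the shipped path code): a path segment is only safe for a vehicle if every nearby wall point is farther away than the vehicle's radius, so a segment a small scout can follow may leave a big tank stuck on the wall:

#include <algorithm>
#include <cmath>
#include <cstdio>
#include <vector>

struct P { float x, y; };

// Shortest distance from wall point w to the path segment a-b.
float distToSegment(P w, P a, P b) {
    float abx = b.x - a.x, aby = b.y - a.y;
    float len2 = abx * abx + aby * aby;
    float t = len2 > 0.0f ? ((w.x - a.x) * abx + (w.y - a.y) * aby) / len2 : 0.0f;
    t = std::clamp(t, 0.0f, 1.0f);
    float dx = w.x - (a.x + t * abx), dy = w.y - (a.y + t * aby);
    return std::sqrt(dx * dx + dy * dy);
}

// A segment is usable by a unit only if it clears every wall sample by at
// least the unit's radius.
bool segmentClear(P a, P b, const std::vector<P>& walls, float unitRadius) {
    for (const P& w : walls)
        if (distToSegment(w, a, b) < unitRadius) return false;
    return true;
}

int main() {
    std::vector<P> walls = {{5.0f, 1.5f}};            // a wall corner near the path
    P a{0.0f, 0.0f}, b{10.0f, 0.0f};                  // one planned path segment
    std::printf("scout clears: %d, heavy tank clears: %d\n",
                segmentClear(a, b, walls, 1.0f),      // small radius squeezes by
                segmentClear(a, b, walls, 2.5f));     // big radius gets stuck
    return 0;
}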
Is it really that much harder to give units airplane flight type movement than to just have hovering/floating units?
It's not too hard to give units flying physics and pathing. But it makes strategy more difficult because it is harder to build squads that can match up correctly against the air/land composition of the enemy. It is also harder to attack in 3D. We didn't have to deal with height differences much in BZ1.
What was the toughest programming challenge in the original Battlezone?
Path following and collision avoidance are the biggest challenge in BZ1, BZ2, ... BZn. It is a tough problem, and people need it to work to play the game well, so it is quite obvious when it doesn't work.
What game (other than original Battlezone) do you feel had really well developed AI?
I only have time to play a few games a year, so this is not exactly a thorough survey of the state of the art. I am impressed that the AI can drive the cars in Gran Turismo so well. I like the text parsers in the Infocom games that can understand sentences like "pick up all except bomb" and "read it", where "it" refers to something mentioned in an earlier sentence. I like that StarCraft can send appropriate forces for an attack even though the races are fairly different.