Here are some project ideas. Each project title is color-coded to indicate an assessment of how technically challenging it is. Green indicates no significant technical risk, but as a result, to gain high marks, the project should be more substantial and professional than riskier alternatives. Orange indicates moderate technical risk, and red indicates a project that potentially contains more research elements. To obtain high marks on the riskiest projects, you still need to produce something that works and is significant, but you will only be expected to take the first small steps.
As a game, the tag game is pretty dull. Add AI elements to make the game more fun. While you may want to add some additional gameplay elements, your focus should be on the AI. For example, adding better pursuit and evasion behavior, or an emotional model for when the NPCs are mad about being tagged and want revenge, are both legitimate enhancements. Adding fancy graphics just for the sake of it will not earn you marks. However, if the additional graphics have some impact on the characters that results in interesting behavior from an AI perspective, then that will earn credit. For example, just adding exploding obstacles will not get you additional marks, but if the obstacles flash first (say) and the NPCs therefore grow more fearful of them, then that will be suitably rewarded.
You may also want to seriously consider fixing the collision detection and response, both between the circular objects themselves and with the sides of the world.
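As a starting point, a fix along these lines might look like the following sketch. It assumes each object is a disc of equal mass with a position, velocity, and radius; the tuple-based API is illustrative, not the tag game's actual interface:

```python
import math

def resolve_circle_collision(p1, v1, r1, p2, v2, r2):
    """Detect and resolve a collision between two equal-mass discs.

    Positions and velocities are (x, y) tuples. Returns updated
    (p1, v1, p2, v2); everything is unchanged if the discs do not overlap."""
    dx, dy = p2[0] - p1[0], p2[1] - p1[1]
    dist = math.hypot(dx, dy)
    if dist == 0.0 or dist >= r1 + r2:
        return p1, v1, p2, v2
    nx, ny = dx / dist, dy / dist              # collision normal
    # Push the discs apart so they no longer overlap.
    overlap = (r1 + r2 - dist) / 2.0
    p1 = (p1[0] - nx * overlap, p1[1] - ny * overlap)
    p2 = (p2[0] + nx * overlap, p2[1] + ny * overlap)
    # Exchange the velocity components along the normal (elastic, equal mass).
    u1 = v1[0] * nx + v1[1] * ny
    u2 = v2[0] * nx + v2[1] * ny
    v1 = (v1[0] + (u2 - u1) * nx, v1[1] + (u2 - u1) * ny)
    v2 = (v2[0] + (u1 - u2) * nx, v2[1] + (u1 - u2) * ny)
    return p1, v1, p2, v2

def clamp_to_world(p, v, r, width, height):
    """Reflect a disc off the axis-aligned sides of the world."""
    x, y = p
    vx, vy = v
    if x - r < 0:      x, vx = r, abs(vx)
    if x + r > width:  x, vx = width - r, -abs(vx)
    if y - r < 0:      y, vy = r, abs(vy)
    if y + r > height: y, vy = height - r, -abs(vy)
    return (x, y), (vx, vy)
```

A fuller treatment would also handle fast-moving objects that tunnel through each other between frames.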
Game developers often like to make their AI scriptable in some interpreted language. This allows them to experiment with new behavior on the fly and potentially makes tweaking the AI accessible to level designers and other non-programmers. So the goal of this project is to embed and extend an interpreted language in the tag game, and then be able to use it to add and change behavior on the fly while the game is running.
Because it is so lightweight, Lua is a popular interpreted language for AI scripting in the games industry. But you are free to use any other suitable scripting language, such as Python or Scheme.
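To illustrate the idea (this is not a Lua embedding), here is a minimal sketch in Python that treats script source text as the embedded language and hot-swaps an NPC's update function at runtime; the NPC class and the update entry point are hypothetical:

```python
class ScriptedController:
    """Minimal sketch of a scripted controller: behavior code lives in
    a source string (in practice, a file watched for changes) and is
    re-compiled on the fly, so it can be edited while the game runs.
    The script is assumed to define an update(npc) function."""

    def __init__(self, source):
        self.load(source)

    def load(self, source):
        env = {}
        exec(source, env)            # compile and run the script
        self.update = env["update"]  # grab the entry point it defined

    def step(self, npc):
        self.update(npc)

# Hypothetical NPC with just a speed attribute, for illustration.
class NPC:
    def __init__(self):
        self.speed = 1.0

npc = NPC()
ctrl = ScriptedController("def update(npc):\n    npc.speed *= 2\n")
ctrl.step(npc)                       # npc.speed is now 2.0
ctrl.load("def update(npc):\n    npc.speed = 0\n")  # hot-swap the behavior
ctrl.step(npc)                       # npc.speed is now 0
```

With Lua the structure is the same: load a chunk into the interpreter state, then call the function it registers from your game loop.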
Add some simple animations to the tag game and make some AI controllers that take account of which animation is currently playing and which animations could be used next. Depending on how sophisticated you want to make the project, the code at the Geometric Tools website might help you deal with animating 3D models.
Implement a collection of steering behaviors. You will need to greatly expand the map to show off all the behaviors. To get high marks you will need to make the implementation efficient and be able to show off a large number of boids on screen at once. The code at the OpenSteer project should help you a lot.
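For reference, a single steering behavior in the classic Reynolds style can be sketched as follows; the max_speed and max_force defaults are illustrative, not values from the tag game:

```python
import math

def seek(pos, vel, target, max_speed=2.0, max_force=0.5):
    """Reynolds-style 'seek': steer toward the target at max_speed.

    pos, vel, target are (x, y) tuples. Returns the steering force,
    truncated to max_force so turns are not instantaneous."""
    dx, dy = target[0] - pos[0], target[1] - pos[1]
    d = math.hypot(dx, dy)
    if d == 0.0:
        return (0.0, 0.0)
    desired = (dx / d * max_speed, dy / d * max_speed)   # full speed at target
    steer = (desired[0] - vel[0], desired[1] - vel[1])
    mag = math.hypot(*steer)
    if mag > max_force:
        steer = (steer[0] / mag * max_force, steer[1] / mag * max_force)
    return steer

def flee(pos, vel, threat, **kw):
    """'Flee' is seek with the target reflected through the agent."""
    away = (2 * pos[0] - threat[0], 2 * pos[1] - threat[1])
    return seek(pos, vel, away, **kw)
```

Behaviors like arrive, pursue, wander, and separation all follow the same pattern of computing a desired velocity and steering toward it, so a shared base makes the collection easy to grow.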
Implement a finite-state machine class (FSM) and use it to implement some simple controllers in the tag game. You might want to consider making a GUI for building the FSMs.
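A table-driven FSM along these lines might be sketched as below; the wander/chase states and the event names are hypothetical examples, not part of the existing tag game:

```python
class FSM:
    """Minimal table-driven finite-state machine sketch.

    transitions maps (state, event) -> new state; on_enter optionally
    maps a state to an action that fires when the state is entered."""

    def __init__(self, initial, transitions, on_enter=None):
        self.state = initial
        self.transitions = transitions
        self.on_enter = on_enter or {}

    def handle(self, event):
        key = (self.state, event)
        if key in self.transitions:        # unknown events are ignored
            self.state = self.transitions[key]
            action = self.on_enter.get(self.state)
            if action:
                action()
        return self.state

# Hypothetical tag-game controller: wander until tagged, chase until
# a tag is made.
fsm = FSM("wander", {
    ("wander", "tagged"): "chase",
    ("chase", "made_tag"): "wander",
})
fsm.handle("tagged")    # state is now "chase"
```

Because the machine is just a dictionary, a GUI editor only needs to emit (state, event, state) triples, which keeps the FSM-building tool decoupled from the game code.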
The goal of this project is to write a controller for the NPCs that uses path planning. To demonstrate your path planner in the tag game you should make the game world more maze-like. You will also need to discretize the world, either by using a grid, waypoints, or a navigation mesh.
There is lots of path planning code available on the internet that can help you, but one of the most widely used in the games industry is the code associated with the article "Simplified 3D Movement and Pathfinding Using Navigation Meshes" by Greg Snook that appeared in the book Game Programming Gems I.
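If you choose a grid discretization, the core planner could be a standard A* search. The sketch below assumes a 4-connected grid of 0 (free) / 1 (blocked) cells with a Manhattan-distance heuristic; it is not the navigation-mesh approach from the article:

```python
import heapq

def astar(grid, start, goal):
    """A* over a 4-connected grid. start and goal are (row, col) cells.

    Returns the shortest path as a list of cells from start to goal,
    or None if no path exists."""
    rows, cols = len(grid), len(grid[0])
    h = lambda c: abs(c[0] - goal[0]) + abs(c[1] - goal[1])  # admissible
    open_set = [(h(start), start)]          # heap of (f-cost, cell)
    came_from = {start: None}
    g_cost = {start: 0}
    while open_set:
        _, cell = heapq.heappop(open_set)
        if cell == goal:                    # reconstruct by walking parents
            path = []
            while cell is not None:
                path.append(cell)
                cell = came_from[cell]
            return path[::-1]
        r, c = cell
        for nb in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            nr, nc = nb
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                ng = g_cost[cell] + 1
                if ng < g_cost.get(nb, float("inf")):
                    g_cost[nb] = ng
                    came_from[nb] = cell
                    heapq.heappush(open_set, (ng + h(nb), nb))
    return None
```

The same search works unchanged over waypoints or navigation-mesh polygons; only the neighbor generation and the heuristic change.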
Adding waypoints to a map by hand can be tedious. Implement a way of creating them automatically. A simple approach is to have an NPC run around randomly, intermittently laying down points at their current location. There is usually then a post-processing step to remove redundant points. You can get additional marks for supplying an editor to tweak the waypoints, or for implementing a more sophisticated approach in the first place.
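One variant of the random approach is sketched below: instead of a wandering NPC it samples random positions directly (the wandering NPC amounts to a biased version of the same sampler), and it folds redundancy removal into the sampling loop by rejecting points too close to an already-kept point. The is_free callback standing in for the world's collision query is an assumption:

```python
import math
import random

def generate_waypoints(is_free, width, height,
                       n_samples=500, min_gap=2.0, seed=0):
    """Scatter random sample points over the world, keep those in free
    space, and prune any point closer than min_gap to a kept point.

    is_free(x, y) is assumed to query the world's collision data."""
    rng = random.Random(seed)
    kept = []
    for _ in range(n_samples):
        x, y = rng.uniform(0, width), rng.uniform(0, height)
        if not is_free(x, y):
            continue                    # inside an obstacle: discard
        if all(math.hypot(x - px, y - py) >= min_gap for px, py in kept):
            kept.append((x, y))         # far enough from all kept points
    return kept
```

A post-processing pass would then connect mutually visible waypoints into the graph your path planner searches.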
Add a commentator to the tag game. The commentator should make pertinent and relevant remarks as the game unfolds. You are welcome to use canned speech or a text-to-speech engine.
If you want to take the project even further, consider allowing the NPCs to appear to "talk" to each other using the same trick Will Wright used for his Stupid Fun Club project.
It is often said that one of the hardest pieces of AI code to write for a game is the camera controller. The first part of this project would be to create a 3D version of the tag game. You would then write a camera controller that "did the right thing" depending on what was happening in the game. You could also provide various user-selectable camera controls, such as a first-person view, an overhead view, an "intelligent" camera view, etc.
Building on the introductory assignment, implement particle filtering in the tag game along the lines described in this paper. Also take a look at the animation that accompanies the paper to see how you might consider visualizing the result.
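Since the paper's exact formulation isn't reproduced here, the following is only a generic sketch of one predict/weight/resample cycle of a basic (bootstrap) particle filter; the move and observe_likelihood callbacks are placeholders for whatever motion and observation models you use (e.g. tracking an opponent's unseen position):

```python
import random

def particle_filter_step(particles, move, observe_likelihood, rng=random):
    """One predict/weight/resample cycle of a bootstrap particle filter.

    particles: list of state samples; move(state) samples the motion
    model; observe_likelihood(state) scores how well a state explains
    the current observation."""
    # Predict: push every particle through the motion model.
    predicted = [move(p) for p in particles]
    # Weight: score each particle against the observation.
    weights = [observe_likelihood(p) for p in predicted]
    total = sum(weights)
    if total == 0:
        return predicted  # observation ruled everything out; keep guesses
    # Resample: draw a new population in proportion to the weights.
    return rng.choices(predicted, weights=weights, k=len(predicted))
```

For visualization, drawing each particle as a faint dot gives exactly the kind of "cloud of hypotheses" animation the paper's accompanying material shows.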
Incorporate a speech recognition engine (e.g. the CMU Sphinx code) into the tag game so that you can control characters with voice commands. Consider making the voice commands context-sensitive.
Try to learn a controller rather than coding one. You can try learning one from scratch using reinforcement learning, or from data using supervised learning (the Weka data-mining software may help you).
Ideally, the resulting controller should be able to demonstrate some ability to generalize to unseen cases.
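As one concrete example of learning a controller from scratch, a tabular Q-learning loop might look like the sketch below. The corridor world is a toy stand-in for the tag game, and all parameter values are illustrative:

```python
import random

def q_learn(n_states, n_actions, step, episodes=500, alpha=0.5,
            gamma=0.9, epsilon=0.1, seed=0):
    """Tabular Q-learning sketch. step(state, action) is assumed to
    return (next_state, reward, done). Returns the learned Q-table."""
    rng = random.Random(seed)
    Q = [[0.0] * n_actions for _ in range(n_states)]
    for _ in range(episodes):
        s, done = 0, False
        while not done:
            if rng.random() < epsilon:                       # explore
                a = rng.randrange(n_actions)
            else:                                            # exploit
                a = max(range(n_actions), key=lambda i: Q[s][i])
            s2, r, done = step(s, a)
            target = r if done else r + gamma * max(Q[s2])
            Q[s][a] += alpha * (target - Q[s][a])            # TD update
            s = s2
    return Q

# Toy 'corridor' world: states 0..3; action 1 moves right, action 0
# moves left; reaching state 3 yields reward 1 and ends the episode.
def corridor(s, a):
    s2 = min(3, s + 1) if a == 1 else max(0, s - 1)
    return s2, (1.0 if s2 == 3 else 0.0), s2 == 3
```

In the tag game the states would come from discretized features (relative positions, who is "it"), which is also where generalization to unseen cases becomes the interesting problem.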
Try to create a controller for an NPC that can learn while the game is being played.
You are welcome to "fake it", by just giving the appearance of learning online. But you will obviously then be expected to produce more impressive in-game behavior.
If you don't fake it, you can begin by simply using statistics you gather to alter behavior. Beyond that, see how far you can get. But the learning must be fast and noticeable in real time.
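A minimal example of the simple-statistics approach: count the player's observed escape directions and predict the most common one, so the pursuer can cut off that route. Everything here (the class, the direction labels) is hypothetical:

```python
from collections import Counter

class EscapePredictor:
    """Online adaptation from simple statistics: count which direction
    the player escapes in each time an NPC closes in, and predict the
    most frequent one. The counts update on every observation, so the
    NPC's behavior visibly adapts while the game runs."""

    def __init__(self):
        self.counts = Counter()

    def observe(self, direction):
        self.counts[direction] += 1

    def predict(self):
        if not self.counts:
            return None                     # no data yet
        return self.counts.most_common(1)[0][0]

pred = EscapePredictor()
for d in ["left", "left", "right", "left"]:
    pred.observe(d)
# pred.predict() now favors "left", the player's habitual escape route
```

Decaying old counts over time would make the predictor respond to a player who changes tactics, which is the kind of fast, noticeable adaptation this project asks for.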
Last edited Thu Jul 13 22:14:46 2006 by John Funge.