Good looks, the video-games industry is discovering, will get you only so far. The graphics on a modern game may far outstrip the pixellated blobs of the 1980s, but there is more to a good game than eye candy. Photo-realistic graphics make the lack of authenticity in other aspects of gameplay all the more apparent. It is not enough for game characters to look better; their behavior must also be more sophisticated, say researchers working at the interface between gaming and artificial intelligence (AI).

Today's games may look better, but the gameplay is 'basically the same' as it was a few years ago, says Michael Mateas, the founder of the Experimental Game Lab at the Georgia Institute of Technology. AI, he suggests, offers an 'untapped frontier' of new possibilities. 'We are topping out on the graphics, so what's going to be the next thing that improves gameplay?' asks John Laird, director of the AI lab at the University of Michigan. Improved AI is a big part of the answer, he says.

Those in the industry agree. The high-definition graphics possible on next-generation games consoles, such as Microsoft's Xbox 360, are raising expectations across the board, says Neil Young of Electronic Arts, the world's biggest games publisher. 'You have to have high-resolution models, which requires high-resolution animation', he says, 'so now I expect high-resolution behavior'.

Representatives from industry and academia will converge in Marina del Rey, California, later this month for the second annual Artificial Intelligence and Interactive Digital Entertainment (AIIDE) conference. The aim, says Dr. Laird, who will chair the event, is to increase the traffic of people and ideas between the two spheres.

'Games have been very important to AI through the years', he notes. Alan Turing, one of the pioneers of computing in the 1940s, wrote a simple chess-playing program before there were any computers to run it on; he also proposed the Turing test, a question-and-answer game that is a yardstick for machine intelligence. Even so, AI research and video games existed in separate worlds until recently. The AI techniques used in games were very simplistic from an academic perspective, says Dr. Mateas, while AI researchers were, in turn, clueless about modern games. But, he says, 'both sides are learning, and are now much closer'.

Consider, for example, the software that controls an enemy in a first-person shooter (FPS), a game in which the player views the world along the barrel of a gun. The behavior of enemies used to be pre-scripted: wait until the player is nearby, pop up from behind a box, fire weapon, and then roll and hide behind another box, for example. But some games now use far more advanced 'planning systems' imported from academia. 'Instead of scripts and hand-coded behavior, the AI monsters in an FPS can reason from first principles', says Dr. Mateas. They can, for example, work out whether the player can see them or not, seek out cover when injured, and so on; a minimal sketch of this contrast appears below. 'Rather than just moving between predefined spots, the characters in a war game can dynamically shift, depending on what's happening', says Fiona Sperry of Electronic Arts.

If the industry is borrowing ideas from academia, the opposite is also true. Commercial games such as 'Unreal Tournament', which can be easily modified or scripted, are being adopted as research tools in universities, says Dr. Laird. Such tools provide flexible environments for experiments, and also mean that students end up with transferable skills.
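To make the contrast concrete, here is a minimal sketch of the two approaches: a fixed script versus a toy goal-oriented planner that searches for an action sequence from the current situation. The world-state facts, action names, and preconditions are all invented for illustration and do not come from any real game engine.

```python
# Old style: a pre-scripted enemy that replays the same canned sequence.
def scripted_enemy(player_nearby):
    if player_nearby:
        return ["pop_up", "fire_weapon", "roll_to_cover"]
    return ["wait"]

# Newer style: a toy goal-oriented planner. Each action has preconditions
# and effects over boolean world-state facts; the planner searches for any
# sequence of actions whose effects satisfy the goal. (All names here are
# hypothetical, chosen to echo the behaviors described in the article.)
ACTIONS = {
    "take_cover":  ({"cover_available": True}, {"in_cover": True}),
    "aim":         ({"player_visible": True},  {"aimed": True}),
    "fire_weapon": ({"aimed": True},           {"player_suppressed": True}),
}

def plan(state, goal, depth=5):
    """Depth-first search for a sequence of actions that reaches `goal`."""
    if all(state.get(k) == v for k, v in goal.items()):
        return []                      # goal already satisfied
    if depth == 0:
        return None                    # out of search budget
    for name, (pre, eff) in ACTIONS.items():
        if all(state.get(k) == v for k, v in pre.items()):
            new_state = {**state, **eff}
            if new_state == state:
                continue               # action makes no progress; avoid loops
            tail = plan(new_state, goal, depth - 1)
            if tail is not None:
                return [name] + tail
    return None

# An injured enemy reasons from the current situation instead of replaying
# a script: it works out that it must reach cover and suppress the player.
state = {"cover_available": True, "player_visible": True, "injured": True}
print(plan(state, {"in_cover": True, "player_suppressed": True}))
# -> ['take_cover', 'aim', 'fire_weapon']
```

The scripted version always does the same thing; the planner version derives a fresh action sequence whenever the world state changes, which is the essence of the 'reasoning from first principles' the researchers describe.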
But the greatest potential lies in combining research with game development, argues Dr. Mateas. 'Only by wrestling with real content are the technical problems revealed, and only by wrestling with technology do you get insight into what new kinds of content are possible', he says.