Rob Davis

Creative Director


Rob Davis is a Design Director and Creative Director with over 20 years of experience across AAA, mobile, PC, and console. His recent work focuses on the intersection of AI and game design: as a Creative Director he uses AI tools to prototype games and is researching new ways to design NPCs. As Studio Design Director at TreesPlease Games he integrated AI tools into design and content pipelines; as Design Director at Take-Two he designed Lucasfilm-approved AI systems for a Star Wars title; and as Creative Director at Playniac he created game AIs for the BBC and Channel 4. Rob has directed gameplay on titles ranging from Star Wars: Hunters (NaturalMotion/Zynga/Take-Two) to the award-winning Longleaf Valley, founded the indie studio Playniac, and has served as a BAFTA Games Awards juror and a top-50-rated speaker at GDC. He lives in London.

Rob Davis is speaking at the following session

AI Wants to Play: A Game Designer’s Guide to AI in Every Stage of Development

AI is reshaping game development — but beyond the hype, what actually works in practice? This talk is a hands-on tour through every stage where AI can help, from initial prototyping to runtime gameplay, drawn from the speaker's recent work as a Studio Design Director owning the AI roadmap and a Creative Director using AI to prototype and ship games.

We start with AI as a design tool. Using platforms like Lovable, Claude, and Gemini, the speaker prototyped a suite of games in weeks rather than months — rapidly iterating on everything from mechanics to visual design. We'll cover audience modelling, gameplay videos and mockups, and provide an honest comparison of AI tools that actually matter to game teams today.

Next, we look at AI in the game itself. A custom card game system — inspired by DeepMind's Agent57 — trains a single 12,000-parameter neural network to master multiple card games using universal strategic features, evolving from PPO to Deep Monte Carlo with league training. Meanwhile, Rumour Mill and the Whispers show how LLMs can generate characters, dialogue, and even core mechanics at runtime.
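For a sense of scale, a 12,000-parameter network over universal features is a very small fully connected stack. The sketch below is purely illustrative: the talk cites the 12K parameter count and the universal strategic features, but the layer sizes, feature count, and action-head width here are assumptions chosen only so the total lands near 12K, not the speaker's actual architecture.

```python
import numpy as np

# Assumed shapes: 27 strategic features in, 32 action logits out,
# three hidden layers of 64. Total parameters work out to ~12K.
LAYER_SIZES = [27, 64, 64, 64, 32]

def param_count(sizes):
    """Weights plus biases for a plain fully connected stack."""
    return sum(i * o + o for i, o in zip(sizes, sizes[1:]))

def forward(x, weights, biases):
    """ReLU MLP forward pass; the final layer is left as raw logits."""
    for i, (w, b) in enumerate(zip(weights, biases)):
        x = x @ w + b
        if i < len(weights) - 1:
            x = np.maximum(x, 0.0)
    return x

print(param_count(LAYER_SIZES))  # prints 12192, i.e. ~12K parameters
```

A network this small evaluates in microseconds on a CPU, which is part of why a feature-based model can be orders of magnitude cheaper than brute-force alternatives.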

We also examine shipped games pushing the frontier: Meaning Machine's Dead Meat, and Jam & Tea's Retail Mage, plus a look at DeepMind's Agent57 and SIMA 2 projects to contextualise where the research is heading.

Every example comes with honest results — what worked, what failed, and what's worth your time. Attendees leave with a practical framework for adopting AI across their own studios.

Session Takeaway

  • A practical decision framework for choosing the right AI tool and technique at every stage of development — from prototyping with Lovable and Claude, to modelling audiences with TinyTroupe or Gems, to choosing between RL, LLMs, and heuristics for your runtime NPCs.
  • How to use AI to radically accelerate game prototyping — with an honest comparison of tools (Lovable, Claude Desktop, Claude Code, Gemini) and real case studies showing what each is best at, so you can adopt the right tools for your team immediately.
  • How a single lightweight AI model can learn to play multiple games — a concrete architecture (27 universal features, 12K parameters) that's 196× smaller and 45× faster than brute-force approaches, including the evolution from PPO to DMC and the practical lessons from training failures.
  • Design principles for LLM-powered NPCs that players actually enjoy — drawn from shipped games (Dead Meat, Retail Mage) and the speaker's own prototypes, including why AI dialogue needs a strong authorial foundation, how to handle latency and too-capable NPCs, and when AI imperfection becomes the mechanic.
  • An AI tools landscape overview you can take back to your studio — covering prototyping, content generation, audience modelling, and runtime AI, with clear guidance on what's ready for production, what's experimental, and what's still hype.
