Video games never stop changing, and we wouldn't have it any other way. Evolve explores everything that is new or cutting edge in game development across all disciplines - from new technologies and new trends to new business models. Plus there's some crystal ball gazing and debate on how to create the best new game experiences for the next decade. Evolve sessions are now included within the main conference programme across Tuesday, Wednesday and Thursday.

Evolve sessions

'Virtual Insanity' - Unity Meets Immersive Storytelling: A Filmmaker's First Experience of 360/VR and Unity

Tuesday 9th July: 17.00 - 17.45 : Room 5

Richard describes his experience making a 360/VR music video for UK D&B star Harry Shotta and his track 'Virtual Insanity', spending 18 months learning 360 shooting, motion capture and 360 post-production, and working with Unity and VR legend OliVR. What did he make? What were the lessons learned? How can game devs and filmmakers collaborate effectively? A newbie's journey to the Immersive Frontier.

  • Making a 360 film is actually really complicated
  • The potential of collaborating with filmmakers on Immersive Projects
  • Gamedevs will inherit the earth

An Overview of Stadia Game Development

Wednesday 10th July: 14.00 - 14.45 : Room 2

Come learn the basics of what it means to make a game for Google's new game platform: Stadia. From the inputs and outputs of Stadia to how your team can work in a cloud development environment, Sam Corcoran from Google brings details to Develop.

An understanding of how game development works on Stadia and how you can get involved.

ARtyFax: An Augmented Reality Game Designed for and Created by Children

Wednesday 10th July: 14.00 - 14.45 : Room 4

ARtyFax, designed as a term-long school project, turns research about local heritage sites into AR location-based games that are played on smartphones at the sites. During the project, children create a story based on a historical character associated with the site and turn it into a series of videos. Each video is assigned to a specific location at the site. Children also make a model of a building or structure that would have been found at the site and turn it into a digital 3D representation using a process known as photogrammetry (you take photos of the model from all angles and software makes it AR-ready). When playing the game, the user follows a trail to locations in sequence, triggering the playing of the videos (and thus telling the story). A projection of the model is revealed in AR once all the videos have been played. In the session I will discuss how ARtyFax was conceived during a Design Sprint and how it has developed since then.

  • How a Design Sprint turned an idea about us creating a game into a game created by children
  • How we made a child-friendly process for creating augmented reality objects
  • How it's really, really difficult to get your games into schools even if you're doing it for free
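The trail mechanic the description outlines (videos unlock in sequence, and the photogrammetry model appears only once every video has played) can be sketched in a few lines. This is an illustrative sketch only, not the project's actual code; the location and video names are invented placeholders.

```python
# A minimal sketch of the location-trail logic described above: visiting the
# sites in sequence unlocks each video, and the AR model appears only once
# every video has been played. Location names are illustrative placeholders.

class HeritageTrail:
    def __init__(self, stops):
        self.stops = stops          # ordered list of (location, video) pairs
        self.next_index = 0         # which stop the player must reach next

    def arrive_at(self, location):
        """Trigger the video for this stop, but only if it is next in the trail."""
        if self.next_index < len(self.stops) and self.stops[self.next_index][0] == location:
            video = self.stops[self.next_index][1]
            self.next_index += 1
            return f"play {video}"
        return None  # wrong stop, or trail already finished

    def show_ar_model(self):
        """The photogrammetry model is projected only after all videos play."""
        return self.next_index == len(self.stops)

trail = HeritageTrail([("gatehouse", "intro.mp4"), ("keep", "story.mp4")])
trail.arrive_at("keep")        # out of order: nothing plays
trail.arrive_at("gatehouse")   # plays intro.mp4
trail.arrive_at("keep")        # plays story.mp4
print(trail.show_ar_model())   # True: all videos played
```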

Interactive Machine Learning for Game Controls Customisation and More Expressive Interactions

Wednesday 10th July: 15.00 - 15.45 : Room 4

In this session I'll share some of the findings of my research into how machine learning can be applied in games and how it can heighten positive aspects of the player-computer interaction. An increasingly diverse variety of sensors is being incorporated into videogame systems, from game controllers to current VR/AR kits, yet there are no standard practices for designing sensor-based control schemes.

Interactive machine learning (IML) is a novel interaction paradigm in which users iteratively build machine learning models by providing examples. IML has been used successfully to let designers and end users fine-tune, or even design from scratch, wholly personal control schemes for interactive music and other applications. Could these techniques be used in games development? The research and technology presented in this session explore how using interactive machine learning in the design of sensor-based control schemes for digital games can improve the player experience, and how simple it can be to use. The technology can be used to quickly prototype controls by demonstrating human actions and the desired computer responses, without writing code! Players or designers may use interactive machine learning to design enjoyable control schemes for themselves and create unexpected playful interactions.

I will run through specific tools and examples with Unity 3D that we researched and developed with the help of a Google grant and presented in the GDC 2019 AI track. No matter your background, all code and examples will be open sourced for you to explore after the session!

  • Cutting-edge research in AI assisted game development
  • How machine learning can be applied in creative scenarios and videogames
  • Understanding the “machine learning pipeline” in game development
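The "show, don't code" workflow described above can be illustrated with a toy sketch: the model is just a set of recorded demonstrations, and a live sensor reading is mapped to the action of the nearest one. This uses a plain nearest-neighbour lookup as a stand-in; the actual tools and algorithms presented in the session may differ, and the sensor values and action names are invented for illustration.

```python
import math

# Toy interactive-machine-learning loop: the "model" is the set of recorded
# demonstrations, and classification is a nearest-neighbour lookup.
# Sensor vectors and action names are illustrative placeholders.

class DemonstrationMapper:
    def __init__(self):
        self.examples = []  # list of (sensor_vector, action) pairs

    def demonstrate(self, sensor_vector, action):
        """Record one example: 'when the sensors look like this, do that'."""
        self.examples.append((list(sensor_vector), action))

    def predict(self, sensor_vector):
        """Map a live sensor reading to the action of the closest example."""
        def dist(a, b):
            return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
        closest = min(self.examples, key=lambda ex: dist(ex[0], sensor_vector))
        return closest[1]

mapper = DemonstrationMapper()
# A designer demonstrates a few controller poses (tilt-x, tilt-y) and names them:
mapper.demonstrate([0.9, 0.0], "steer_right")
mapper.demonstrate([-0.9, 0.0], "steer_left")
mapper.demonstrate([0.0, 0.9], "accelerate")

print(mapper.predict([0.7, 0.1]))  # closest to the "steer_right" example
```

Because retraining is just appending another example, a player can keep demonstrating poses until the mapping feels right, which is the core of the iterative IML loop.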

Social VR from a Business, Technical and Design Point of View

Thursday 11th July: 14.00 - 14.45 : Room 5

This talk covers Cooperative Innovations' three years of R&D, content creation and learnings in the Social VR space, as well as the state of the market.

During the development of co-op RPG Raiders of Erda and ‘co-op yelling game’ Spaceteam, the team at Cooperative Innovations have created a set of tools and technologies that have helped them produce high-quality experiences with a relatively small team. 

Hear about how these tools came about and how they are used in day-to-day development.

The Xbox Adaptive Controller Story

Thursday 11th July: 12.15 - 13.00 : Room 2

It has been an amazing year since we announced the Xbox Adaptive Controller, and the response has overwhelmed the team. We’ll tell stories that haven’t been told, disclose things we have learned, and ask for help on where we should head next. We’ll also go through guidance that we give developers about creating games for folks with limited mobility.

Using VR to Train the Next Generation of Emergency Services

Thursday 11th July: 16.00 - 16.45 : Room 4

Training is an expensive business, especially for those in frontline emergency services. Because of the nature of their work, training in the real world is essential, but it can become very costly and time-consuming, with a varying level of quality. So what can we do as game developers to keep the quality of training high but the cost low? The answer: Virtual Reality. Over the course of 12 months we have created a Virtual Reality training platform for different emergency services. It allows individuals to train in several service roles (Police, Forensic, Fire), to current training curriculum levels, using tools that are common in game development. This presentation will use the project as a case study to explain how we adopted emerging technology, built our scenarios and tested them to complement current training methods.

  • Adopting emerging technologies
  • Building scenarios
  • Increasing believability

Video Games to Tackle Today's Biggest Scientific Challenges

Tuesday 9th July: 14.00 - 14.45 : Room 5

There is an increasing number of citizen science games. These games not only carry educational value, they also help solve various scientific problems, from finding ways to build a quantum computer to building diagnostic tools to detect dementia. In this talk, we will identify scientific problems that can be better solved by games, look at the different ways they can be integrated into existing games, and review a few of the design challenges they pose.

  • Learn how games can be a powerful platform for science
  • How they can engage players in research
  • How they can help cure diseases
  • How they can solve today's biggest challenges
