# Machine Learning for Games Course

## Docs

- [Publishing Schedule](https://huggingface.co/learn/ml-games-course/communication/publishing-schedule.md)
- [Discord 101](https://huggingface.co/learn/ml-games-course/unit0/discord101.md)
- [Onboarding](https://huggingface.co/learn/ml-games-course/unit0/setup.md)
- [The Course Syllabus](https://huggingface.co/learn/ml-games-course/unit0/syllabus.md)
- [How to get the most out of the course?](https://huggingface.co/learn/ml-games-course/unit0/how-to-get-most.md)
- [Conclusion](https://huggingface.co/learn/ml-games-course/unit0/conclusion.md)
- [Welcome to the 🤗 Machine Learning for Games Course](https://huggingface.co/learn/ml-games-course/unit0/introduction.md)
- [Your Game Demo](https://huggingface.co/learn/ml-games-course/unit0/game-demo.md)
- [The Team](https://huggingface.co/learn/ml-games-course/unit0/who-are-we.md)
- [AI Voice Actors 🤖](https://huggingface.co/learn/ml-games-course/unit2/ai-voice-actors.md)
- [Sound Effects Generation 🔊](https://huggingface.co/learn/ml-games-course/unit2/sound-generation.md)
- [Conclusion](https://huggingface.co/learn/ml-games-course/unit2/conclusion.md)
- [Introduction](https://huggingface.co/learn/ml-games-course/unit2/introduction.md)
- [Texture Generation 🖼️](https://huggingface.co/learn/ml-games-course/unit2/texture-generation.md)
- [Animation Generation 💃](https://huggingface.co/learn/ml-games-course/unit2/animation-generation.md)
- [Code Assistants 👩‍💻](https://huggingface.co/learn/ml-games-course/unit2/code-assistants.md)
- [Music Generation 🎵](https://huggingface.co/learn/ml-games-course/unit2/music-generation.md)
- [2D Assets Generation 🎮](https://huggingface.co/learn/ml-games-course/unit2/2d-generation.md)
- [Classical AI in Unity](https://huggingface.co/learn/ml-games-course/unitbonus1/ai-in-unity.md)
- [AI Studies by GMTK (Game Maker Toolkit)](https://huggingface.co/learn/ml-games-course/unitbonus1/ai-gmtk.md)
- [Conclusion](https://huggingface.co/learn/ml-games-course/unitbonus1/conclusion.md)
- [Classical AI in Games](https://huggingface.co/learn/ml-games-course/unitbonus1/introduction.md)
- [Additional readings about the history of AI in Video Games](https://huggingface.co/learn/ml-games-course/unitbonus1/additional-readings.md)
- [Classical AI in Unreal Engine](https://huggingface.co/learn/ml-games-course/unitbonus1/ai-in-unreal.md)
- [AI 101 by AI and Games (Dr. Tommy Thompson)](https://huggingface.co/learn/ml-games-course/unitbonus1/ai-and-games.md)
- [The power of Sentence Similarity 🤖](https://huggingface.co/learn/ml-games-course/unit1/sentence-similarity-explained.md)
- [How to run an AI model: local vs remote](https://huggingface.co/learn/ml-games-course/unit1/local-vs-api.md)
- [What can you do now?](https://huggingface.co/learn/ml-games-course/unit1/next-steps.md)
- [What is Hugging Face 🤗?](https://huggingface.co/learn/ml-games-course/unit1/what-is-hf.md)
- [Conclusion](https://huggingface.co/learn/ml-games-course/unit1/conclusion.md)
- [Introduction](https://huggingface.co/learn/ml-games-course/unit1/introduction.md)
- [Let's build our smart robot NPC demo 🤖](https://huggingface.co/learn/ml-games-course/unit1/make-demo.md)
- [Make your own demo 🔥](https://huggingface.co/learn/ml-games-course/unit3/customize.md)
- [A deep dive on the NPC-Playground](https://huggingface.co/learn/ml-games-course/unit3/demo.md)
- [Conclusion](https://huggingface.co/learn/ml-games-course/unit3/conclusion.md)
- [Introduction](https://huggingface.co/learn/ml-games-course/unit3/introduction.md)
- [Step 2: Let's write the Game Design Document ✍️](https://huggingface.co/learn/ml-games-course/demo1/game-design-document.md)
- [Conclusion](https://huggingface.co/learn/ml-games-course/demo1/conclusion.md)
- [Let's define your Game idea 💡](https://huggingface.co/learn/ml-games-course/demo1/introduction.md)
- [Step 1: Crafting the Game Idea 💡](https://huggingface.co/learn/ml-games-course/demo1/game-idea.md)
- [Is this your first game? Watch these videos](https://huggingface.co/learn/ml-games-course/demo1/first-game.md)

### Publishing Schedule [[publishing-schedule]]
https://huggingface.co/learn/ml-games-course/communication/publishing-schedule.md

# Publishing Schedule [[publishing-schedule]]

The course is now **self-paced, and no new units will be added**. We **don't provide a Certificate of Completion for this course**.
But we continue to write tutorials on how to use AI in Games here 👉 https://thomassimonini.substack.com/



<EditOnGithub source="https://github.com/huggingface/making-games-with-ai-course/blob/main/units/en/communication/publishing-schedule.mdx" />

### Discord 101 [[discord-101]]
https://huggingface.co/learn/ml-games-course/unit0/discord101.md

# Discord 101 [[discord-101]] 

<img src="https://huggingface.co/datasets/huggingface-ml-4-games-course/course-images/resolve/main/en/unit0/discord101.jpg" alt="Discord" width="100%"/>

Discord is a free chat platform. The Hugging Face Community Discord server has more than 90K members, and you can **[join it with a single click here](https://discord.gg/hugging-face-879548962464493619)**.

Starting in Discord can be a bit intimidating, **so let me take you through it**.

You'll choose your interests when you [**sign up to our Discord server**](https://discord.gg/hugging-face-879548962464493619). Make sure to **click "ML for Game Development"**, and you'll get access to the "Game Dev" category containing all the course-related channels. If you feel like joining even more channels, go for it! 🚀

<img src="https://huggingface.co/datasets/huggingface-ml-4-games-course/course-images/resolve/main/en/unit0/role-discord.png" alt="ML for Game Development selection in Discord" />

Then click next, and you'll get to **introduce yourself in the `#introduce-yourself` channel**.

Here are all the ML for Game Development channels:

- `ml-4-games-dev`: where you can **discuss ML for game development.**
- `ml-4-games-i-made-this`: where you can **share your projects and models**.

Here's some advice:

- Don't hesitate **to participate often in the Discord channels**; it's how you'll get the most out of the course: meeting new people interested in AI in games, finding collaborators, and getting advice.
- There are **voice channels** you can use as well, although most people prefer text chat.
- You can **use Markdown formatting** in text chats, which is especially handy when you're sharing code. Sadly, Markdown does not work for links.
- Don't hesitate **to create threads** when starting a long conversation; it keeps the channels readable.
- We want to keep this **Discord** server safe for everyone, so if you see some bad behavior, don't hesitate to tag @admin.
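As a quick reference for the Markdown tip above, here is what code formatting looks like inside a Discord message (the snippet and language tag are just an illustration):

````text
Inline code: `transform.position`

Code block with syntax highlighting:
```csharp
Debug.Log("Hello from my NPC!");
```
````

Three backticks followed by a language name (like `csharp` or `python`) give you a highlighted code block; single backticks give inline code.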


<EditOnGithub source="https://github.com/huggingface/making-games-with-ai-course/blob/main/units/en/unit0/discord101.mdx" />

### Onboarding
https://huggingface.co/learn/ml-games-course/unit0/setup.md

# Onboarding

<img src="https://huggingface.co/datasets/huggingface-ml-4-games-course/course-images/resolve/main/en/unit0/onboard.jpg" alt="Onboard" width="100%"/>

After all this information, it's time **to get started**. We're going to do two things:

1. **Create your Hugging Face account** if you don't have one.
2. **Sign up to Discord and introduce yourself** (don’t be shy 🤗)

### **Let's create your Hugging Face account**

If you don't have one yet, create a Hugging Face account **[here](https://huggingface.co/join)**.

### **Let's join our Discord server**

You can now sign up for our Discord server. This is the place where you can **chat with the community, with us, and more**.

If this is your first time using Discord, we wrote a Discord 101 guide with best practices; check the next section.

👉🏻 Join our Discord server **[here](https://discord.gg/hugging-face-879548962464493619)**.

When you join, remember to introduce yourself in #introduce-yourself and sign up for ML-4-Game-Development in #role-assignments.

<img src="https://huggingface.co/datasets/huggingface-ml-4-games-course/course-images/resolve/main/en/unit0/role-discord.png" alt="ML for Games Discord Role Assignment" />

Here are all the ML for Game Development channels:

- `ml-4-games-dev`: where you can **discuss ML for game development.**
- `ml-4-games-i-made-this`: where you can **share your projects and models**.


Congratulations! **You've just finished the onboarding**. You're now ready to start learning how to make games with AI. We still have two sections in this unit with advice on how to get the best out of this course and how to use Discord.

### **Keep Learning, stay awesome 🤗**


<EditOnGithub source="https://github.com/huggingface/making-games-with-ai-course/blob/main/units/en/unit0/setup.mdx" />

### The Course Syllabus
https://huggingface.co/learn/ml-games-course/unit0/syllabus.md

# The Course Syllabus

<img src="https://huggingface.co/datasets/huggingface-ml-4-games-course/course-images/resolve/main/en/unit0/syllabus.jpg" alt="Syllabus" width="100%"/>

> [!TIP]
> The course is now **self-paced, and no new units will be added**. We **don't provide a Certificate of Completion for this course**.
> But we continue to write tutorials on how to use AI in Games here 👉 https://thomassimonini.substack.com/

## What the course looks like

The course is composed of:

- *A theory part*: where you learn the **concepts**.
- *A hands-on part*: where you'll work on demos we created to implement the AI.
- *Your demo*: which you'll build during the course!


## What this course covers 🗺️

This course will cover:

- How to use **AI models in your games**.
- How to use AI models with the **free Hugging Face Inference API**.
- How to use **AI models in Unity with Sentis**.
- A list of amazing **AI tools to help you create voices, sounds, music, textures, and assets**.


## What this course does not cover

This course **is not an introduction to game development**. You should have some skills in Unity or Unreal to follow this course effectively.

This course **is not an introduction to Machine Learning or AI**. You may find this course helpful if you're interested in the concepts behind language processing with ML: [Natural Language Processing with Transformers](https://huggingface.co/learn/nlp-course).


## Recommended Pace

Each unit in this course is designed to be completed **in 1 week**, with approximately 3-4 hours of work per week. 

However, you can take as much time as necessary to complete the course; **there are no deadlines**. If you want to dive deeper into a topic, we'll provide additional resources to help you do so.

The recommended pace is **one unit per week**. For the demo you need to make, we recommend **spending between two weeks and one month at most**.


<EditOnGithub source="https://github.com/huggingface/making-games-with-ai-course/blob/main/units/en/unit0/syllabus.mdx" />

### How to get the most out of the course?
https://huggingface.co/learn/ml-games-course/unit0/how-to-get-most.md

# How to get the most out of the course?

<img src="https://huggingface.co/datasets/huggingface-ml-4-games-course/course-images/resolve/main/en/unit0/advice.jpg" alt="Advice" width="100%"/>

Here are the **best practices to get the most out of the course**:

### 1. Join and Engage on our Discord Server

You can now sign up for our Discord server. This is the place where you can **chat with the community, with us, and more**.

👉🏻 Join our Discord server **[here](https://discord.gg/hugging-face-879548962464493619)**.

### 2. Modify our demos

While following our demos step by step is essential to learn, **don't hesitate to get hands-on by modifying them**. Experimentation is a powerful learning tool 🧪.

### 3. Share this course with your community ❤️ 

**Spread the love**: By sharing the course, **you help us reach more game developers who are interested in making games with AI**. 

If you like the course, **don't hesitate to star ⭐ [the course's repository](https://github.com/huggingface/making-games-with-ai-course)** . This helps us a lot 🤗.

You can also share the course on X (Twitter), LinkedIn, and other social media.

<img src="https://huggingface.co/datasets/huggingface-ml-4-games-course/course-images/resolve/main/en/unit0/thumbnail-com.jpg" alt="Thumbnail Communication" width="100%"/>

### 4. Share what you're doing online

Sharing your work online is a fantastic way to showcase your progress and skills. We encourage you to share your progress on your game demo during the course.

If you don't know how to start sharing, check this fantastic tutorial by [Buildspace's founder Farza](https://twitter.com/FarzaTV/)

<iframe width="560" height="315" src="https://www.youtube.com/embed/QLgOYMCSqeE?si=Y6dRhpfiI_puUoFJ&amp;start=731" title="YouTube video player" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share" allowfullscreen></iframe>


## I found a bug, or I want to improve the course

**Contributions are welcome 🤗**

- If you found a bug 🐛, please [open an issue and describe the problem](https://github.com/huggingface/making-games-with-ai-course/issues/new/choose).
- If you want to improve the course, you can [open a Pull Request](https://github.com/huggingface/making-games-with-ai-course/compare).

## I still have questions

Please ask your question on our Discord server, in `#ml-4-games-dev`.


<EditOnGithub source="https://github.com/huggingface/making-games-with-ai-course/blob/main/units/en/unit0/how-to-get-most.mdx" />

### Conclusion
https://huggingface.co/learn/ml-games-course/unit0/conclusion.md

# Conclusion

Congratulations on finishing this introduction unit!

You're now ready **to start your exciting journey of using machine learning in games** 🥳

<img src="https://huggingface.co/datasets/huggingface-ml-4-games-course/course-images/resolve/main/en/unit0/journey.png" alt="Game Dev Journey" width="100%"/>

In the first two Units, you will:

- Learn to run AI models locally with Unity Sentis and make your first intelligent AI NPC 🤖
- Define your Demo by writing the Game Design Document 🪄


<img src="https://huggingface.co/datasets/huggingface-ml-4-games-course/course-images/resolve/main/en/unit0/journey-next.jpg" alt="Game Dev Journey" width="100%"/>


Keep Learning, Stay Awesome 🤗


<EditOnGithub source="https://github.com/huggingface/making-games-with-ai-course/blob/main/units/en/unit0/conclusion.mdx" />

### Welcome to the 🤗 Machine Learning for Games Course [[introduction]]
https://huggingface.co/learn/ml-games-course/unit0/introduction.md

# Welcome to the 🤗 Machine Learning for Games Course [[introduction]]

<img src="https://huggingface.co/datasets/huggingface-ml-4-games-course/course-images/resolve/main/en/unit0/thumbnail.jpg" alt="Thumbnail" width="100%"/>

Welcome to the course that will teach you the most fascinating topic in game development: **how to use powerful AI tools and models to create unique game experiences**.


New AI models are revolutionizing the Game Industry in two impactful ways:

- On **how we make games**:
    - Generating textures with AI.
    - Using AI voice actors for character voices.

- On **how we create gameplay**:
    - Crafting smart Non-Playable Characters (NPCs) using large language models.


This course will **teach you**:

- How to **integrate AI models for innovative gameplay**, featuring intelligent NPCs.
- How to use AI tools to help your game development pipeline.


In this introduction unit, you'll:

- Learn more about the **course content and the syllabus**.
- Learn the **course requirements**.
- Create your Hugging Face Account (it's free)
- Sign up to our Discord server to chat with your classmates and us (the Hugging Face team).

Let's get started!

## What to expect?

In this course, you will:

- 🤖 Learn to use **powerful chat models to build intelligent NPCs**.
- 🪄 Run **these powerful AI models locally or with cloud APIs.**
- 🎨 Use AI tools **to accelerate your game development**: generate voices with text-to-speech models, create art with image generation tools, and more.
- 🎮 Work on **exciting game demos**.
- 👩‍🎨 **Build Your Own Game Demo**.

And more!

At the end of this course, you'll have a **solid foundation in using AI models in your games and AI tools in your game development**.
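To give a first taste of the kind of technique behind the smart robot NPC demo (Unit 1 uses sentence similarity: the player's sentence is embedded as a vector and compared to the embeddings of known actions), here is a minimal, framework-free sketch of the core idea. The tiny 3-dimensional vectors and action names below are made up for illustration; a real sentence-embedding model outputs vectors with hundreds of dimensions.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def pick_action(sentence_embedding, action_embeddings):
    """Pick the action whose embedding is most similar to the player's sentence."""
    return max(
        action_embeddings,
        key=lambda action: cosine_similarity(sentence_embedding, action_embeddings[action]),
    )

# Toy 3-dimensional embeddings, purely illustrative.
actions = {
    "open_door": [0.9, 0.1, 0.0],
    "attack":    [0.0, 0.9, 0.2],
    "greet":     [0.1, 0.0, 0.9],
}

print(pick_action([0.8, 0.2, 0.1], actions))  # -> open_door
```

In the actual Unit 1 demo, the embeddings come from a pretrained sentence-similarity model (run locally with Unity Sentis or remotely via an API); the comparison step is the same idea.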

Remember to [sign up for the course](http://eepurl.com/iCWDQw) (we are collecting your email to be able to send you the links when each Unit is published and give you information about the challenges and updates).

Sign up 👉 [here](http://eepurl.com/iCWDQw)


For now, all our content uses the [Unity Game Engine](https://www.unity.com). In the future, we will add content about Unreal Engine, Godot, and more.

Naturally, for the demo, you can use any Game Engine you want (or no game engine), **but we advise you to use Unity**.

## But... I don't know how to make games!

This course **is not an introduction to game development**. You should have some skills in Unity to follow this course.

If that's not the case, don't worry: it's the perfect time to start! To begin making games with Unity, check out this fantastic course: [Create with Code](https://learn.unity.com/course/create-with-code), where you'll learn to make five different games.

With this free Create with Code course, **you'll get everything you need to be able to follow the Machine Learning for Games Course**.

Create with code course 👉 https://learn.unity.com/course/create-with-code

<img src="https://huggingface.co/datasets/huggingface-ml-4-games-course/course-images/resolve/main/en/unit0/create_with_code.png" alt="Thumbnail" width="100%"/>


In addition, you can watch this excellent introduction to Unity by [GMTK (Game Maker's Toolkit)](https://www.youtube.com/@GMTK)

<iframe width="560" height="315" src="https://www.youtube.com/embed/XtQMytORBmM?si=D6Q-8kGZsJIRm7Pe" title="YouTube video player" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share" allowfullscreen></iframe>



## Are we going to learn to train/fine-tune Transformer models and other AI models?

We don't plan to have a unit on this for now. **But the course will be improved with your feedback**.

We already have a lot of free courses if you want to go deeper into:

- Learning [Natural Language Processing with Transformers](https://huggingface.co/learn/nlp-course)
- Learning [Audio Processing with Transformers (Speech-To-Text, Text-To-Speech...)](https://huggingface.co/learn/audio-course/chapter0/introduction)
- Learning to [build Generative AI Applications with Gradio](https://www.deeplearning.ai/short-courses/building-generative-ai-applications-with-gradio/)


## Your Goal during this course 🏆: Making Your Own Game Demo

> [!TIP]
> The course is now **self-paced, and no new units will be added**. We **don't provide a Certificate of Completion for this course**.
> But we continue to write tutorials on how to use AI in Games here 👉 https://thomassimonini.substack.com/

During this course, you will work on some demos we've made. But the most exciting part is that during the whole course **you will also work on your demo**. 

You'll build a game demo using:

- **AI tools to help you build the game** (for instance, texture generation, AI voice actors...),

- and/or **AI in the game itself, as part of the gameplay or for NPCs**.

## Don't forget to follow AI Playbook for Game Developers 🎮

<img src="https://substackcdn.com/image/fetch/w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F486ef2ec-4965-4b25-bf3b-65dfefec2445_1500x500.jpeg"/>

In addition to this course, I'm writing a weekly blog with tutorials and articles.

From creating smarter NPCs to unleashing the power of Generative AI, I’m here to demystify the process with easy-to-follow tutorials and demos.

👉 https://thomassimonini.substack.com/



<EditOnGithub source="https://github.com/huggingface/making-games-with-ai-course/blob/main/units/en/unit0/introduction.mdx" />

### Your Game Demo
https://huggingface.co/learn/ml-games-course/unit0/game-demo.md

# Your Game Demo

<img src="https://huggingface.co/datasets/huggingface-ml-4-games-course/course-images/resolve/main/en/unit0/game-demo.jpg" alt="Game Dev" width="100%"/>

> [!TIP]
> The course is now **self-paced, and no new units will be added**. We **don't provide a Certificate of Completion for this course**.
> But we continue to write tutorials on how to use AI in Games here 👉 https://thomassimonini.substack.com/

The main goal of this course is to **learn how to use these new powerful AI models** for game development, either during game creation or as part of the gameplay to unlock innovative experiences!

And the best way to learn is **by doing**. That's why we think **it's a good idea to follow this course with a demo idea.**

If you don't have ideas yet, **don't worry**; we wrote some "Demo Units" that will help you define your game demo and avoid common pitfalls (scope too big, etc.).

## How do I choose which Game Engine to use?

It's up to you. For now, **we advise you to use Unity** since we produced most of the content for this v1 of the course with their tools. 

We strongly advise you to choose a game engine and not develop a game from scratch with only code. The goal of building a demo in one month is already a big step, and adding complexity is not a good idea.

## But... I don't know how to make games!

Don't worry, that's the perfect time to start! If you want to start making games with Unity, check their amazing [Create with Code course](https://learn.unity.com/course/create-with-code), where you'll learn how to make five different games. This free introduction **will teach you everything you need to know** to be able to follow our ML for Games Course.

Create with code course 👉 https://learn.unity.com/course/create-with-code

In addition, you can watch this excellent introduction to Unity by [GMTK (Game Maker's Toolkit)](https://www.youtube.com/@GMTK)

<iframe width="560" height="315" src="https://www.youtube.com/embed/XtQMytORBmM?si=D6Q-8kGZsJIRm7Pe" title="YouTube video player" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share" allowfullscreen></iframe>

## What tools do I need?

<img src="https://huggingface.co/datasets/huggingface-ml-4-games-course/course-images/resolve/main/en/unit0/tools.jpg" alt="Tools needed for the course" width="100%"/>

The minimum tools you need are:

- *A computer* with an internet connection.
- A *Hugging Face Account*: (it's free) to use AI models and publish your game demos.
- A *Game Engine License*. A free version is enough in most cases. Naturally, you could create your game from scratch, but we strongly advise using a Game Engine. We generally recommend Unity, as the first iteration of this course uses Unity tools. You can also use Unreal if **you're prepared to translate the concepts to their environment**.


<EditOnGithub source="https://github.com/huggingface/making-games-with-ai-course/blob/main/units/en/unit0/game-demo.mdx" />

### The Team
https://huggingface.co/learn/ml-games-course/unit0/who-are-we.md

# The Team

<img src="https://huggingface.co/datasets/huggingface-ml-4-games-course/course-images/resolve/main/en/unit0/team.jpg" alt="Team" width="100%"/>

## The Main Author: Thomas Simonini

My name is Thomas Simonini; I'm a **Game Developer working at Hugging Face** 🤗 focused on how to **integrate AI models in games to create new experiences** 🪄 .

In addition to this course, I write the blog [AI Playbook for Game Developers 🎮](https://thomassimonini.substack.com/), where I explore the intersection of AI and game development with tutorials and demos, from smart NPCs with large language models to generative AI. I also make video tutorials on [YouTube](https://www.youtube.com/c/ThomasSimonini?sub_confirmation=1).


You can find me on:
- [X (Twitter)](https://twitter.com/ThomasSimonini)
- [LinkedIn](https://fr.linkedin.com/in/simoninithomas/en)
- [YouTube](https://www.youtube.com/c/ThomasSimonini?sub_confirmation=1)
- [My AI Playbook for Game Developers](https://thomassimonini.substack.com/)


## Contributors and reviewers

### Dylan Ebert [Contributor/Reviewer]

Dylan is a Developer Advocate Engineer at Hugging Face with a PhD in Computer Science from Brown University.

His work lies at the intersection of ML, 3D, and Game Development.

You can find Dylan on:
- [X (Twitter)](https://twitter.com/dylan_ebert_)
- [TikTok](https://www.tiktok.com/@individualkex)
- [YouTube](https://www.youtube.com/c/IndividualKex)

### Omar Sanseviero [Reviewer]

Omar is the Chief Llama Officer at Hugging Face. 🦙

He works at the intersection of Open Source, Product, Research, and technical communities, leading multidisciplinary teams and open source and science collaborations.

You can find Omar on:
- [X (Twitter)](https://twitter.com/osanseviero)
- [LinkedIn](https://www.linkedin.com/in/omarsanseviero/)


### Pedro Cuenca [Reviewer]

Pedro is a Machine Learning Engineer at Hugging Face, interested in image generation and on-device models. He also likes games :)

You can find Pedro on:
- [X (Twitter)](https://twitter.com/pcuenq)
- [LinkedIn](https://www.linkedin.com/in/pedro-cuenca-67a447/)




<EditOnGithub source="https://github.com/huggingface/making-games-with-ai-course/blob/main/units/en/unit0/who-are-we.mdx" />

### AI Voice Actors 🤖
https://huggingface.co/learn/ml-games-course/unit2/ai-voice-actors.md

# AI Voice Actors 🤖

AI voice actors allow you **to use customizable voices that perfectly match your characters or narration needs.**

These tools allow you not only to prototype with these voices but also **use them during production 🔥**

## Replica Studios 💸 🔒

[Replica Studios](https://www.replicastudios.com/) is my **favorite voice generation tool**.

<iframe width="560" height="315" src="https://www.youtube.com/embed/W_VaOiXlXqo?si=3HiZ1c1g62BhZjIi" title="YouTube video player" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share" referrerpolicy="strict-origin-when-cross-origin" allowfullscreen></iframe>

It provides a **voice library and a studio browser tool** that allows you to generate realistic voices.

<img src="https://huggingface.co/datasets/huggingface-ml-4-games-course/course-images/resolve/main/en/unit2/replica.jpg" alt="Replica Studios" />

In addition to being used to **produce offline dialog,** it can be **used live** in the game, thanks to their API.

<iframe width="560" height="315" src="https://www.youtube.com/embed/aFnWgl0jxdE?si=zUyRSjeB1zRljzTM" title="YouTube video player" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share" referrerpolicy="strict-origin-when-cross-origin" allowfullscreen></iframe>

👉 [Replica Studios Website](https://www.replicastudios.com/)

## Coqui 🆓 🤗

<img src="https://huggingface.co/datasets/huggingface-ml-4-games-course/course-images/resolve/main/en/unit2/coqui.jpg" alt="Coqui logo" />

[Coqui](https://huggingface.co/spaces/coqui/xtts) is an open-source platform for speech technology that allows you to **create realistic voices with their XTTS models.**

Unfortunately, the Coqui company shut down, **but the model is still available and usable in its Hugging Face Space** ❤️

👉 https://huggingface.co/spaces/coqui/xtts

## Parler TTS 🆓 🤗

[Parler TTS](https://huggingface.co/spaces/parler-tts/parler_tts_mini) is an open-source model that **generates high-quality speech**. 

Its features can be **controlled using a simple text prompt** (gender, background noise, speaking rate, pitch, and reverberation).

You can try it here 👉 https://huggingface.co/spaces/parler-tts/parler_tts_mini

<EditOnGithub source="https://github.com/huggingface/making-games-with-ai-course/blob/main/units/en/unit2/ai-voice-actors.mdx" />

### Sound Effects Generation 🔊
https://huggingface.co/learn/ml-games-course/unit2/sound-generation.md

# Sound Effects Generation 🔊

One of the most underrated uses of AI in game production is **sound generation.**

If you’ve made a game, you know **how critical it is to have good immersive sound effects and how hard it is to find or produce them.**

Fortunately, **two tools are coming to help us with that, but they are not yet available.**

## Unity Muse Sound 🔒 💸 🛑

Unity Muse Sound will allow you to generate sound with a text prompt. It was announced during the GDC in March.

We only got a **preview of what it will look like during GDC, but it’s promising 🔥**

## Eleven Labs Sound Effects 🔒 💸 🛑

Eleven Labs Sound Effects **will allow you to generate high-quality sounds with text prompts.**

For now, the product has a waiting-list signup 👉 https://elevenlabs.io/ai-sound-effects-for-sora

## AudioLDM2 models 🆓 🤗

AudioLDM2 models can generate high-quality sound effects. **Unfortunately, their license is CC-BY-NC-SA-4.0**, which means you **can't use them in commercial projects**.

You can try it here 👉 https://huggingface.co/spaces/Fabrice-TIERCELIN/Text-to-Audio



<EditOnGithub source="https://github.com/huggingface/making-games-with-ai-course/blob/main/units/en/unit2/sound-generation.mdx" />

### Conclusion
https://huggingface.co/learn/ml-games-course/unit2/conclusion.md

# Conclusion

Congratulations on finishing this Unit! We hope you enjoyed trying these tools and saw how powerful they are.

In the upcoming units **we're going to use some of them for two game demos 🔥**.

<iframe width="560" height="315" src="https://www.youtube.com/embed/MJyUCiZB0uw?si=1eXnLU9hbMaNi4wm" title="YouTube video player" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share" referrerpolicy="strict-origin-when-cross-origin" allowfullscreen></iframe>

Don't hesitate to also check the [Awesome AI Tools for Game Developers list](https://github.com/simoninithomas/awesome-ai-tools-for-game-dev) **for more tools**.

If you know of a tool we should add, you can contribute to the list directly.

In addition to this Unit, which presents the best tools, we wrote a blog post that dives deep into:

- Why integrating AI is essential: Understanding how it can amplify your creativity.
- Cutting through the buzz: Identifying where AI is a game-changer in your workflow and recognizing its limitations.
- The mindset for successful integration: Adopting a growth mindset to leverage AI as a tool rather than seeing it as a threat.

You can read it here 👉 https://substack.com/home/post/p-151913664

<img src="https://substackcdn.com/image/fetch/w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa69b6c0f-f0b1-48f8-8a1c-1d96647cd71b_1920x1080.jpeg"/>

Keep Learning, Stay Awesome 🤗


<EditOnGithub source="https://github.com/huggingface/making-games-with-ai-course/blob/main/units/en/unit2/conclusion.mdx" />

### Introduction
https://huggingface.co/learn/ml-games-course/unit2/introduction.md

# Introduction

<img src="https://huggingface.co/datasets/huggingface-ml-4-games-course/course-images/resolve/main/en/unit2/thumbnail.jpg" alt="Thumbnail" />

Welcome to this exciting new Unit all about AI tools for Game Development 🧰!

Up until now, when we studied ML for Games in this course, we learned **how to integrate AI models inside our games to unlock new and exciting gaming experiences.**

However, another big part of ML for Games **is using powerful AI tools during the pre-production and production phases.**

These tools have two advantages:

- **Speed up your game development workflow** by automating some creation tasks.
- **Expand your game development capabilities** by allowing you to produce content you couldn't create before with a small team (2D assets, music, animation, voice actors...).

In this Unit, **we cherry-picked two tools per topic** that we love and that you should try in your game development.

**We provide an icon system** to indicate whether each tool is open-source or not and whether it is free or not.

🤗: Open-source

🔒: Closed-source (proprietary)

💸: Paid tool (often with a free tier)

🆓: Free tool

So let's get started!

(The illustration of this Unit thumbnail was made with [Scenario](https://www.scenario.com) )

In addition to this Unit, which provides you with the best tools, we wrote a blog post that dives deeper into:

- Why integrating AI is essential: Understanding how it can amplify your creativity.
- Cutting through the buzz: Identifying where AI is a game-changer in your workflow and recognizing its limitations.
- The mindset for successful integration: Adopting a growth mindset to leverage AI as a tool rather than seeing it as a threat.

You can read it here 👉 https://substack.com/home/post/p-151913664


<EditOnGithub source="https://github.com/huggingface/making-games-with-ai-course/blob/main/units/en/unit2/introduction.mdx" />

### Texture Generation 🖼️
https://huggingface.co/learn/ml-games-course/unit2/texture-generation.md

# Texture Generation 🖼️

Texture Generation tools allow you to **generate stylized textures or PBR textures for your game assets**.

## Unity Muse Textures Generator 🔒💸

<img src="https://huggingface.co/datasets/huggingface-ml-4-games-course/course-images/resolve/main/en/unit2/muse_texture.jpg" alt="Muse Textures" />

👉 [Muse Documentation](https://docs.unity3d.com/Packages/com.unity.muse.texture@4.0/manual/index.html)

👉 [Generate Textures with Muse tutorial](https://learn.unity.com/tutorial/generate-textures-with-muse)

## DreamTextures 🆓 🤗

With [Dream Textures](https://github.com/carson-katri/dream-textures), you can create different types of textures with a simple text prompt.

<img src="https://huggingface.co/datasets/huggingface-ml-4-games-course/course-images/resolve/main/en/unit2/dream_textures.jpg" alt="Dream Textures" />

👉 https://github.com/carson-katri/dream-textures

<EditOnGithub source="https://github.com/huggingface/making-games-with-ai-course/blob/main/units/en/unit2/texture-generation.mdx" />

### Animation Generation 💃
https://huggingface.co/learn/ml-games-course/unit2/animation-generation.md

# Animation Generation 💃

Creating realistic animations is one of the **most challenging tasks in game development.**

Usually, animation is **either made by hand** (which is labor-intensive) or captured with **expensive motion capture technology.** Fortunately, we are starting to see AI tools **that generate animation from a text prompt, and others that extract animations from a video.**

## Generate Animation with Unity Muse Animate 🔒💸

With Muse Animate, you can **generate animations from a text prompt directly in your Unity project 🤯.**

We're going to use it in an upcoming unit 🔥.

👉 https://muse.unity.com/en-us/explore

## Do motion capture without expensive tools with [Plask.AI](http://plask.ai/) 🔒💸

With [Plask.AI](http://plask.ai/), you **simply capture a video with your phone, import it into the website, and extract and edit the animations.**

👉 https://plask.ai/

<iframe width="560" height="315" src="https://www.youtube.com/embed/OKHfD7WEzHs?si=po8dHwyuyX_N_bMR" title="YouTube video player" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share" referrerpolicy="strict-origin-when-cross-origin" allowfullscreen></iframe>



<EditOnGithub source="https://github.com/huggingface/making-games-with-ai-course/blob/main/units/en/unit2/animation-generation.mdx" />

### Code Assistants 👩‍💻
https://huggingface.co/learn/ml-games-course/unit2/code-assistants.md

# Code Assistants 👩‍💻

Code Assistants are helpful when you **need help coding a function or when you find a bug in your code and want to know why**.

## Unity Muse Chat 💸 🔒

<img src="https://huggingface.co/datasets/huggingface-ml-4-games-course/course-images/resolve/main/en/unit2/muse.jpg" alt="Muse chat" />

Muse Chat is a **conversational AI in the Unity Editor** that can answer your Unity questions and help you with code and debugging.

What's impressive about this tool is that **it's context-aware of your project.** It has information about your project, its game objects, etc., so you don't need **to explain them in the prompt.**

It's a handy **tool to accelerate your game development workflow**.

For instance, I'm using it to help me build the next unit demo: it's an **action-adventure game about aliens 👽 invading a space station. Your goal is to outsmart them and flee.**

<iframe width="560" height="315" src="https://www.youtube.com/embed/MJyUCiZB0uw?si=1eXnLU9hbMaNi4wm" title="YouTube video player" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share" referrerpolicy="strict-origin-when-cross-origin" allowfullscreen></iframe>

You can win the game by leaving the station, but to do that, you need to find your equipment, escape hatch password, etc., without getting noticed.

Sounds exciting? **It's a demo you'll be able to play and modify during this course 🔥**

For this game, I need the red light to flicker **to convey the sense of alertness in the space station.**

So, I clicked on my point light and **asked Muse how to create this.**

<img src="https://huggingface.co/datasets/huggingface-ml-4-games-course/course-images/resolve/main/en/unit2/musechat_example_1.png" alt="Muse chat Example 1" />

What's impressive about Muse is that, in addition to providing the code, it gave me a **clear explanation of why the code works this way, along with its sources.**

<img src="https://huggingface.co/datasets/huggingface-ml-4-games-course/course-images/resolve/main/en/unit2/musechat_example_2.png" alt="Muse chat Example 2" />
<img src="https://huggingface.co/datasets/huggingface-ml-4-games-course/course-images/resolve/main/en/unit2/musechat_example_3.png" alt="Muse chat Example 3" />
<img src="https://huggingface.co/datasets/huggingface-ml-4-games-course/course-images/resolve/main/en/unit2/musechat_example_4.png" alt="Muse chat Example 4" />

And this is the result:

<img src="https://huggingface.co/datasets/huggingface-ml-4-games-course/course-images/resolve/main/en/unit2/light.gif" alt="Muse chat Light example" />
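Muse's generated script isn't reproduced in this course, but the idea behind a flickering light is easy to sketch: every frame, modulate the light's intensity with a fast sine wave plus a little random noise. Here is a minimal, engine-agnostic sketch in Python (the function and parameter names are illustrative, not Muse's output):

```python
import math
import random

def flicker_intensity(t, base=1.0, amplitude=0.4, speed=8.0, noise=0.2):
    """Light intensity at time t: a fast sine wave plus random jitter around a base level."""
    wave = math.sin(t * speed) * amplitude
    jitter = random.uniform(-noise, noise)
    # Clamp so the intensity never goes negative
    return max(0.0, base + wave + jitter)

# Sample the intensity over one second at 10 Hz
samples = [flicker_intensity(i / 10) for i in range(10)]
```

In Unity, the same computation would run each frame in a script's `Update()` and be assigned to the light's `Light.intensity` property.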


If you want to try it:

👉 [Muse Website (15 days free)](https://unity.com/products/muse)

👉 [They wrote a tutorial on the best practices](https://learn.unity.com/tutorial/get-started-with-muse-chat#)

## GitHub Copilot 💸 🔒

<img src="https://huggingface.co/datasets/huggingface-ml-4-games-course/course-images/resolve/main/en/unit2/github_copilot.jpg" alt="GitHub Copilot" />

If you're not using Unity but another game engine, you should use GitHub Copilot. **It's one of my favorite tools when I code in Python.**

**Copilot is not a free tool**, but it's **free for verified students, teachers, and maintainers of popular open-source projects**. So, if you've contributed to popular open-source projects, **you may be eligible to use it for free.**

👉 [Copilot Website](https://github.com/features/copilot)

<EditOnGithub source="https://github.com/huggingface/making-games-with-ai-course/blob/main/units/en/unit2/code-assistants.mdx" />

### Music Generation 🎵
https://huggingface.co/learn/ml-games-course/unit2/music-generation.md

# Music Generation 🎵

Music Generation offers game developers **the ability to create music that perfectly matches the intended mood or atmosphere** of the scene.

## Udio 🔒💸

[Udio](https://www.udio.com/) is a **generative AI model that produces music based on prompts**.

What’s interesting is that you **can generate instrumental music or music with lyrics.**

For now, you can make up to 600 music generations per day!

Udio can generate excellent music with lyrics; you can check their examples on the website.

But for the game demo I’m working on for the future unit, where you’re stuck in a ship with aliens, I **needed something abstract and uneasy**, so I used this prompt: *a piece of instrumental music on space void and threatening aliens*

And I got this 👉 https://www.udio.com/songs/pYXsK6GsMiwyMd8B3ZEU7R, which is perfect for my game menu.

The model generates 33 seconds of music, but you can then **ask the model to extend it.**

If you want to get started with Udio, check this tutorial:

<iframe width="560" height="315" src="https://www.youtube.com/embed/iRn4LcJ6CR8?si=tZ7BEM48UU8nddql" title="YouTube video player" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share" referrerpolicy="strict-origin-when-cross-origin" allowfullscreen></iframe>

## **MusicGen** 🆓 🤗

This model, made by Meta AI, allows you to **generate 15 seconds of music** from a text prompt and an optional sound file that helps control the style of the music.

I used the same prompt as above and I got this lovely result:

<iframe width="560" height="315" src="https://www.youtube.com/embed/_2dfQlvTapQ?si=rYc6S0RyrROQmqDE" title="YouTube video player" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share" referrerpolicy="strict-origin-when-cross-origin" allowfullscreen></iframe>


You can try it on Hugging Face Spaces 👉 https://huggingface.co/spaces/facebook/MusicGen

<EditOnGithub source="https://github.com/huggingface/making-games-with-ai-course/blob/main/units/en/unit2/music-generation.mdx" />

### 2D Assets Generation 🎮
https://huggingface.co/learn/ml-games-course/unit2/2d-generation.md

# 2D Assets Generation 🎮

One **of the most critical elements in developing 2D games is the ability to produce high-quality 2D assets with a beautiful aesthetic.**

For a long time, AI was **unusable because of its lack of consistency** (you could not, for instance, generate exactly the same character across multiple generations). Fortunately, with LoRA and other innovations, this problem is a thing of the past.

In this section, I'll present my favorite tool, [Scenario](https://www.scenario.com/) ❤️. It's an impressive tool **we will use in a future Unit of the course.**


## Scenario 🤗🔒💸

<img src="https://huggingface.co/datasets/huggingface-ml-4-games-course/course-images/resolve/main/en/unit2/scenario.jpg" alt="Scenario"/>

[Scenario](https://www.scenario.com/) is a platform that allows you **to generate 2D assets using text-to-image models and fine-tune these models with specific styles.**

The foundation models are Stable Diffusion (SD 1.5 and SDXL), two **open-source models.**

The power of Scenario comes from its platform and its powerful tools.

First, you can **easily fine-tune your own LoRA** using generated images or original ones.

Or you can **create a composition using Scenario’s library of Platform Models** and even blend them with your own.

<img src="https://huggingface.co/datasets/huggingface-ml-4-games-course/course-images/resolve/main/en/unit2/ScenarioLoRAComposition_new.png" alt="Scenario Scene Example" />

You can do the same for characters.

<img src="https://huggingface.co/datasets/huggingface-ml-4-games-course/course-images/resolve/main/en/unit2/ScenarioCharacterLoRAComposition_new.png" alt="Scenario Character Example" />

With the upscaler tool, you can increase **the quality of the output up to 10K resolution** and **correct some artifacts introduced by the generation.**

<img src="https://huggingface.co/datasets/huggingface-ml-4-games-course/course-images/resolve/main/en/unit2/upscale.gif" alt="Upscaler" />

Here you can see some examples of content you can make with it:

- **A platformer**

<img src="https://huggingface.co/datasets/huggingface-ml-4-games-course/course-images/resolve/main/en/unit2/ScenarioPlatformerImage_new.png" alt="Scenario Example" />

To see an example in video 👇

<iframe width="560" height="315" src="https://www.youtube.com/embed/txrCuIQlb80?si=IVxf5VKgcLmQCIgz&amp;start=3" title="YouTube video player" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share" referrerpolicy="strict-origin-when-cross-origin" allowfullscreen></iframe>

If you want to make your own platformer, check this tutorial 👉 [Designing a Unity Platformer Level: Scenario Development](https://help.scenario.com/designing-a-unity-platformer-level-scenario-development)

- **Buildings for 2D RTS Games**

<img src="https://huggingface.co/datasets/huggingface-ml-4-games-course/course-images/resolve/main/en/unit2/rts_example1.gif" alt="Scenario RTS example" />

See in video **how easy it is to reskin your isometric building assets** 👇

<iframe width="560" height="315" src="https://www.youtube.com/embed/45dh2pMllRw?si=tKd-Qn5wCt7X_ITc" title="YouTube video player" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share" referrerpolicy="strict-origin-when-cross-origin" allowfullscreen></iframe>

<img src="https://huggingface.co/datasets/huggingface-ml-4-games-course/course-images/resolve/main/en/unit2/rts.png" alt="Scenario RTS example" />

Tutorial: Build an Isometric City 👉 https://www.scenario.com/post/isometric-city-builder

There are many more tools and features to cover, **but the best way to discover them is to try them yourself** 👉 https://www.scenario.com/

Scenario Documentation 👉 https://help.scenario.com/

Scenario Twitter Profile (for the latest updates) 👉 https://twitter.com/Scenario_gg


<EditOnGithub source="https://github.com/huggingface/making-games-with-ai-course/blob/main/units/en/unit2/2d-generation.mdx" />

### Classical AI in Unity
https://huggingface.co/learn/ml-games-course/unitbonus1/ai-in-unity.md

# Classical AI in Unity

Now that we've studied the theory behind classical AI in games, the best way to experiment **is to make your own NPCs**.

The best course to help you get started with AI in Unity is [Penny De Byl's](https://www.youtube.com/@AliveTeam) [Artificial Intelligence for Beginners](https://learn.unity.com/course/artificial-intelligence-for-beginners), where you'll learn to code most of the algorithms we talked about.

- [Penny De Byl's Artificial Intelligence for Beginners Course](https://learn.unity.com/course/artificial-intelligence-for-beginners)
- [Navigation Meshes with Unity](https://www.youtube.com/watch?v=CHV1ymlw-P8&ab_channel=Brackeys)
- [Behavior Designer (paid tool)](https://www.youtube.com/watch?v=T_of4_jRoJA&t=131s&pp=ygUTYmVoYXZpb3IgdHJlZSB1bml0eQ%3D%3D)


<EditOnGithub source="https://github.com/huggingface/making-games-with-ai-course/blob/main/units/en/unitbonus1/ai-in-unity.mdx" />

### AI Studies by GMTK (Game Maker Toolkit)
https://huggingface.co/learn/ml-games-course/unitbonus1/ai-gmtk.md

# AI Studies by GMTK (Game Maker Toolkit)

<a href="https://www.youtube.com/@GMTK">Game Maker's Toolkit </a> is a deep dive into game design, level design, and game production hosted by Mark Brown.

It's one of the best channels on Game Design and Level Design. If you're interested in Game Design studies, check out his <a href="https://www.youtube.com/@GMTK">YouTube Channel</a>.


## What makes a good AI?

<iframe width="560" height="315" src="https://www.youtube.com/embed/9bbhJi0NBkk?si=TpvumBQYH45f_2c1" title="YouTube video player" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share" allowfullscreen></iframe>


## The AI Behind The Sims

<iframe width="560" height="315" src="https://www.youtube.com/embed/9gf2MT-IOsg?si=OyVzpJRgl9tcmZ32" title="YouTube video player" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share" allowfullscreen></iframe>


<EditOnGithub source="https://github.com/huggingface/making-games-with-ai-course/blob/main/units/en/unitbonus1/ai-gmtk.mdx" />

### Conclusion
https://huggingface.co/learn/ml-games-course/unitbonus1/conclusion.md

# Conclusion

Congratulations on finishing this bonus unit! We hope you liked it and tried some of the algorithms!

We would like to thank Dr. Tommy Thompson again for sharing his videos and writing an introduction for each of them for the course. Additionally, we would like to thank Mark Brown for sharing his videos with us.

Keep Learning, Stay Awesome 🤗

<EditOnGithub source="https://github.com/huggingface/making-games-with-ai-course/blob/main/units/en/unitbonus1/conclusion.mdx" />

### Classical AI in Games [[classical-ai-in-games]]
https://huggingface.co/learn/ml-games-course/unitbonus1/introduction.md

# Classical AI in Games [[classical-ai-in-games]]

<img src="https://huggingface.co/datasets/huggingface-ml-4-games-course/course-images/resolve/main/en/bonus-unit1/thumbnail.png" alt="Classical AI in Games thumbnail" width="100%"/>

The ultimate goal of any video game is to be an **immersive experience**.

To achieve this, NPCs (Non-Playable Characters) must **look smart**: they should smoothly **navigate the map** and **make logical decisions**. This is crucial to contribute **to the illusion of a coherent and believable world**.

To accomplish this goal, the gaming industry developed multiple algorithms such as Navigation Meshes, Behavior Trees, Goal Oriented Action Planning, and more.

Learning these tools is critical if you want to work on AI in games.

Therefore, in this bonus unit, we'll study classical AI in games. You'll learn about Navigation Meshes, Behavior Trees, Goal Oriented Action Planning, Finite State Machines, and more.

In the first section, **you'll learn the theory behind these algorithms, how they are used in games, and then, in the following sections, you'll learn to implement these with Unreal or Unity**.

So let's get started 🚀.


<EditOnGithub source="https://github.com/huggingface/making-games-with-ai-course/blob/main/units/en/unitbonus1/introduction.mdx" />

### Additional readings about the history of AI in Video Games
https://huggingface.co/learn/ml-games-course/unitbonus1/additional-readings.md

# Additional readings about the history of AI in Video Games

If you are interested in how AI in video games evolved from the 1950s to today, for a historical perspective,
we recommend the `AI in video games: a historical evolution, from Search Trees to LLMs` series by [Juan Martinez](https://www.linkedin.com/in/jjmcarrascosa/):

- [Chapter 1: 1950–1980.](https://medium.com/@jjmcarrascosa/ai-in-video-games-a-historical-evolution-from-search-trees-to-llms-chapter-1-1950-1980-f3b04d6e9dc8)
- [Chapter 2: 1980-2000](https://medium.com/@jjmcarrascosa/ai-in-video-games-a-historical-evolution-from-search-trees-to-llms-chapter-2-1980-2000-341bc31860d9)
- [Chapter 3: 2000+](https://medium.com/@jjmcarrascosa/ai-in-video-games-a-historical-evolution-from-search-trees-to-llms-chapter-3-2000-2023-ae286c975387)



<EditOnGithub source="https://github.com/huggingface/making-games-with-ai-course/blob/main/units/en/unitbonus1/additional-readings.mdx" />

### Classical AI in Unreal Engine
https://huggingface.co/learn/ml-games-course/unitbonus1/ai-in-unreal.md

# Classical AI in Unreal Engine

Unreal Engine provides some **powerful tools and documentation to create AI in Games**.

The best content you can get on Unreal's AI is clearly their documentation:
- [Navigation Meshes](https://docs.unrealengine.com/5.3/en-US/navigation-system-in-unreal-engine/)
- [Behavior Trees](https://docs.unrealengine.com/5.3/en-US/behavior-trees-in-unreal-engine/)
- [State Trees](https://docs.unrealengine.com/5.3/en-US/state-tree-in-unreal-engine)
- [Environment Query System](https://docs.unrealengine.com/5.3/en-US/environment-query-system-in-unreal-engine/)
- [AI Perception](https://docs.unrealengine.com/5.3/en-US/ai-perception-in-unreal-engine)
- [MassEntity](https://docs.unrealengine.com/5.3/en-US/mass-entity-in-unreal-engine)
- [Smart Objects](https://docs.unrealengine.com/5.3/en-US/smart-objects-in-unreal-engine)

<EditOnGithub source="https://github.com/huggingface/making-games-with-ai-course/blob/main/units/en/unitbonus1/ai-in-unreal.mdx" />

### AI 101 by AI and Games (Dr. Tommy Thompson)
https://huggingface.co/learn/ml-games-course/unitbonus1/ai-and-games.md

# AI 101 by AI and Games (Dr. Tommy Thompson)

To better understand the theory behind classical AI in Games, the best resource to watch is the excellent **series of videos on AI in Games by Dr. Tommy Thompson, called AI 101**.

[Dr Tommy Thompson](https://twitter.com/AIandGames) works as an AI developer, researcher and consultant in the video game industry. In his [YouTube channel, AI and Games](https://www.youtube.com/@AIandGames), he studies what AI algorithms are used in games and how.

We would like to thank Dr. Tommy Thompson for providing detailed introductions for each video in this section.


## Chapter 1: How AI is Actually Used in the Video Games Industry (2024) 🤖

When we talk about 'AI in video games' nowadays, **what do we really mean?** Game AI? Deep Learning? Generative AI?  All of these are valid, **but understanding how and where they're adopted is super important**. In this episode we provide a high-level overview of how artificial intelligence is adopted across the industry.

<iframe width="560" height="315" src="https://www.youtube.com/embed/j3LW5no-5Ao?si=D34_eCOL8QlW-TTB" title="YouTube video player" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share" allowfullscreen></iframe>

## Chapter 2: Why is It Difficult to Make Good AI for Games? 🤔

If you've built AI for your games (or simply played a lot of them), **you'll notice how difficult it is to make AI really shine in your game**. In this episode we dig into some of the fundamental challenges faced by developers.

<iframe width="560" height="315" src="https://www.youtube.com/embed/qCkqpRnk1oU?si=HVC3Got61oztHhqJ" title="YouTube video player" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share" allowfullscreen></iframe>


## Chapter 3: Navigation Meshes 🗺️

As games moved into 3D, one of the biggest challenges was how to ensure a non-player character (NPC) could successfully move from one place to another.  The solution is what is typically referred to as a 'navigation mesh': a data structure that **tessellates a surface with a collection of polygons indicating navigable space**. This concept, popularised by *Quake III Arena* in 1999, has since become a **standard tool in all 3D commercial game engines**. Despite this, **successful navigation is still a challenge for modern games** to this day, given the myriad of design challenges it presents.

<iframe width="560" height="315" src="https://www.youtube.com/embed/U5MTIh_KyBc?si=S26zVmHvbLwC4Lty" title="YouTube video player" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share" allowfullscreen></iframe>

📚 Additional Readings:

- Theory and Overview of Navigation Meshes
	- https://en.wikipedia.org/wiki/Navigation_mesh
	- https://www.gamedev.net/articles/programming/artificial-intelligence/navigation-meshes-and-pathfinding-r4880/

- Nav Meshes in Unity
	- https://www.youtube.com/watch?v=CHV1ymlw-P8&ab_channel=Brackeys
  
- Nav Meshes in Unreal Engine 5
	- https://docs.unrealengine.com/5.3/en-US/basic-navigation-in-unreal-engine/
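Once the surface is tessellated, finding a route reduces to a graph search over which polygons share an edge. Here is a minimal sketch in Python with a hypothetical five-polygon mesh; real engines use A* with distance heuristics rather than plain breadth-first search:

```python
from collections import deque

# Hypothetical nav mesh: polygon IDs and which polygons share an edge
ADJACENT = {
    "A": ["B"],
    "B": ["A", "C", "D"],
    "C": ["B"],
    "D": ["B", "E"],
    "E": ["D"],
}

def find_path(start, goal):
    """Breadth-first search returning the sequence of polygons from start to goal."""
    queue = deque([[start]])
    visited = {start}
    while queue:
        path = queue.popleft()
        if path[-1] == goal:
            return path
        for nxt in ADJACENT[path[-1]]:
            if nxt not in visited:
                visited.add(nxt)
                queue.append(path + [nxt])
    return None  # no route exists

find_path("A", "E")  # → ["A", "B", "D", "E"]
```

The NPC then steers through this polygon corridor, typically smoothed with a funnel algorithm to avoid zig-zagging.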


## Chapter 4: Behavior Trees 🌳

While there are a myriad of approaches for controlling NPCs in games - many of which earn their own episodes in this series - **the standard is Behaviour Trees**. These relatively straightforward data structures were originally popularised by the *Halo* franchise and have since become the norm for many large-scale big budget productions **and are the default AI tool in Unreal Engine**.

<iframe width="560" height="315" src="https://www.youtube.com/embed/6VBCXvfNlCM?si=Vc35IYS10A2vF4eX" title="YouTube video player" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share" allowfullscreen></iframe>
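To make the idea concrete, here is a toy behaviour tree sketch in Python, assuming only two node types (a selector that tries children until one succeeds, and a sequence that runs children until one fails); production trees also track a "running" status for actions that span multiple frames:

```python
# Composite nodes are functions that tick their children in order
def selector(*children):
    def tick(state):
        # Succeed as soon as any child succeeds
        return any(child(state) for child in children)
    return tick

def sequence(*children):
    def tick(state):
        # Fail as soon as any child fails
        return all(child(state) for child in children)
    return tick

# Leaf nodes: conditions and actions returning True (success) or False (failure)
def can_see_player(state):
    return state["player_visible"]

def attack(state):
    state["action"] = "attack"
    return True

def patrol(state):
    state["action"] = "patrol"
    return True

# Attack the player if visible, otherwise fall back to patrolling
root = selector(sequence(can_see_player, attack), patrol)

state = {"player_visible": False}
root(state)  # state["action"] is now "patrol"
```

Swapping `player_visible` to `True` makes the first branch succeed, so the NPC attacks instead.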


📚 Additional Readings:

- Behavior Trees in Unreal: https://docs.unrealengine.com/5.3/en-US/behavior-trees-in-unreal-engine/


## Chapter 5: Finite State Machines 🤖

*Finite State Machines* (FSMs) have existed since the very beginning not just of AI for video games, but the field of artificial intelligence as a whole. 

A simple system of connected 'states' that allows designers **to encapsulate how a character should operate in specific contexts**. Many of your favourite games from the 1980s and 1990s adopted FSMs, from the arcade days of *Pac-Man* to the birth of first-person shooters with *DOOM*.

While they're still used today in popular franchises like *Marvel's Spider-Man* and *The Last of Us*, this is largely thanks to the implementation in Valve's *Half-Life* which **set a new standard for their adoption in games**.    

<iframe width="560" height="315" src="https://www.youtube.com/embed/JyF0oyarz4U?si=G-uM3ImtlK3Fr0Oe" title="YouTube video player" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share" allowfullscreen></iframe>
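The core idea is small enough to sketch in a few lines of Python; the guard states and events below are invented for illustration, not taken from any particular game:

```python
# (current state, event) -> next state
TRANSITIONS = {
    ("patrol", "player_spotted"): "chase",
    ("chase", "player_lost"): "search",
    ("chase", "player_in_range"): "attack",
    ("search", "player_spotted"): "chase",
    ("search", "timeout"): "patrol",
    ("attack", "player_lost"): "search",
}

def step(state, event):
    """Return the next state, or stay put if no transition matches this event."""
    return TRANSITIONS.get((state, event), state)

state = "patrol"
for event in ["player_spotted", "player_in_range", "player_lost", "timeout"]:
    state = step(state, event)
# state is back to "patrol"
```

Each state would also carry its own per-frame behaviour (patrol a route, run at the player, and so on); the table only captures when the character switches context.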


📚 Additional Readings:

- Finite-State Machines: Theory and Implementation: https://code.tutsplus.com/finite-state-machines-theory-and-implementation--gamedev-11867t


## Chapter 6: Goal Oriented Action Planning 📝

*Goal Oriented Action Planning* was one of the **most important innovations in AI for games in the mid-2000s** and is arguably why Monolith's first-person shooter *F.E.A.R.* is still considered to have the best AI in video games. But it's really quite simple: **the combination of finite state machines with AI planning technology that dates back as far as the 1970s.**

<iframe width="560" height="315" src="https://www.youtube.com/embed/PaOLBOuyswI?si=ELDIk11nEy8MpqTe&amp;start=1" title="YouTube video player" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share" allowfullscreen></iframe>
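As a rough illustration of the planning side, here is a toy GOAP sketch in Python: actions declare preconditions and effects over a set of world facts, and a breadth-first search finds the shortest action sequence that satisfies the goal. The domain (facts and action names) is invented for the example; real implementations use regressive or heuristic search:

```python
from collections import deque

# Each action has preconditions, facts it adds, and facts it removes
ACTIONS = {
    "pick_up_gun": {"pre": {"gun_nearby"}, "add": {"armed"}, "del": set()},
    "load_gun": {"pre": {"armed"}, "add": {"loaded"}, "del": set()},
    "shoot": {"pre": {"armed", "loaded"}, "add": {"target_down"}, "del": {"loaded"}},
}

def plan(start, goal):
    """Breadth-first search from the start state to any state satisfying the goal."""
    queue = deque([(frozenset(start), [])])
    seen = {frozenset(start)}
    while queue:
        state, steps = queue.popleft()
        if goal <= state:  # all goal facts are true
            return steps
        for name, a in ACTIONS.items():
            if a["pre"] <= state:  # action is applicable
                nxt = frozenset((state - a["del"]) | a["add"])
                if nxt not in seen:
                    seen.add(nxt)
                    queue.append((nxt, steps + [name]))
    return None  # goal unreachable

plan({"gun_nearby"}, {"target_down"})  # → ['pick_up_gun', 'load_gun', 'shoot']
```

The appeal is that designers only author actions and goals; the sequencing emerges from search, so NPCs can improvise when the world changes.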

📚 Additional Readings:

- Holistic 3D's video on GOAP: https://www.youtube.com/watch?v=jUSrVF8mve4&ab_channel=Holistic3D


## Chapter 7: Director AI for Balancing In-Game Experiences 📣

So far the topics have focused on AI that is built **to control non-player characters**, but what if we want to use AI not just for individual behaviours **but also for designing the player experience**?

This has led to a collection of approaches collectively known as *Director AI*, with the term adopted for this design philosophy derived from its use in Valve's *Left 4 Dead*.

<iframe width="560" height="315" src="https://www.youtube.com/embed/Mnt5zxb8W0Y?si=Yqa7tnFx5d75LCVg" title="YouTube video player" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share" allowfullscreen></iframe>


## Chapter 8: Utility AI Helps NPCs Decide What To Do Next 📈

Sometimes an AI system could have a collection of valid options to select from, **but it can't tell which is the best for this context**.  

While we could now train a model using deep learning to determine contextual validity, **we have long since used utility AI as a means to embed designer knowledge to determine the validity of an action or goal**.

<iframe width="560" height="315" src="https://www.youtube.com/embed/p3Jbp2cZg3Q?si=5kqN3FqnFW277BTc" title="YouTube video player" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share" allowfullscreen></iframe>
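A minimal sketch of the idea in Python: each action is scored by a hand-tuned function of the current context, and the agent simply picks the highest scorer. The scoring curves below are invented for illustration; shipped systems use richer response curves and weighting:

```python
# Context values are normalized to the 0..1 range
def score_attack(ctx):
    return ctx["ammo"] * (1.0 - ctx["distance_to_player"])

def score_reload(ctx):
    return 1.0 - ctx["ammo"]

def score_flee(ctx):
    return 1.0 - ctx["health"]

SCORERS = {"attack": score_attack, "reload": score_reload, "flee": score_flee}

def choose_action(ctx):
    """Pick the action whose utility score is highest for this context."""
    return max(SCORERS, key=lambda name: SCORERS[name](ctx))

# Low ammo makes reloading the most attractive option
choose_action({"ammo": 0.1, "distance_to_player": 0.5, "health": 0.9})  # → "reload"
```

Because the designer's knowledge lives in the scoring functions, tuning behaviour means reshaping curves rather than rewiring transitions.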


📚 Additional Readings:


- [How the AI of The Sims Works](https://youtu.be/9gf2MT-IOsg?si=omZdMpayS5-qQJkE)
- ["An Introduction to Utility AI Theory" - Game AI Pro v2](https://www.gameaipro.com/GameAIPro/GameAIPro_Chapter09_An_Introduction_to_Utility_Theory.pdf)
- ["Behavior Decision System: Dragon Age Inquisition’s Utility Scoring Architecture" - Game AI Pro v3](https://www.gameaipro.com/GameAIPro3/GameAIPro3_Chapter31_Behavior_Decision_System_Dragon_Age_Inquisition%E2%80%99s_Utility_Scoring_Architecture.pdf)


## Chapter 9: How Barks Make Videogame NPCs Look Smarter 🗣️

While building AI for games is difficult, communicating the inner workings of these systems to players is just as demanding.  One of the simplest solutions to this problem is the use of *barks*: **pre-built text and/or audio that helps communicate the state of an AI agent in ways that make sense in the context of play**.

That might sound pretty simple, **but you'd be surprised how effective it is in practice**.

<iframe width="560" height="315" src="https://www.youtube.com/embed/u9VkW18IMzc?si=-5yovJKq2OCeOkf-&amp;start=2" title="YouTube video player" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share" allowfullscreen></iframe>
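Mechanically, a bark system can be as simple as a lookup from the agent's current AI state to a pool of pre-written lines; the states and lines below are invented for illustration (shipped games layer cooldowns and priority rules on top):

```python
import random

# Pre-written lines keyed by the agent's internal AI state
BARKS = {
    "search": ["Where did he go?", "He's around here somewhere..."],
    "chase": ["There he is!", "After him!"],
    "reload": ["Covering fire, I'm reloading!"],
}

def bark_for(state):
    """Pick a random bark for the agent's current state, or None if it has no lines."""
    lines = BARKS.get(state)
    return random.choice(lines) if lines else None

bark_for("chase")  # one of the "chase" lines
```

The player never sees the state machine, but hearing "Where did he go?" tells them the guard has lost track of them, which is exactly the communication problem barks solve.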

📚 Additional Readings:

- [Interview with Supergiant on the dialogue of Hades](https://www.youtube.com/watch?v=bwdYL0KFA_U&t=0s)
- [Elan Ruskin on Fuzzy Pattern Matching for Dialogue](https://www.youtube.com/watch?v=tAbBID3N64A&t=0s)
- [Darren Korb and Greg Kasavin on the dialogue systems of Hades](https://www.youtube.com/watch?v=m5KJSAj4afg&t=0s)
- [Jason Gregory on context-aware dialogue in The Last of Us](https://www.gdcvault.com/play/1020386/A-Context-Aware-Character-Dialog)

Now that we've studied the theory behind these classical game AI algorithms, let's **learn how to implement some of them in Unity and Unreal**.


## Chapter 10: How Machine Learning is Transforming the Video Games Industry 🤖

As you will have gathered watching these videos, machine learning has not been as prominent in the adoption of AI for non-player characters when compared to symbolic methods. 

In this episode we dig into not just how it has been used in player facing systems, but also the interesting problem spaces it has been applied to in video games production.

<iframe width="560" height="315" src="https://www.youtube.com/embed/dm_yY-hddvE?si=ChqK_aCtOrV1qXq9" title="YouTube video player" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share" allowfullscreen></iframe>







<EditOnGithub source="https://github.com/huggingface/making-games-with-ai-course/blob/main/units/en/unitbonus1/ai-and-games.mdx" />

### The power of Sentence Similarity 🤖
https://huggingface.co/learn/ml-games-course/unit1/sentence-similarity-explained.md

# The power of Sentence Similarity 🤖

Before diving into making the demo, we must **understand sentence similarity and how it works**.

## How does this game work?

In this game, we want to **give more freedom to the player**: instead of giving an order to a robot by just clicking a button, we want them to interact with it through text.

The robot has a list of actions and **uses a sentence similarity model that selects the closest action** (if any) given the player's order.

<img src="https://huggingface.co/datasets/huggingface-ml-4-games-course/course-images/resolve/main/en/unit1/unity/sentence-similarity.jpg" alt="Sentence Similarity Pipeline"/>

For instance, if I write, “Hey, grab me the red box”, **the robot isn't programmed to know what “Hey, grab me the red box” means**. But the sentence similarity model connects this order with the “bring red box” action we programmed for the robot.

Therefore, thanks to this technique, **we can build believable character AI without the tedious process of mapping every possible player input to a robot response by hand**. We let the sentence similarity model do the job for us!

## What is Sentence Similarity?

Sentence Similarity is a task that, given a source sentence and a set of target sentences, **calculates how similar the target sentences are to the source**.

<figure>
<img src="https://huggingface.co/datasets/huggingface-ml-4-games-course/course-images/resolve/main/en/unit1/unity/sentence-similarity-example.jpg" alt="Sentence Similarity Example"/>
<figcaption>Source: https://huggingface.co/tasks/sentence-similarity</figcaption>
</figure>

For instance, if our source sentence (player order) is “Hey grab me the red box”, it’s very close to the “Bring red box” sentence in the action list.

<img src="https://huggingface.co/datasets/huggingface-ml-4-games-course/course-images/resolve/main/en/unit1/unity/sentence-similarity.jpg" alt="Sentence Similarity Pipeline"/>


Sentence similarity models convert input text, like “Hello”, into vectors (called embeddings) that capture semantic information. This step is called embedding. Then, we calculate how close (similar) they are using **cosine similarity**.

We won't go into the details, but the essence is this: since our sentence similarity models produce vectors, we can calculate **the cosine of the angle between two vectors. The closer the result is to 1, the more similar these two vectors are.**

<figure>
<img src="https://huggingface.co/datasets/huggingface-ml-4-games-course/course-images/resolve/main/en/unit1/unity/cosine.png" alt="Cosine similarity"/>
<figcaption> Source: https://zhangruochi.com/Operations-on-word-vectors-Debiasing/2019/03/28/</figcaption>
</figure>
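To make this concrete, here is a minimal Python sketch of cosine similarity. The 3-dimensional vectors are made-up toy embeddings (real sentence similarity models output hundreds of dimensions, e.g. 384 for all-MiniLM-L6-v2); the demo itself computes this in C# with Unity Sentis:

```python
import math

def cosine_similarity(a, b):
    """Cosine of the angle between two vectors: 1.0 means same direction."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Toy 3-dimensional "embeddings" (illustrative values, not real model outputs)
order = [0.9, 0.1, 0.2]          # "Hey grab me the red box"
bring_red_box = [0.8, 0.2, 0.1]  # "Bring red box"
say_hello = [0.1, 0.9, 0.0]      # "Hello"

print(cosine_similarity(order, bring_red_box))  # close to 1: very similar
print(cosine_similarity(order, say_hello))      # much lower: dissimilar
```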

If you want to dive deeper into the sentence similarity task, check this 👉 https://huggingface.co/tasks/sentence-similarity

## The Complete pipeline

Now that we understand Sentence Similarity, **let's see the entire pipeline: from the moment a player inputs an order to the robot acting**.

<img src="https://huggingface.co/datasets/huggingface-ml-4-games-course/course-images/resolve/main/en/unit1/unity/jammo-process.jpg" alt="Jammo Process"/>

1. The player types an order: "Can you bring me the red cube?"

2. The robot has a list of actions ["Hello", "Happy", "Bring red box", "Move to blue pillar"]

3. We then want to embed this player input text **and compare it to the robot's action list to find the most similar action (if any).**

4. To do that **we tokenize the input**: a Transformer model can't take a string as input. **It needs to be translated into numbers.** This is done using the Tokenizer code provided by [Sharp Transformers](https://github.com/huggingface/sharp-transformers).

5. Then, the input (tokenized) is passed to the model **that outputs an embedding of this text**: a vector that captures semantic information about the text. This inference part is done by Unity Sentis.

<img src="https://huggingface.co/datasets/huggingface-ml-4-games-course/course-images/resolve/main/en/unit1/unity/jammo-process-1.jpg" alt="Jammo process part 1"/>

6. Now we can compare **this vector with other vectors (from the action list) using cosine similarity.**

7. We select the action with the **highest similarity and get the similarity score.**

If the similarity score is > 0.2, **we ask the robot to perform the action.**

Otherwise, we ask the robot to play the "I'm perplexed" animation, since the order given **is too different from the action list** (this can happen if the player writes something totally irrelevant, like "do you like rabbits?").

<img src="https://huggingface.co/datasets/huggingface-ml-4-games-course/course-images/resolve/main/en/unit1/unity/rabbits.gif" alt="Jammo rabbits"/>
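The whole pipeline above can be sketched in Python. Here, `embed` and `cosine_similarity` are hypothetical stand-ins for the tokenizer + Sentis model and the cosine computation; only the action list and the 0.2 threshold come from the demo:

```python
ACTIONS = ["Hello", "Happy", "Bring red box", "Move to blue pillar"]
THRESHOLD = 0.2

def pick_action(player_input, embed, cosine_similarity):
    """Return the best-matching action, or None when the robot should look perplexed."""
    input_vec = embed(player_input)                            # steps 4-5: tokenize + embed
    scores = [cosine_similarity(input_vec, embed(action))      # step 6: compare vectors
              for action in ACTIONS]
    best = max(range(len(ACTIONS)), key=lambda i: scores[i])   # step 7: highest score
    if scores[best] > THRESHOLD:
        return ACTIONS[best]
    return None  # too different from every action: "I'm perplexed"
```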

### Why 0.2 for the similarity score threshold?

This is a threshold value: if the similarity score is below it, **we consider the player's order too different from the possible robot action list**.

It was found by **testing** different threshold scores.

More importantly, this is **not a fixed threshold**, since the similarity scores of all the possible robot actions sum to 1. You must decrease this threshold **if you add more actions to the robot's action list**.

Now that we understand the whole pipeline, you might wonder **how to run this AI model**: on the player's machine, or remotely? And what are the differences? That's the topic of the next section.

<EditOnGithub source="https://github.com/huggingface/making-games-with-ai-course/blob/main/units/en/unit1/sentence-similarity-explained.mdx" />

### How to run an AI model: local vs remote
https://huggingface.co/learn/ml-games-course/unit1/local-vs-api.md

# How to run an AI model: local vs remote

In this game, we want to run a sentence similarity model. I'm going to use [all-MiniLM-L6-v2](https://huggingface.co/sentence-transformers/all-MiniLM-L6-v2).

It's a BERT-based Transformer model that's **already trained, so we can use it directly.**

Here, I have two options for running it. I can:

- *Run this AI model remotely*: **on a server**. I send API calls and get responses from the server. That requires an internet connection.
- *Run this AI model locally*: **on the player's machine.**

Both are valid solutions, **but they have advantages and disadvantages**.


## Running the model remotely

I run the model **on a remote server**, and send API calls from the game. I can use an API service to help deploy the model.

<img src="https://huggingface.co/datasets/huggingface-ml-4-games-course/course-images/resolve/main/en/unit1/unity/remote.jpg" alt="Running AI model remotely"/>

For instance, Hugging Face provides an API service called [Inference API](https://huggingface.co/inference-api) (free for prototyping and experimentation) that allows you to use AI models via simple API calls. And we have a [Unity plugin](https://huggingface.co/blog/unity-api) to access and use Hugging Face AI models from within Unity projects.


### Advantages

- You don't use the player's RAM/VRAM to run the model.
- Your server can log the data, so you can understand what orders players usually type and thus improve your NPC.

### Disadvantages

- **Dependence on an internet connection**, risking immersion disruption due to potential API lag.
- **Potential high cost associated with API usage**, especially with many players.


Usually, you use an API when the model is too big to run on a player's machine, for instance large models like Llama 2.



## Running the model locally

I run the model locally: on the player's machine. To do that, I use two libraries.

1. [Unity Sentis](https://docs.unity3d.com/Packages/com.unity.sentis@latest): the neural network inference library that allows us to run our AI model directly inside our game.

2. [The Hugging Face Sharp Transformers library](https://github.com/huggingface/sharp-transformers): a Unity plugin of utilities to run Transformer 🤗 models in Unity games.


<img src="https://huggingface.co/datasets/huggingface-ml-4-games-course/course-images/resolve/main/en/unit1/unity/local.jpg" alt="Running AI model locally"/>


### Advantages

- You don't have usage cost since everything runs on the player's computer.
- The player does not need to be connected to the Internet.


### Disadvantages

- You use the player's RAM/VRAM, so you need to specify hardware spec recommendations.
- You can't easily know how people use the game or the model.



Since the sentence similarity model we're going to use is small, **we decided to run it locally**.


<EditOnGithub source="https://github.com/huggingface/making-games-with-ai-course/blob/main/units/en/unit1/local-vs-api.mdx" />

### What can you do now?
https://huggingface.co/learn/ml-games-course/unit1/next-steps.md

# What can you do now?

Now that you have improved this demo by adding new actions, here are some ideas of what you can do with it.

## Share your demo easily for free on [Hugging Face Spaces](https://huggingface.co/spaces)

- This is the **best way to showcase your work to a broader audience**.

- We wrote a **5min tutorial to help you publish your demo** 👉 https://huggingface.co/blog/unity-in-spaces

## Adding Speech-To-Text Functionality

- Why not try to **command your Robot NPC with your voice** using the Whisper model for Speech-To-Text functionality?

- Check out our tutorial 👉 [Building AI-Driven Voice Recognition](https://thomassimonini.substack.com/p/building-ai-driven-voice-recognition)

- Or try out the Whisper-powered demo 👉 [here (Windows)](https://singularite.itch.io/jammo-the-robot-with-unity-sentis-whisper-version) 

## Use what you learned for your game demo

- Get creative and **incorporate the Sentence Similarity model into another game project!**

- Share your progress and creations on Twitter, LinkedIn, and our Discord server.
Don't forget to tag me (@thomassimonini on Twitter and Thomas Simonini on LinkedIn) so I can see your amazing work!

<EditOnGithub source="https://github.com/huggingface/making-games-with-ai-course/blob/main/units/en/unit1/next-steps.mdx" />

### What is Hugging Face 🤗?
https://huggingface.co/learn/ml-games-course/unit1/what-is-hf.md

# What is Hugging Face 🤗?

Hugging Face is a platform where users can **upload, find, and download pre-trained AI models covering more than 35 different tasks: from text generation with Llama 2 and speech-to-text with Whisper to text-to-speech (AI voices) with Coqui and 3D generation**.

For many of these models, Hugging Face also provides a service called Inference API that allows you to use them via simple API calls. This service is free for prototyping and experimentation, but is subject to rate limiting. There are also paid inference services for production workloads.

## So how can I select a Sentence Similarity Model on Hugging Face 🤗?

All the AI models are listed here: https://huggingface.co/models. On the left of this page, you have the different tasks.

<img src="https://huggingface.co/datasets/huggingface-ml-4-games-course/course-images/resolve/main/en/unit1/unity/models1.jpg" alt="HF models 1"/>


In our case, we want to select Sentence Similarity.

<img src="https://huggingface.co/datasets/huggingface-ml-4-games-course/course-images/resolve/main/en/unit1/unity/models2.jpg" alt="HF models 2"/>

Now we have a list of Sentence Similarity models (approximately 3,000).

<img src="https://huggingface.co/datasets/huggingface-ml-4-games-course/course-images/resolve/main/en/unit1/unity/models3.jpg" alt="HF models 3"/>

If you click on one of them, you'll usually see a widget that allows you to **try the model directly in your browser**. This way, you can test it and see if it fits your needs.

<img src="https://huggingface.co/datasets/huggingface-ml-4-games-course/course-images/resolve/main/en/unit1/unity/models4.jpg" alt="HF models 4"/>


Now that we understand Sentence Similarity, the difference between local and remote model inference, and what Hugging Face is, we're ready to make our demo 🤖!

<EditOnGithub source="https://github.com/huggingface/making-games-with-ai-course/blob/main/units/en/unit1/what-is-hf.mdx" />

### Conclusion
https://huggingface.co/learn/ml-games-course/unit1/conclusion.md

# Conclusion

Congratulations on finishing this first unit! **In just a few steps, we've built a robot that can perform actions based on your orders. That's amazing!**

What you can do next is dive deeper into the code in Scripts/Final and look at the final scene in Scenes/Final. Every part of it is commented, so it should be relatively straightforward.

<img src="https://huggingface.co/datasets/huggingface-ml-4-games-course/course-images/resolve/main/en/unit1/unity/jammo-dance.gif" alt="Jammo dance" />

In the next Unit, we're going to **help you define and polish your game idea and write your Game Design Document 🤩**.

Keep Learning, Stay Awesome 🤗



<EditOnGithub source="https://github.com/huggingface/making-games-with-ai-course/blob/main/units/en/unit1/conclusion.mdx" />

### Introduction
https://huggingface.co/learn/ml-games-course/unit1/introduction.md

# Introduction

<img src="https://huggingface.co/datasets/huggingface-ml-4-games-course/course-images/resolve/main/en/unit1/unity/thumbnail.png" alt="Thumbnail" />

Welcome to the first Unit of the course! 

The integration of cutting-edge AI models in video games is opening up a whole range of exciting new gameplay possibilities. One of them is the ability for Non-Player Characters (NPCs) **to understand and respond to the player’s voice or text inputs**. And this is what we’re going to do today.

Indeed, in this Unit, **you're going to integrate your first AI model in your Unity game and make it run locally**.

You're going to learn:

- What is Sentence Similarity?
- The difference between **running an AI model locally or remotely** (using an API).
- What the [Hugging Face Hub 🤗](https://huggingface.co/models?library=unity-sentis) is.
- How to run an AI model locally with [Unity Sentis](https://docs.unity3d.com/Packages/com.unity.sentis@latest) and [Sharp Transformers](https://github.com/huggingface/sharp-transformers).

And you're going to make this demo, where a **smart robot can understand your orders and perform them**.

<iframe width="560" height="315" src="https://www.youtube.com/embed/L8LlOlYc3wI?si=ZILzXsuLwl__fYbx" title="YouTube video player" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share" allowfullscreen></iframe>

You can download the Windows demo 👉 [here](https://singularite.itch.io/jammo-the-robot-with-unity-sentis)

To make this project, we’re going to use:

- Unity Game Engine (2022.3 or later).

- The Jammo Robot asset made by [Mix and Jam](https://www.youtube.com/channel/UCLyVUwlB_Hahir_VsKkGPIA).

- [Unity Sentis library](https://docs.unity3d.com/Packages/com.unity.sentis@latest), the neural network inference library that allows us to **run our AI model directly inside our game**.

- The [Hugging Face Sharp Transformers](https://github.com/huggingface/sharp-transformers) library: a Unity plugin of utilities to **run Transformer 🤗 models in Unity games**.

You can download the complete Unity Project by clicking 👉 [here](https://huggingface.co/datasets/huggingface-ml-4-games-course/unity-demos/resolve/main/unit1/Jammo%20the%20Robot%20Sentis%20(v2).zip?download=true)

At the end of the project, you’ll build your own intelligent robot game demo. 


And then, you’ll be able **to iterate with other ideas**:


For instance, after making this game, I created this Dungeon Escape demo ⚔️ with the same codebase, **where your goal is to flee from this jail by stealing the 🔑 and the gold without getting noticed by the guard**.


<iframe width="560" height="315" src="https://www.youtube.com/embed/h8Mq1ZJkFL0?si=4q21QH6PIqNeE70w" title="YouTube video player" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share" allowfullscreen></iframe>


I also worked on a stealth game, where you **guide your character to sneak into a party and steal some stuff**.

<iframe width="560" height="315" src="https://www.youtube.com/embed/mU6ih2BBAus?si=Vh1XQoFVbEvuCtYb" title="YouTube video player" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share" allowfullscreen></iframe>

You'll also be able to improve the demo [by adding a Speech-To-Text model to give orders to the robot with your voice!](https://thomassimonini.substack.com/p/building-ai-driven-voice-recognition)

<iframe width="560" height="315" src="https://www.youtube.com/embed/TzHTLW2zOlY?si=LekdMO6vp2_qM-aR" title="YouTube video player" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share" allowfullscreen></iframe>


Sounds fun? Let’s get started!


<img src="https://huggingface.co/datasets/huggingface-ml-4-games-course/course-images/resolve/main/en/unit1/unity/jammo-dance.gif" alt="Jammo dance" />

<EditOnGithub source="https://github.com/huggingface/making-games-with-ai-course/blob/main/units/en/unit1/introduction.mdx" />

### Let's build our smart robot NPC demo 🤖
https://huggingface.co/learn/ml-games-course/unit1/make-demo.md

# Let's build our smart robot NPC demo 🤖 

## Step 0: Get the project files

You can find the 👉 [complete Unity project here](https://huggingface.co/datasets/huggingface-ml-4-games-course/unity-demos/resolve/main/unit1/Jammo%20the%20Robot%20Sentis%20(v2).zip?download=true)

## Step 1: Install Unity Sentis

The Sentis Documentation 👉 https://docs.unity3d.com/Packages/com.unity.sentis@latest

1. Open the Jammo project

2. Click [Sentis Pre-Release package](https://tinyurl.com/4eun48fb) or go to Window > Package Manager, click the + icon, select "Add package by name…" and type "com.unity.sentis"

<img src="https://huggingface.co/datasets/huggingface-ml-4-games-course/course-images/resolve/main/en/unit1/unity/sentis1.jpg" alt="Sentis"/>

**Set the version to 1.3.0**

3. Press the Add button to install the package.

<img src="https://huggingface.co/datasets/huggingface-ml-4-games-course/course-images/resolve/main/en/unit1/unity/sentis2.png" alt="Sentis"/>


## Step 2: Install Sharp Transformers 💪

[Sharp Transformers](https://github.com/huggingface/sharp-transformers) is a Unity plugin of utilities to **run Transformer 🤗 models in Unity games**.

We need it for the tokenization step.

Go to "Window" > "Package Manager" to open the Package Manager.

Click the "+" in the upper left-hand corner and select "Add package from git URL".

Enter the URL of this repository and click "Add": [https://github.com/huggingface/sharp-transformers.git](https://github.com/huggingface/sharp-transformers.git)


## Step 3: Build the inference process 🧠

As described in [Sentis documentation](https://docs.unity3d.com/Packages/com.unity.sentis@latest), to use Sentis to run a neural network in Unity, we need to follow these steps.

1. Use the Unity.Sentis namespace.

2. **Load a neural network** model file.

3. **Create input** for the model.

4. **Create an inference engine** (a worker).

5. **Run the model** with the input to infer a result.

6. **Get the result**.

In our case, we do all of this in the **SentenceSimilarity.cs** file, which is attached to our robot.

In `Awake()`, we:

1. **Load our neural network.**

2. Create an **inference engine** (a worker).

3. **Create an operator**, that will allow us to perform operations with tensors.


<img src="https://huggingface.co/datasets/huggingface-ml-4-games-course/course-images/resolve/main/en/unit1/unity/sentis-similarity1.jpg" alt="Sentis Similarity"/>

We have three functions:

1. *Encode*: takes our player input (text), **tokenizes it, and embeds it**.

<img src="https://huggingface.co/datasets/huggingface-ml-4-games-course/course-images/resolve/main/en/unit1/unity/sentis-similarity2.jpg" alt="Sentis Similarity"/>

2. *SentenceSimilarityScores*: **calculates the similarity scores** between the input embedding (what the user typed) and the comparison embeddings (the robot's action list).

<img src="https://huggingface.co/datasets/huggingface-ml-4-games-course/course-images/resolve/main/en/unit1/unity/sentis-similarity3.jpg" alt="Sentis Similarity"/>

3. *RankSimilarityScores*: gets the **most similar action** and its index given the player input.

<img src="https://huggingface.co/datasets/huggingface-ml-4-games-course/course-images/resolve/main/en/unit1/unity/sentis-similarity4.jpg" alt="Sentis Similarity"/>
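To give an intuition for the tokenization step inside *Encode*, here is a toy Python sketch. The real tokenizer (implemented in C# by Sharp Transformers) uses WordPiece with a vocabulary of roughly 30,000 tokens; the tiny vocabulary and IDs below are made up, except for the standard BERT special token IDs 101 ([CLS]), 102 ([SEP]), and 100 ([UNK]):

```python
# Toy whitespace tokenizer: a stand-in for the real WordPiece tokenizer.
VOCAB = {"[CLS]": 101, "[SEP]": 102, "[UNK]": 100,
         "bring": 7, "me": 8, "the": 9, "red": 10, "box": 11}

def tokenize(text):
    """Map text to the integer IDs a Transformer model expects as input."""
    words = text.lower().split()
    ids = [VOCAB.get(w, VOCAB["[UNK]"]) for w in words]
    # BERT-style models wrap the sequence in [CLS] ... [SEP]
    return [VOCAB["[CLS]"]] + ids + [VOCAB["[SEP]"]]

print(tokenize("Bring me the red box"))  # [101, 7, 8, 9, 10, 11, 102]
```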

## Step 4: Build the Robot Behavior 🤖

We need to define the **behavior of our robot**.

The idea is that our robot has different possible actions, and the choice of action **depends on the most similar sentence in the action list**.

First, we need to define the *Finite State Machine*, a simple **classical AI technique where each State defines a certain behavior**.

Then, we’ll make the utility function **that selects the State, and hence the series of actions to perform**.

### The State Machine 🧠🤖

In a state machine, **each state represents a behavior**: for instance, moving to a column, saying hello, etc. Based on the state the agent is in, **it will perform a series of actions**.

In our case, we have 7 states:

<img src="https://huggingface.co/datasets/huggingface-ml-4-games-course/course-images/resolve/main/en/unit1/unity/fsm.jpg" alt="Jammo State Machine"/>

The first thing we need to do is create an enum called State that contains each of the possible States:

<img src="https://huggingface.co/datasets/huggingface-ml-4-games-course/course-images/resolve/main/en/unit1/unity/fsm2.jpg" alt="Jammo State Machine"/>

Because we need to check the state constantly, **we define the state machine in the Update() method using a switch statement where each case is a state**.

<img src="https://huggingface.co/datasets/huggingface-ml-4-games-course/course-images/resolve/main/en/unit1/unity/fsm3.jpg" alt="Jammo State Machine"/>

For each state case, we define the behavior of our agent. For instance, in our Hello state, the robot **must move towards the player, face them, play its Hello animation, and then go back to the Idle state**.

<img src="https://huggingface.co/datasets/huggingface-ml-4-games-course/course-images/resolve/main/en/unit1/unity/fsm4.jpg" alt="Jammo State Machine"/>

We have now defined the behavior for each State. The magic here comes from the fact **that the language model decides which State is closest to the player input**. And in the utility function, we call this state.
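Here is a minimal sketch of the same pattern in Python (the demo implements it in C#, inside Update(), with 7 states; the state names and behaviors below are illustrative, not the demo's exact ones):

```python
from enum import Enum, auto

class State(Enum):
    IDLE = auto()
    HELLO = auto()
    PUZZLED = auto()

class Robot:
    def __init__(self):
        self.state = State.IDLE
        self.log = []  # stands in for the animation/movement calls

    def update(self):
        """Called every frame; each case defines one state's behavior."""
        if self.state == State.HELLO:
            self.log.append("move to player, wave")
            self.state = State.IDLE  # back to Idle once the action is done
        elif self.state == State.PUZZLED:
            self.log.append("play perplexed animation")
            self.state = State.IDLE
        # State.IDLE: do nothing, wait for the next order

robot = Robot()
robot.state = State.HELLO  # in the demo, the utility function sets this
robot.update()
print(robot.log)  # ['move to player, wave']
```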


### Let’s define the Utility Function 📈

Our action list looks like this:

<img src="https://huggingface.co/datasets/huggingface-ml-4-games-course/course-images/resolve/main/en/unit1/unity/utility1.jpg" alt="Utility"/>

- The sentence is what will be embedded by the AI model.

- The verb is the State.

- The noun (if any) is the object to interact with (Pillar, Cube, etc.).

This utility function will **select the Verb and Noun associated with the sentence having the highest similarity score with the player input text.**

But first, to **filter out a lot of strange input text**, we need a similarity **score threshold**.

For example, if I say “Look at all the rabbits”, **none of our possible actions is relevant**. Hence, instead of choosing the action with the highest score, we’ll call the Puzzled State, which will play a perplexed animation on the robot.

<img src="https://huggingface.co/datasets/huggingface-ml-4-games-course/course-images/resolve/main/en/unit1/unity/rabbits.gif" alt="Jammo rabbits"/>

If the score is above the threshold, **we get the verb corresponding to a State and the noun (goalObject), if any**.

We set the state corresponding to the verb. That will activate the behavior corresponding to it.

<img src="https://huggingface.co/datasets/huggingface-ml-4-games-course/course-images/resolve/main/en/unit1/unity/utility2.jpg" alt="Utility"/>
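The utility function can be sketched in Python. The action tuples and scores below are illustrative; only the (sentence, verb, noun) structure and the threshold idea come from the demo:

```python
PUZZLED = "Puzzled"
THRESHOLD = 0.2

# Each action: (sentence to embed, verb = State, noun = object to interact with)
ACTIONS = [
    ("Bring red box", "BringObject", "RedBox"),
    ("Move to blue pillar", "GoTo", "BluePillar"),
    ("Say hello", "Hello", None),
]

def utility(scores):
    """Given one similarity score per action, pick the State (verb) and goal object."""
    best = max(range(len(ACTIONS)), key=lambda i: scores[i])
    if scores[best] < THRESHOLD:
        return PUZZLED, None  # input too far from every possible action
    _, verb, noun = ACTIONS[best]
    return verb, noun

print(utility([0.85, 0.10, 0.05]))  # ('BringObject', 'RedBox')
print(utility([0.05, 0.04, 0.03]))  # ('Puzzled', None)
```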

And that’s it, **now we’re ready to interact with our robot!**


## Step 5: Let’s interact with our Robot 🤖

In this step, you just need to click the Play button in the editor. Then you **can type some orders and see the results**!


That's nice! But the best way to learn is to try things, break things and modify the demo.


## (Optional) Improving the game: let’s add a new action

Adding a new action is quite easy. Let’s take an example:

- Copy the YellowPillar game object and move it

- Change the name to GreenPillar

- Create a new material and set it to green

- Place the material on GreenPillar

Now that we’ve placed the new game object, we need to add this new action to the sentence list. Click on Jammo_Player.

In the list of actions, click on the plus button and fill in this new action item:

<img src="https://huggingface.co/datasets/huggingface-ml-4-games-course/course-images/resolve/main/en/unit1/unity/optional-jammo1.jpg" alt="Utility"/>

- Sentence: Go to the green column
- Verb: GoTo
- Noun: GreenColumn

And that’s it! **You can easily iterate and add more actions to your game**.

Again, don't forget to test the similarity threshold, **to see if you need to increase or decrease it**.


<EditOnGithub source="https://github.com/huggingface/making-games-with-ai-course/blob/main/units/en/unit1/make-demo.mdx" />

### Make your own demo 🔥
https://huggingface.co/learn/ml-games-course/unit3/customize.md

#  Make your own demo 🔥

<img src="https://huggingface.co/datasets/huggingface-ml-4-games-course/course-images/resolve/main/en/unit3/owndemo.png" alt="demo"/>

Now that you've learned how the demo works, **it's time to make your own version**!

We **can't wait to see the amazing demos you're going to make 🔥**. 

You can:

- Share your demo on LinkedIn and X, and tag us (@cubzh_ @gigax @huggingface); **we'll repost it** 🤗.

- Also share it on the [Hugging Face Discord server in the #ml-4-games-i-made-this channel](https://hf.co/join/discord).

## Teach your NPC new skills

For this part of the unit, [Tristan](https://x.com/tr_deborde), the founder of Gigax, made a video tutorial in which you're going to teach an NPC new skills 🔥.

**Note: the name of the Space has changed; it's no longer ai-npc but NPC-Playground.**

<video width="1280" height="720" controls="true" src="https://huggingface.co/datasets/huggingface-ml-4-games-course/course-images/resolve/main/en/unit3/gigax_cubzh_hf_full.mp4">
</video>


## Bonus: find a funny hat

In addition to being able to teach your NPCs new skills, you can customize the assets you want to use in your demo thanks to a free library of 25k assets.

For instance, I want the NPC to give me a hat. Right now, they give me a party hat when I ask if they have one for me.

But there are so many more funny hats to try.

<img src="https://huggingface.co/datasets/huggingface-ml-4-games-course/course-images/resolve/main/en/unit3/hats.png" alt="Cubzh Hats" />

To do that, I selected 6 funny hats you can replace it with:

- pratamacam.chicken_hat: Chicken Hat
- claire.sombrero: Sombrero
- exoticwilper.hat: Bear hat 
- creativitial.cap: Child cap
- claire.napoleon_hat: Napoleon Hat
- claire.cowboy_hat: Cowboy hat


It's quite simple: the skill we want to customize is the `GIVEHAT` skill.

You need to go to Files and click on cubzh.lua.


<img src="https://huggingface.co/datasets/huggingface-ml-4-games-course/course-images/resolve/main/en/unit3/partyhat1.png" alt="Cubzh tutorial" />

Then, go to the `GIVEHAT` skill, where you can see which object is loaded; for now, it's a party hat. The name of an object in Cubzh is `creator_name.item_name`.

<img src="https://huggingface.co/datasets/huggingface-ml-4-games-course/course-images/resolve/main/en/unit3/partyhat2.png" alt="Cubzh Hats" />

You can replace it with one of the 6 hats we provided.

Now, you just need to save, go back to your demo, talk to an NPC, and ask them to put a hat on your head. Voilà!


<img src="https://huggingface.co/datasets/huggingface-ml-4-games-course/course-images/resolve/main/en/unit3/partyhat3.png" alt="Cubzh Hats" />



## Challenge: create new skills


[You can check the documentation to learn more](https://huggingface.co/spaces/cubzh/ai-npcs/blob/main/README.md) on how to tweak NPC behavior and teach NPCs new skills.




<EditOnGithub source="https://github.com/huggingface/making-games-with-ai-course/blob/main/units/en/unit3/customize.mdx" />

### A deep dive on the NPC-Playground
https://huggingface.co/learn/ml-games-course/unit3/demo.md

# A deep dive on the NPC-Playground

## The Tech Stack

To create this demo, the teams used three main tools:

- [Cubzh](https://github.com/cubzh/cubzh): the cross-platform UGC (User Generated Content) game engine.

- [Gigax](https://github.com/GigaxGames/gigax): the engine for smart NPCs.

- [Hugging Face Spaces](https://huggingface.co/spaces): the most convenient online environment to host and iterate on game concepts in an open-source fashion.

## What is Cubzh?

[Cubzh](https://github.com/cubzh/cubzh) is a cross-platform UGC game engine, that aims to provide an open-source alternative to Roblox.

It offers a **rich gaming environment where users can create their own game experiences and play with friends**.

<img src="https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/blog/181_npc-gigax-cubzh/gigax.gif" alt="Cubzh"/>

In Cubzh, you can:

- **Create your own worlds, items, and avatars**.

- Build fast, using **community-made voxel items** (25K+ so far in the library) and **open-source Lua modules**.

- **Code games using a simple yet powerful Lua scripting API**.

Cubzh is in public Alpha. You can download and play Cubzh for free on Desktop via [Steam](https://store.steampowered.com/app/1386770/Cubzh_Open_Alpha/), [Epic Games Store](https://store.epicgames.com/en-US/p/cubzh-3cc767), or on Mobile via [Apple's App Store](https://apps.apple.com/th/app/cubzh/id1478257849), [Google Play Store](https://play.google.com/store/apps/details?id=com.voxowl.pcubes.android&hl=en&gl=US&pli=1), or even play directly from your [browser](https://app.cu.bzh/).

In this demo, Cubzh serves as the **game engine**, running directly within a Hugging Face Space; users can easily clone it to experiment with custom scripts and NPC personas.


## What is Gigax?

[Gigax](https://github.com/GigaxGames/gigax) is the platform game developers use to run **LLM-powered NPCs at scale**.

Gigax has fine-tuned large language models for NPC interactions, **using the "function calling" principle.**

It's easier to think about this in terms of input/output flow:

- In **input**, the model reads [a text description](https://github.com/GigaxGames/gigax/blob/main/gigax/prompt.py) of a 3D scene, alongside a description of the recent events and a list of the NPC's available actions (e.g., `<say>`, `<jump>`, `<attack>`, etc.).

- The model then **outputs** one of these actions using parameters that refer to 3D entities that exist in the scene, e.g. `say NPC1 "Hello, Captain!"`.

<img src="https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/blog/181_npc-gigax-cubzh/gigax.png" alt="gigax" />
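To illustrate the output side, here is a hypothetical parser for action strings of the form `say NPC1 "Hello, Captain!"`. This regex-based sketch is only illustrative; Gigax's actual output format and parsing live in their open-source stack:

```python
import re

# Illustrative grammar: verb, target entity, optional quoted argument.
ACTION_RE = re.compile(r'^(?P<verb>\w+)\s+(?P<target>\w+)(?:\s+"(?P<arg>[^"]*)")?$')

def parse_action(line):
    """Split an action string into (verb, target, argument-or-None)."""
    m = ACTION_RE.match(line.strip())
    if m is None:
        raise ValueError(f"unrecognized action: {line!r}")
    return m.group("verb"), m.group("target"), m.group("arg")

print(parse_action('say NPC1 "Hello, Captain!"'))  # ('say', 'NPC1', 'Hello, Captain!')
print(parse_action('jump NPC2'))                   # ('jump', 'NPC2', None)
```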

Gigax has **open-sourced their stack!** 
You can clone their [inference stack on Github](https://github.com/GigaxGames/gigax).
For this demo, their models are hosted in the cloud, but you can [download them yourself on the 🤗 Hub](https://huggingface.co/Gigax):
- [Phi-3 fine-tuned model](https://huggingface.co/Gigax/NPC-LLM-3_8B)
- [Mistral-7B fine-tuned model](https://huggingface.co/Gigax/NPC-LLM-7B)


<EditOnGithub source="https://github.com/huggingface/making-games-with-ai-course/blob/main/units/en/unit3/demo.mdx" />

### Conclusion
https://huggingface.co/learn/ml-games-course/unit3/conclusion.md

# Conclusion

Congratulations on finishing this unit!

We hope you enjoyed exploring NPC-Playground and experiencing the future of gaming with smart LLM-powered NPCs. 

Don't forget to:

- Share your demo on LinkedIn and X, and tag us (@cubzh_ @gigax @huggingface); **we'll repost it** 🤗.

- Also share it on the [Hugging Face Discord server in the #ml-4-games-i-made-this channel](https://hf.co/join/discord).

<video controls="true" src="https://huggingface.co/datasets/huggingface-ml-4-games-course/course-images/resolve/main/en/unit3/demo.mp4">
</video>

If you want to dive deeper into Cubzh and Gigax, don't hesitate to join their communities:

- [Cubzh Discord Server](https://discord.com/invite/cubzh) 
- [Gigax Discord Server](https://discord.gg/rRBSueTKXg)

As you continue your journey in this course, we encourage you to check the Cubzh documentation to create your own demos. 

Stay curious, keep experimenting, and most importantly, have fun!

<img src="https://huggingface.co/datasets/huggingface-ml-4-games-course/course-images/resolve/main/en/unit3/cubzhcom.png" alt="Cubzh" />


Keep Learning, Stay Awesome 🤗

<EditOnGithub source="https://github.com/huggingface/making-games-with-ai-course/blob/main/units/en/unit3/conclusion.mdx" />

### Introduction
https://huggingface.co/learn/ml-games-course/unit3/introduction.md

# Introduction

<img src="https://huggingface.co/datasets/huggingface-ml-4-games-course/course-images/resolve/main/en/unit3/thumbnail.png" alt="Thumbnail" />

Welcome to the third unit of the course!

In this unit, we’ll **explore how LLM-powered NPCs can create immersive gaming experiences**. 

To illustrate this, you'll first dive into *NPC-Playground*, a demo created through the collaboration of the [Cubzh](https://github.com/cubzh/cubzh) and [Gigax](https://github.com/GigaxGames/gigax) teams, showcasing the potential of smart LLM-powered NPCs.

You can play with the demo directly in your browser 👉 [here](https://huggingface.co/spaces/cubzh/ai-npcs)

In this 3D demo, you can **interact with the NPCs and teach them new skills with just a few lines of Lua scripting!**

<video controls="true" src="https://huggingface.co/datasets/huggingface-ml-4-games-course/course-images/resolve/main/en/unit3/demo.mp4">
</video>

Then, beyond just learning about the demo, **you'll have the opportunity to customize it, tweak NPC behaviors, and teach the NPCs new skills**.

By customizing the game, **you’ll gain confidence in navigating the Cubzh documentation and building new, unique experiences**.

Sounds fun? Let’s get started!



<EditOnGithub source="https://github.com/huggingface/making-games-with-ai-course/blob/main/units/en/unit3/introduction.mdx" />

### Step 2: Let's write the Game Design Document ✍️
https://huggingface.co/learn/ml-games-course/demo1/game-design-document.md

# Step 2: Let's write the Game Design Document ✍️

## What is a Game Design Document (GDD)?

A Game Design Document (GDD) is an essential tool in the game industry: it's the **blueprint of a video game project**.

In this document, we outline the:
- Gameplay
- Features
- Game mechanics
- Scope
- Assets
- And more.

The GDD **ensures a clear definition and understanding of the game's vision for all teammates** and facilitates the development process.

In the case of our demo, **we want to write a concise one-page document**.

## The One Page Game Design Document Template

This is our One Page GDD template; naturally, you can add or remove elements based on your needs.

You can download the Google Doc version 👉 [here](https://docs.google.com/document/d/15xThlHwa8cA8AMFW6t29TeV2DwoNgjnBPOTc0bnaJ8E/edit?usp=sharing) 

### Game Title ✍️

- Brainstorm and define potential game titles.

### Team and roles 👯

- Clearly outline your team members and assign specific roles to each member.

### Game Concept 🤯

- Define the single-sentence description of the game that **you will use to guide your design decisions**.

### Game Genre 🏷️

- Clearly identify the genre of your game.

### Game Features and AI tools/model used 🤖

- List and describe the planned game features, along with any AI tools or models you intend to incorporate.

### Platform 🖥️

- On what platform do you want to publish it? Remember, it needs to work on Hugging Face Spaces (WebGL) and/or Windows.

### Scope ⏲️

- Establish the expected playtime. We think a maximum of 10 minutes of play is a good scope for this course.

### Art Style 🎨🖌️

- Determine the desired art style for your game's assets, such as low-poly, stylized, or realistic.

### Level 🖼️

- Define what your level will look like (this will help you know how many assets you need).

### Assets 📦

- Define the assets you need to make this game. We're going to talk about assets and how to find them in the next Demo Unit.

### Todo 📝

- What needs to be done. For now, think of this section as a brain dump: note everything that comes to mind that needs to be done in the game.


## A Game Design Document One Page Example: Museum Robery

To help you better visualize how this tool is useful, we give an example below with a game called Museum Robery.

<img src="https://huggingface.co/datasets/huggingface-ml-4-games-course/course-images/resolve/main/en/demo-unit1/roomba.png" alt="Museum Robery" width="100%"/>

### Game Title ✍️

- Museum Robery
- Roomba Heist

### Team and roles 👯

- Thomas Simonini

### Game Concept 🤯

You're a thief who controls a robot cleaner with your voice in a museum at night. Your goal is to steal the golden duck, a very expensive statue, without getting caught by lasers and guards.

### Game Genre 🏷️

- Infiltration

### Game Features and AI tools/model used 🤖

- Command the robot using your own voice (ASR)

- The robot will understand what action to take (Sentence Similarity)

- The robot can steal stuff and break stuff
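
To make the voice-command pipeline concrete, here is a toy sketch of the second step: matching a transcribed command to the closest known action. The actual game would use an ML sentence-similarity model (fed by an ASR model); here the stdlib `difflib` and the action names are illustrative stand-ins, not the real implementation:

```python
# Toy sketch of command matching: the action list and the similarity measure
# (stdlib difflib instead of an ML sentence-similarity model) are stand-ins.
from difflib import SequenceMatcher

KNOWN_ACTIONS = ["go forward", "turn left", "turn right", "stop", "grab the duck"]

def closest_action(transcript: str) -> str:
    """Map an ASR transcript to the most similar known robot action."""
    return max(
        KNOWN_ACTIONS,
        key=lambda action: SequenceMatcher(None, transcript.lower(), action).ratio(),
    )

print(closest_action("please turn left"))  # -> turn left
```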

### Platform 🖥️

- WebGL

### Scope ⏲️

- 10min play
- 3 levels

### Art Style 🎨🖌️

- Low poly
- Using the Unity Feel library for nice finishing touches

### Levels 🖼️

<img src="https://huggingface.co/datasets/huggingface-ml-4-games-course/course-images/resolve/main/en/demo-unit1/museumrobery_level1.png" alt="Museum Robery GDD Level 1" width="100%"/>

<img src="https://huggingface.co/datasets/huggingface-ml-4-games-course/course-images/resolve/main/en/demo-unit1/museumrobery_level2.png" alt="Museum Robery GDD Level 2" width="100%"/>

<img src="https://huggingface.co/datasets/huggingface-ml-4-games-course/course-images/resolve/main/en/demo-unit1/museumrobery_level3.png" alt="Museum Robery GDD Level 3" width="100%"/>

### Assets 📦

- Floor
- Wall
- Half Wall
- Pedestal
- Statue 1
- Statue 2
- Statue 3
- Statue 4
- Laser Beam

- Paint
    - Paint 1
    - Paint 2
    - Paint 3
    - Paint 4
    - Paint 5
    - Paint 6

- Couch
- Enemy

- Player

- Number plaques
- Grid
- Round place
- Golden Duck

- Statue 1 break
- Statue 2 break
- Statue 3 break
- Statue 4 break

- Textures for the floor
- Textures for the wall

### Todo 📝

✅ Floor

✅ Wall

✅ Half Wall

✅ Pedestal

✅ Statue 1 (unicorn)

✅ Statue 2 (flamingo)

✅ Statue 3 (cake)

✅ Statue 4 (hot dog)

✅ Frame

✅ Couch

✅ Number plaques

✅ Grid

✅ Round place

- Paint
    - Paint 1
    - Paint 2
    - Paint 3
    - Paint 4
    - Paint 5
    - Paint 6
- Enemy
- Laser Beam

#### Step 2: Room 1

- Room 1 export assets
- Floor texture
- Floor prefab
- Wall texture
- Wall prefab
- Small wall texture
- Small wall prefab
- Install
- Navmesh
- Laser Beam
- Laser Beam movement

In addition to this, I like to write a list of **learning goals** when I make a demo. In this case, my learning goals were:

### Learning Goals 🏆

- Make a game from beginning to end
- Publish it
- Learn to use the Hugging Face API
- Get better at using Navmesh
- Learn to use sound
- Learn to use animations and breaking effects
- Learn to use Feel



<EditOnGithub source="https://github.com/huggingface/making-games-with-ai-course/blob/main/units/en/demo1/game-design-document.mdx" />

### Conclusion
https://huggingface.co/learn/ml-games-course/demo1/conclusion.md

# Conclusion

Congratulations, **you now have an idea and a one-page GDD**.

You're now ready to start working on your game demo.

In the next Demo Unit, **you'll learn how to find assets and how to reuse them**.

And if you want to go deeper into ideation, don't hesitate to check [this free course on making your game by USC](https://learn.unity.com/course/design-and-publish-your-original-game-unity-usc-games-unlocked).

Keep Learning, Stay Awesome 🤗

<EditOnGithub source="https://github.com/huggingface/making-games-with-ai-course/blob/main/units/en/demo1/conclusion.mdx" />

### Let's define your Game idea 💡
https://huggingface.co/learn/ml-games-course/demo1/introduction.md

# Let's define your Game idea 💡

<img src="https://huggingface.co/datasets/huggingface-ml-4-games-course/course-images/resolve/main/en/demo-unit1/thumbnail.png" alt="Thumbnail" width="100%"/>

> [!TIP]
> The course is now **self-paced and will not have new future units**. We **don't provide a Certificate of Completion for this course**.
> But we continue to write tutorials on how to use AI in Games here 👉 https://thomassimonini.substack.com/

Welcome to this exciting Unit, **where you'll define and polish your game idea and write your Game Design Document 🤩**.

As we explained during the introduction Unit, the goal at the end of this course is for **you to build a game demo using either**:

- **AI tools to help you build the game** (texture generation, AI voice actors...)
- **AI in the game itself, as gameplay or for NPCs**.

This course has three types of Units:

- *Course Unit*: learn a concept and apply it in a hands-on exercise.
- *Demo Unit*: define and work on your demo.
- *Bonus Unit*: dive into advanced concepts such as classical AI in games or ML for 3D.

This Unit is a Demo Unit.

So, in this first Demo Unit, **you will define your game idea and write your Game Design Document (GDD)**. 

This GDD, a one-page document, will help you define the following:

- Gameplay mechanics
- The story (if any)
- Which AI model/tools you're going to use
- What assets you're going to need
- The scope
- The team (if any)


During the course, we'll emphasize the importance of **keeping your scope small** to be able to finish your demo.

For this section, **we curated the best resources you can find online**.

So let's get started!


<EditOnGithub source="https://github.com/huggingface/making-games-with-ai-course/blob/main/units/en/demo1/introduction.mdx" />

### Step 1: Crafting the Game Idea 💡
https://huggingface.co/learn/ml-games-course/demo1/game-idea.md

# Step 1: Crafting the Game Idea 💡

The first step in building a compelling demo is to **have a compelling game idea**. But how do you find the perfect one?

This video from [AskGameDev](https://www.youtube.com/@AskGamedev) gives good advice. In addition, we wrote our own tips below.

<iframe width="560" height="315" src="https://www.youtube.com/embed/fDqnw5pNebk?si=83Au99UbYkCmSSJ2" title="YouTube video player" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share" allowfullscreen></iframe>

## Advice 1: Embrace Your Constraints

Innovation often **thrives within limitations**. Identify your current skill set and consider games that align with your abilities. 

For example, I'm currently developing a detective game demo where players must interview suspects to unveil the culprit. Because **I can't animate human faces, even with current tools (it would take too much time and be too expensive)**, I decided that the game would use a phone system to interview suspects.

## Advice 2: Draw Inspiration from Your Favorite Games

Explore the games you love playing. **Is there a specific element within a game that captivates you?**

Maybe there's a small feature or mechanic you'd like to recreate in your own project.

## Advice 3: Think About How AI Can Improve the Gameplay

Consider how incorporating AI elements can improve that small game feature or mechanic you like.

## Advice 4: Beware of the Scope Curse

No, you won't build a Starfield clone. Keep your project manageable by **defining clear boundaries**.

Your Game Demo needs to be very small: it's a working prototype, not a full game. Remember, **it's better to have a very small working demo that you can iterate on after the course than a big idea with awesome features that doesn't work**.

Don't hesitate to check this video:

<iframe width="560" height="315" src="https://www.youtube.com/embed/bh0F9m1O4gk?si=TKjyXGL1oqjuIvGE" title="YouTube video player" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share" allowfullscreen></iframe>

## Advice 5: Brainstorm with HuggingChat

If you're stuck, why not type out some game ideas you have and **ask [Hugging Chat](https://huggingface.co/chat/) for help**?

<img src="https://huggingface.co/datasets/huggingface-ml-4-games-course/course-images/resolve/main/en/demo-unit1/huggingchat.jpg" alt="Hugging Chat"/>


---
Now that **your idea is defined**, it's time to write the Game Design Document: the one-page document **that will help you plan and manage the development of your game demo**.



<EditOnGithub source="https://github.com/huggingface/making-games-with-ai-course/blob/main/units/en/demo1/game-idea.mdx" />

### Is it your first game? Watch these videos
https://huggingface.co/learn/ml-games-course/demo1/first-game.md

# Is it your first game? Watch these videos

If it's your first game, **we strongly advise you to watch these four videos** from [ExtraCredits](https://www.youtube.com/@extrahistory) that give advice on how to make your first game.


## Making Your First Game: Basics - How To Start Your Game Development - Extra Credits

<iframe width="560" height="315" src="https://www.youtube.com/embed/z06QR-tz1_o?si=PQF13i0gfh5SRFM9" title="YouTube video player" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share" allowfullscreen></iframe>


## Making Your First Game: Practical Rules - Setting (and Keeping) Goals - Extra Credits

<iframe width="560" height="315" src="https://www.youtube.com/embed/dHMNeNapL1E?si=d6Xnm0rN2Wo9s7sN" title="YouTube video player" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share" allowfullscreen></iframe>

## Making Your First Game: Minimum Viable Product - Scope Small, Start Right - Extra Credits

<iframe width="560" height="315" src="https://www.youtube.com/embed/UvCri1tqIxQ?si=wOC68ZU-23Ddr-wj" title="YouTube video player" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share" allowfullscreen></iframe>

## Making Your First Game: Launching! - How to Market Your Game - Extra Credits

<iframe width="560" height="315" src="https://www.youtube.com/embed/qxsEimJ_3bM?si=jHWlgvQXi1kowvtD" title="YouTube video player" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share" allowfullscreen></iframe>

## 🎁 Bonus video: GDC Talk: Crafting A Tiny Open World: A Short Hike Postmortem

<iframe width="560" height="315" src="https://www.youtube.com/embed/ZW8gWgpptI8?si=lcfdHzB4i_Vp7nqO" title="YouTube video player" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share" allowfullscreen></iframe>

<EditOnGithub source="https://github.com/huggingface/making-games-with-ai-course/blob/main/units/en/demo1/first-game.mdx" />
