Event is a game about building a relationship with the artificial intelligence of an abandoned spaceship (out on Steam this September). But before we came up with this description, and before we even had a studio, Event was an experimental student project that made extensive use of a chatbot AI. A narrative-based Siri, if you like.
Back in 2013, when no one would have thought that we would have a commercial game on our hands, and before games like Firewatch even loomed on the horizon, a bunch of us were students at a French video game grad school, ENJMIN. One of the game designers, Emmanuel Corno, made a pitch for a graduation project: an Alien game, but with natural language processing, where you would have to ask an AI to open doors and escape deadly monsters while stuck on a space station.
The idea was to make something that would take advantage of natural language dialog, period. Three and a half years, three reboots and countless iterations later, the game is about to release commercially. In this story, I would like to tell you how the core mechanic of Event was born, and how certain design choices that we made shaped it into what it is now: a reverse Turing Test where empathy is the core skill you need to use as a player.
If you are more interested in the technical details, I gave a talk about that at the nucl.ai conference last year, and you can watch it online.
The first conversation we had within the design department was about dialog in video games. We all love narrative in video games, as well as narrative video games, and it seemed like a no-brainer that a game where you type and have conversations would have to be all about that. It was the dawn of the narrative exploration genre, and a lot of people started to blend mechanics and narrative elements together. Gone Home had just released, and The Walking Dead Telltale series was conquering the hearts and minds of zombie fans.
We also looked at many traditional video games: mainly adventure games and RPGs – in other words, those games where the dialog is important and where there is a lot of it. It turned out that most of them used more or less the same pattern: pre-written dialog, crafted by narrative designers and writers, both for the player and the NPCs the player talks to. In most RPGs, you could select what to say at any given moment, and that was the main way in which you interacted with the story. The Mass Effect series went further: instead of letting you choose the exact phrase your character was about to utter, it had you select one of the general themes that they would then address using their own (or, to be exact, the writer’s) words.
We quickly saw that the more choice you got, the more invested you were in your character. The Mass Effect system was especially interesting to us because it made it evident that the exact words your character used meant little, and what mattered most for immersion was making choices. So we decided that we would push this aspect even further in our own game: we would give the player the absolute freedom to say whatever they wanted.
The first element that we got rid of as unnecessary was the alien: we figured that a small student team would not have enough resources to pull off convincing survival horror anyway, and besides, having a direct and immediate threat of a literal monster in the house would have reduced the impact of the AI character. In the end, the AI can be your friend, but it can also be your enemy, and, since you can’t just leave, you are forced to deal with it. The spaceship you’re on is a haunted house, and Kaizen (the AI) is the one haunting it.
Puzzles went next. Originally, our space station was supposed to be massive and modular, and you were expected to manage resources like water and electricity and deal with threats like fire: it was heavily influenced by games like FTL, where resource management is critical. We decided that none of it was essential to the experience of talking to the AI, so, in the end, the ship is much smaller, and the only resource you have is your own oxygen supply, and even that only when you go out for a spacewalk.
The sci-fi setting was implied from the very beginning, but we did stop and think about it. Why sci-fi? Like everything else, it had to serve a purpose.
Chatbots have been attempted in games before, most notably in Facade. In Facade, you were a guest at your friends’ house while they were fighting. You could try to defuse the situation or make it worse by saying anything you wanted: you could type your own lines of dialog.
Facade was a fascinating and innovative concept, but where it fell short was the suspension of disbelief. You have a human in front of you, and if the human doesn’t understand what you are saying or reacts in a way that you wouldn’t expect a human to react, it takes you right out of the experience and ruins the magic of it all.
The nature of chatbots is such that they can’t possibly reply to everything you say correctly. Their knowledge base is inherently limited, and the word combinations they understand are predefined. The AI in Event, just like any other chatbot AI, will occasionally misunderstand your input. But the thing that we believe makes Kaizen more effective than the AI in Facade is the fact that you know it’s an AI, and your expectations of it are lower than if it was a human character.
When it screws up occasionally, you don’t perceive it as the game itself being broken, but rather as the AI in the game being somewhat glitchy. We reinforced this idea further by making the spaceship you are on old and somewhat dysfunctional. Kaizen’s screens glitch from time to time, and it says that some of its data banks are corrupted.
Now, as far as AIs go, Kaizen is pretty smart, and, even when it doesn’t understand something you say, it will just tell you that itself rather than saying something random and irrelevant. But the sci-fi setting and the abandoned spaceship help us address the rare case of a total misunderstanding that does occur from time to time.
All dialog in Event is procedural. This means that it’s virtually impossible to have the exact same conversation with Kaizen twice. Even if you say the same thing, there is a good chance that it will answer something different. To generate the responses, we use input parameters. There are four of them:
Player’s input. When you type something into a Kaizen-85 terminal, your input is analyzed for meaning. We have a dictionary of semantic tags each of which contains some words with the same meaning. When you say “glass,” “plate” or “fork,” the AI understands “tableware.” When you say “father” or “nephew,” the AI knows that you’re talking about “family.” When we find some of these tags in a player’s input, we can deduce the meaning of the whole sentence.
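To make the tag-matching idea concrete, here is a minimal sketch of how such a dictionary lookup could work. The tag names and synonym lists are illustrative assumptions, not Event's actual data:

```python
# A hand-written dictionary of semantic tags, each mapping to words
# with the same meaning. These entries are invented for illustration.
TAG_DICTIONARY = {
    "tableware": {"glass", "plate", "fork"},
    "family": {"father", "nephew", "mother"},
    "door": {"door", "hatch", "airlock"},
}

def tags_for(sentence: str) -> set:
    """Return the set of semantic tags found in the player's input."""
    words = sentence.lower().strip("?!.").split()
    return {tag for tag, synonyms in TAG_DICTIONARY.items()
            if any(word in synonyms for word in words)}

print(tags_for("Could you pass me a fork?"))  # {'tableware'}
```

From the set of tags found in a sentence, the system can then deduce the intent of the sentence as a whole.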
Kaizen’s emotional state. There are 9 possible states the AI can be in: it can love the player, hate the player, be stressed out, be angry, and so on. As you talk to it, its attitude toward you will shift, and it will start talking to you in a different way. A lot of gameplay in Event is based on these emotional states, and you need to find things to say to Kaizen that will make it feel better (or worse). It’s all about understanding what it wants.
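One simple way to picture such a system is an attitude value that conversation nudges up or down, mapped onto named states. The state names, thresholds and the one-dimensional model below are all assumptions for illustration; the game's real model is richer (it has 9 states):

```python
# Illustrative only: a 1D attitude scale mapped to a handful of states.
STATES = ["hate", "annoyed", "neutral", "content", "love"]

def state_for(attitude: float) -> str:
    """Map an attitude value in [-1, 1] to a named emotional state."""
    index = int((attitude + 1) / 2 * (len(STATES) - 1))
    return STATES[min(index, len(STATES) - 1)]

attitude = -1.0
print(state_for(attitude))   # hate
attitude += 2.0              # the player says all the right things
print(state_for(attitude))   # love
```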
Current event. The game is called Event because everything that happens aboard the Nautilus (the spaceship) is an event registered in the AI system, including things triggered by the player’s actions. The AI is aware of the conversation subject at hand through this event system. When you talk to it about the lobby of the Nautilus, it will have more vocabulary related to the lobby. If you change the subject, it will adjust its dictionary accordingly.
Short- and long-term memory. This last one is pretty straightforward. When you say “so, about that door,” the AI saves the semantic tag “door” in its short-term memory. If it’s not used, Kaizen will just forget it; however, if you then say “can you open it for me?” it will remember that “it” was a “door” and will obey the order. As for the long-term memory, it’s basically a number of important variables that are saved in Kaizen’s memory: things like your name, your gender and the particular way you have resolved a gameplay situation. Kaizen will remember these specific things and the exact words you used, and it will use them later on in the game. In some cases, the memory will even influence the ultimate outcome of the story.
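The short-term half of this can be sketched as a last-mentioned-subject buffer used for pronoun resolution. This is a toy reconstruction under my own assumptions, not the game's code:

```python
# A toy short-term memory: the most recent concrete subject wins,
# and pronouns resolve back to it.
class ShortTermMemory:
    def __init__(self):
        self.last_subject = None

    def observe(self, tags):
        """Remember the most recent subject tag mentioned."""
        for tag in tags:
            self.last_subject = tag

    def resolve(self, word):
        """Resolve a pronoun to the last remembered subject."""
        if word in ("it", "that") and self.last_subject:
            return self.last_subject
        return word

memory = ShortTermMemory()
memory.observe(["door"])     # "so, about that door"
print(memory.resolve("it"))  # door  -> "can you open it?" now makes sense
```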
Kaizen’s lines of dialog are not actually lines: we write bits and pieces of phrases, and we write each one of them several times, then combine the meta-bits together and randomize the output. As a result, even if the input, the emotional state of Kaizen, the current conversation, the short- and the long-term memory are all exactly the same for two players, the thing Kaizen will actually answer has a pretty good chance of being completely different. Just like a human is highly unlikely to say the same thing twice when asked the same question.
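The bits-and-pieces approach above can be sketched as a tiny combinatorial generator. The fragments here are invented placeholders; the real game's writing is far richer, and its selection is conditioned on all four input parameters rather than pure chance:

```python
import random

# Invented phrase fragments; each slot has several interchangeable
# variants, and the output is a random combination of them.
OPENERS = ["Of course.", "As you wish.", "Understood."]
ACTIONS = ["Opening the door", "Unsealing the hatch"]
CLOSERS = ["now.", "for you.", "right away."]

def respond(rng=random):
    """Combine one fragment from each slot into a full reply."""
    return f"{rng.choice(OPENERS)} {rng.choice(ACTIONS)} {rng.choice(CLOSERS)}"

print(respond())  # e.g. "Understood. Opening the door right away."
```

Even this toy version yields 3 × 2 × 3 = 18 distinct replies to the same input, which is why two players almost never hear exactly the same line.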
In some cases, the meaning will stay the same, but the form will change, and in some other cases, even the meaning will be somewhat different, leading to other consequences as the conversation continues.
To make this AI intelligent and funny, and to minimize the number of grammatical mistakes it makes, we had to start playtesting it really early, and we iterated a lot. It was impossible to tell how well the system worked until we actually put a player in front of the computer and let them interact with Kaizen. The current Kaizen system that I explained earlier is nothing like what we had three years ago.
One of the earliest tests we conducted was a Wizard of Oz test where we had no functional AI – just a computer terminal and a list of supposed rules that we were going to implement. We sat a player in front of the computer and told them that they were talking to Kaizen. In fact, they were talking to me: I was sitting in front of another computer connected to theirs via network. I had a list of rules and phrases that Kaizen was supposed to use, and I used them myself while talking to the player. As a result of this test, we completely changed the rules and saved a lot of programming time.
Event is a game about empathy, about finding the human side of a machine, and the design decisions we made during the development of the game informed a lot of the narrative choices that defined the personality of Kaizen – and vice versa. This AI was created by humans, so finding a human side in it shouldn’t be all that difficult.
Generally speaking, musicals are not my jam, but over the past two weeks or so I have watched every single YouTube clip of Hamilton and have listened to each and every interview of Hamilton’s cast obsessively. The situation is out of control. So much so that I think the time has come to address it in a post.
Obviously (and it should be obvious from the fact that I am poor and live in France, by the way), I haven’t seen the show on Broadway. I’ve managed to piece parts of it together from various YouTube clips and commentary. I’ve also listened to the (brilliant!) cast album of the show. Several hundred times. Yeah. I can’t sing for shit, but if I could, I’d be able to sing My Shot from memory.
I am not in any way qualified to talk about the artistic value of the musical (having avoided the whole genre for 25 years), but I’ll try to give you the gist of why this cast album (mostly the cast album) makes me happy.
I may not understand musicals, but I love hip-hop. Songs in Hamilton reference things like Mobb Deep, Busta Rhymes and Notorious BIG. While talking about the American Revolution.
Every song is exciting. There’s not a dull moment, and it keeps surprising me. Not everything in Hamilton is hip-hop. For example, the King George solo is just perfect, and the rhymes are totally flawless. It’s this song where the British king is going to “send a fully armed battalion to remind you of my love.” Awesome.
Apart from references and the total mastery of the beat (you keep waiting for it to break, and it doesn’t, and it doesn’t until it does, at the very best moment possible), there are hidden gems in each song. For example, Lafayette rapping in a faux French accent. Perfect.
Finally, I personally find the subject matter fascinating. I’ve recently finished reading a book on American history, and the American Revolution is one of those half-mythical events that just stick with you, and you keep wanting to know more about those characters. Everything is just so dramatic and old-fashioned, but also so real and almost modern. Especially the personalities. Hamilton is a particularly interesting Founding Father: the guy was an immigrant who opposed slavery from day one and wanted a stronger central government. His personal story is incredibly compelling, and the musical brings it to life, romanticizing it without making it vulgar.
In general, if you liked Baz Luhrmann’s 2013 Great Gatsby movie, you will probably love Hamilton as much as I did. It is modern and relevant in a way that very little entertainment is.
I complain on Twitter. A lot. That’s just what I do. I can’t help it. More often than not I rant about software that I’m using because using software constitutes about 60% of my life, 30% being sleeping. It so happens that I’m a game developer, so I mostly complain about game engines. Namely, Unreal Engine 4 and Unity 3D.
A couple of days ago, I was working on a whitebox of a level for Event, and I discovered that some necessary functions for level-building are just missing from the default editor of Unity. I spent several hours working around an interface that just refused to do what I needed it to do. It was 4 o’clock in the morning. I was pretty frustrated, so I did what I usually do in these cases: I went on Twitter, and I complained about it. I didn’t complain to anyone specifically; it was more of a “fuck this shit” kind of Twitter rant:
Yes, it took me two years of working in Unreal to realize just how bad the editor is in Unity.
At the time, I didn’t think much of it, but today Sander came to me with a full script, ultimately resolving some of the problems that I had by adding the missing functionality to the editor. He also appears to have gone to other people at Unity and asked about these things. With some luck, the other issues I had will be resolved in upcoming releases of the engine.
@krides all the other issues you had are already being worked on and will show up in unity eventually
Frankly, I’m not used to this level of customer support from any tool developer, let alone from a company that made a game engine used by literally millions of developers. Consider my hat tipped, Unity Technologies, this is the kind of work that will encourage me to recommend the engine to more people I know and to continue using it myself despite the ever-growing competition. Don’t expect me to stop complaining though. Never.
P.S. Just in case you are curious about what exactly my problem was, here’s the Editor script that Sander made for me: https://goo.gl/AxG6Kc
CMD+E to find source asset (mesh or prefab) in Project
CMD+H to hide or show selection
CMD+F1 for Top-Down view
CMD+F2 for Left-Right view
CMD+F3 for Front-Back view
CMD+` to switch between Perspective and Orthogonal views
This article assumes that you’re already familiar with UE4’s UMGs and its elements such as Multi Line Editable Text Box. If not, please consult the official documentation. Unreal Engine version used: 4.8 Preview 2 for Mac OS.
I have been fiddling around with Unreal Engine 4’s UMGs (widgets) a lot lately. At one point, I wanted to make a multi-line text box that would scroll up automatically as you type. First I thought that it would just handle this sort of thing out of the box, but it turned out that there was no auto-scroll option available. Then I thought that it wasn’t a problem – after all, how hard could it be to count the lines and then just move the whole text up?
Well, it turns out that it is not that straightforward if you are using the standard Auto Wrap Text option. See, in C++, Auto Wrap Text presumably adds “\n” at the end of each line, as you would expect. But you have no way of knowing that from Blueprint, because, as far as Blueprint is concerned, “\n” does not exist. Even though there’s a line break character available to Blueprint users now, it is not “\n.” Therefore, you won’t be able to find it in your Text or String (side note: to make a line break, press Shift + Enter in any text field).
Neither of these will find anything:
So yeah, I had to circumvent the native word wrap system and implement a greedy algorithm to be able to auto-scroll. Luckily for me (and not necessarily for you), I’ve got a monospaced font in my TextBox, which means that I don’t care about each character’s individual length. If your font is not monospaced, you should probably look for a different solution (likely involving some C++). The image below is the algorithm pseudocode shamelessly stolen from Wikipedia:
First of all, here’s the Pastebin of my word wrap function. Just copy all the text from here and paste it into the blueprint editor inside a newly created function: http://pastebin.com/BuxVnnH5.
Don’t forget to add two inputs: an InputLine string and a LineWidth int (my default value is 47); two outputs: a WrappedString and a NewLinesCount; and four local variables: SpaceLeftInLine int, LineCount int, FinalString string and Words array of strings.
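For readers who just want the algorithm rather than the Blueprint, here is the same greedy word wrap sketched in Python. The variable names mirror the Blueprint function's inputs, outputs and locals described above; this is my transcription, so treat it as a sketch rather than a line-for-line port:

```python
def word_wrap(input_line: str, line_width: int = 47):
    """Greedy word wrap: pack words onto a line until the next one
    no longer fits, then start a new line. Assumes a monospaced font,
    just like the Blueprint version."""
    words = input_line.split(" ")
    space_left = line_width
    final_string = ""
    line_count = 1
    for word in words:
        if len(word) + 1 > space_left:
            # The word doesn't fit: break the line and start fresh.
            final_string += "\n" + word
            space_left = line_width - len(word)
            line_count += 1
        else:
            final_string += (" " if final_string else "") + word
            space_left -= len(word) + 1
    return final_string, line_count

wrapped, lines = word_wrap("the quick brown fox jumps over the lazy dog", 10)
print(wrapped)  # five lines, each at most 10 characters wide
```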
Now that we’ve got a properly wrapped string, we need to make the auto-scroll. If you don’t need the auto-scroll, then, of course, you can stop reading right here.
Since the auto-scroll has to handle the whole string no matter how many times you modify it, I feed it the NewLinesCount value that I get from the word wrap function. Here’s the Pastebin for the vertical scroll function: http://pastebin.com/Z4WqhHYG. It’s also a function, this time with only one input: a NewLines integer. Add two global variables: LinesLeft int, initially set to the total number of lines allowed in your text box, and a reference to the text box itself, since I’m using it inside the function. Finally, create one local variable: LinesSeparated, an array of strings.
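The bookkeeping behind the scroll can be sketched like this: keep a budget of lines that still fit, and once new text exceeds it, drop lines from the top so the newest text stays visible. This is my own reconstruction of the idea under those assumptions, not the Blueprint's exact logic:

```python
def scroll(lines_left: int, new_lines: int, text: str):
    """Drop overflowed lines from the top of the text box.
    lines_left: how many lines still fit; new_lines: lines just added."""
    lines_separated = text.split("\n")
    overflow = max(0, new_lines - lines_left)
    # Keep only the lines that fit, counted from the bottom.
    visible = "\n".join(lines_separated[overflow:])
    return visible, max(0, lines_left - new_lines)

# A 3-line box receives 5 lines: the top 2 scroll out of view.
visible, left = scroll(3, 5, "a\nb\nc\nd\ne")
print(visible)  # c, d, e on separate lines
```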
Connect the Wrap function to the Vertical Scroll function – and voilà!
Now, I know that copying and pasting functions in Unreal is not the best experience in the world, so I’m starting an open-source project on GitHub called OpenBlueprint. In OpenBlueprint, anyone and everyone can share useful global Unreal functions and macros. The first function in there will be my word wrap function. Here’s the GitHub page where you can clone or download it: https://github.com/krides/OpenBlueprint
As I am writing these words, I am seven months into my first full-time video game industry contract. Hello, my name is Sergey, and I am a game designer working on Rime for Playstation 4 at the Madrid-based studio Tequila Works.
Unlike many other articles of this sort, the goal of this one is not to convince you that the road that led me to where I am is the one that everyone should take. I don’t believe that there’s any rule of thumb for reaching your life goals, and I highly discourage you, dear reader, from making the same life choices that I did.
This post is above all something that a 15-year-old me would want to read. It is a complete and true account of someone’s journey toward a goal that seemed so distant and dubious, yet was truly his own. Perhaps this story will give some of you a little bit of confidence that all of us lack from time to time. Maybe it will help you understand the difficulties that you might face in the increasingly near (trust me on this) future. Or, at the very least, it could mentally prepare you for these difficulties.
If you are an aspiring game developer, rest assured, there are many of us who know exactly how frustrating your path can be at times. I understand how easy it is to lose your resolve. I’ve been there, and I want you to know that you are not alone and that there are people out there who understand how difficult it is. I also know that hearing this from someone who is already in the industry is not the same as hearing it from your peers, your friends or your mom. At least it would be different for the 15-year-old Sergey who wanted to become a game designer.
This account is completely personal and in a way, selfish: it is a continuous, solid line under the first chapter of my life, which ended on September 30, 2014. Hopefully, it will help me remember where I started and why I got myself into this horrible, yet wonderful mess that is the video game industry. Besides, it’s raining in Madrid, so I have nothing better to do tonight.
This story, not unlike many others, begins with a 10-year-old kid who decides to make a video game without having any idea of how to do it. Before that kid started his journey toward one of the most closed and surprisingly conservative, even entrenched, industries in the world, he had a very particular kind of childhood. I think you might relate, so let me set the scene for you:
We are in Kiev, Ukraine, the city where I was born and spent most of my life (which explains my appalling English, by the way). It is May 28, 1997, and I’ve just gotten my first computer as a birthday present. I am six years old, and this is one of the happiest moments of my life. My father installs Doom from a pirated “500 best games” (or was it 1000?) CD, and we play it together (he walks and aims, I shoot monsters by pressing Control). We also finish Wolfenstein 3D, Hexen, Heretic, Quake and Quake 2, Duke Nukem and many other FPS games in the same way. A year or so later, his computer engineer friend (uncle Zhenya) comes over to do some upgrades to our PC and installs C&C. I am amazed. I spend the next few years playing Warcraft and Warcraft 2, Tiberian Sun, Red Alert and Red Alert 2, Dune 2 and Dune 2000, Commandos, Heroes of Might and Magic II and III. I also play racing games, RPG games, flying sims and many, many other classical and modern PC games.
Later, my whole class at school plays first baseball cards, then pogs, then Pokemon trading cards and eventually Magic: The Gathering. All the while we play Half-Life, Diablo 2 and Prince of Persia at home and skip classes playing Counter-Strike and Warcraft 3. You get the idea.
Since my childhood was all about games, it was only natural for me, in the autumn of 2001, to start plotting the creation of one. I sketched a couple of characters with various abilities in an old notebook that I found somewhere in a forgotten drawer; then I enlisted a couple of my classmates, including a pretty good artist, to help me. Unfortunately, apart from her, no one on our little team had any useful game development skills (us being a bunch of 10-year-olds). I was the only one with enough motivation to start asking around about the tools used for game creation.
I kept asking and asking, and eventually I started to get answers. At some point, I asked a guy who was selling pirated software. Side note: Dear reader, please don’t be shocked, I am not a horrible person; all the games and software sold in Ukraine were illegal at the time. The guy was knowledgeable enough to point me to 3D Studio Max and Maya 6, and I was curious enough to demand how to make things move. For that, the guy said, one needed programming. This little conversation marks the moment when my quest for tangible game development skills began.
I got my parents to buy me a book on Maya and another one on Borland Delphi (suggested by the salesman who said that it was a good language and tool for beginner programmers). I was excited to start learning something that would finally allow me to make my first video game, just like Etherlords or Cossacks (I was increasingly aware of an emerging domestic game industry). Unfortunately, this didn’t quite work out as I lacked some basic knowledge about things like coordinates and coefficients. For some reason, the books’ authors assumed that their readers were technical enough to understand the basic principles of physics and mathematics. I did manage to create a program that calculated distances based on speed and time, which made me proud of myself, even though most of it was a direct copy-paste from a tutorial. I ended up making a pen-and-paper RPG that was suspiciously similar to Diablo and Dungeons & Dragons.
Fast forward four years: I’m about to graduate from middle school, I like writing, I hate mathematics and I like physics even though I don’t get terribly good grades. I manage to win prizes at writing and physics competitions for middle-schoolers somehow. I have also started my first informatics class where I think I am the king. I have discovered the Internet way before my classmates and spend most of my time posting song quotes on my personal blog. Another thing I do on the Internet is hanging out on a forum of my favorite local weekly magazine about computer games called My Gaming Computer. It is through this forum that I discover that writing reviews of video games is very similar to writing book reports, so I try writing one and, of course, fail miserably.
It takes me two more attempts and a co-author to finally publish my first article. It is a two-page preview of a dinosaur RTS game called Paraworld. I will never forget seeing my name and obnoxious teenager nickname next to an article in a printed magazine for the first time. For the 14-year-old Sergey, this is a big deal. I write and publish more articles, some with co-authors, others on my own. I don’t know how many there were exactly, but I know that the magazine didn’t pay the majority of its freelancers and owes me money to this day. Naturally, I jump at the first new job opportunity I see, and that job opportunity is PC Gamer Ukraine. My Battlefield 2142 review helps me land the job. Now I’m a staff writer and translator (almost half of the articles in the Ukrainian PC Gamer were translations of pieces from PC Gamer UK). With my first paycheck, I buy an iPod Classic. I am 16 years old.
While working at PC Gamer, I get to witness the 2008 indie revolution first hand. World of Goo makes a lasting impression on me; Braid becomes a massive hit, Castle Crashers comes along, and I decide that these are exactly the kinds of games that I want to make myself. At this time, I’m also forced to make choices about the rest of my life. The end of high school is approaching at lightning speed, and it is then that I remember my childhood fantasy of making video games instead of describing them to other people.
I start listening to game design round tables from regional game development conferences, I read articles on local game dev websites. Someone recommends Gamasutra to me. I pick game design as my discipline, because I figure that I’m not technical enough to be a programmer, but not artistic enough to be an artist. It is from the round tables that I understand what being a game designer means, and how you are supposed to be both technical and artistic. I read more articles and books.
As my high school graduation approaches and exams are just a year away, I announce to everyone that I’m going to apply to KPI (Kiev Polytechnic Institute), the post-soviet MIT. My parents are shocked, but also relieved. They’d just seen me go through a phase of asking for an electric guitar for my birthday. By comparison, going from a literary and humanities high school to a technical university seems like a great idea. I take intensive math classes and get into KPI. My major is Computer Science and Industrial Automation for chemical engineering.
At some point, I attend the Casual Connect conference in Kiev as a journalist and try to find a summer internship at one of the companies that are present there. It is terrifying. I talk to a bitter 50-something game producer who does everything he can to discourage me from trying to get into the game industry. He tells me that my writing is good and that I should just stick with what I know, that production is messy and is a bad idea for most people, that he regrets ever setting foot in the industry. After three days of hearing people talk about player acquisition, retention, ARPPU, DAU, etc., this almost makes me cry. Nevertheless, I keep insisting that I want a job in the industry and that an internship at his company would be a great opportunity for me to learn the ways of production. In the end, he gives me a test task and a business card. I have to write an initial treatment document for an original game concept by the end of the week.
I do write the document within the deadline, send it over with a cover letter, and then get a phone call from the producer. He says that my concept is great, that it is clear and well-written, that he wished his designers would produce documents like this. Then he says that he has to admit that he never had any intention of hiring me in the first place and didn’t think that I would take the task seriously. I say thank you and hang up the phone. A couple of months later I go to college.
Luckily, I already know how to code, and I’m inexplicably good at physics, so I manage to get through most of the bachelor’s program without having to invest too much time. I prefer to spend it studying game development things, including C# and C++. The only problems I encounter are with the chemical engineering side of things. I fail Chemistry 101 twice before finally passing it, and the same goes for Materials Science 101.
I pass both in the end, and I graduate. Now that I look at my thesis about automation of petrol production, I can’t help but wonder how much money I would be making if I had any intention at all to follow a career in engineering. But that’s beside the point because I spent all four years of the engineering school studying first game design and then…Unity 3D.
This next part is a little messy because of all the things that were happening at once, but bear with me please, because it eventually leads us to where I am now.
Unity came along when I was halfway through my bachelor’s program. I started doing tutorials and enlisted a friend to help me with my first game, Missy & Mandelbrot, spiritually inspired by World of Goo. It never got finished, but it helped both of us learn a lot about Unity and game development in general. The current version of Unity at the time was 2.5, and the community was small but also very friendly. This is where I made my first contacts in the game industry. Some of them became my friends.
In the winter of 2011 I was getting slightly desperate with M&M and needed a break. It seemed like the game had a lot of design flaws, and I had no idea how to solve them. Besides, my friend, who was helping me with it, could not invest as much time into it anymore.
Then I heard about a competition for Unity games with Flash version and decided to participate, which is how Dédale happened. Although I didn’t win anything then, I got a lot of encouraging feedback from people on Unity forums, and everybody kept saying that I should make it into a mobile game. I did exactly that. I released Dédale in the summer of 2012 for iOS, Mac, and PC. My first finished commercial game. It seemed like my whole life up to that point was leading up to that release.
At some point in the summer of 2011, I decided to study French. My English, I figured, was bearable, and it was time to move on to a new language. I counted studios in different countries on Gamedevmap, and it turned out that French was the second most popular language in the game industry. I signed up for a class right away. The reason I mention this here will become clear a little further on.
While everyone was making fun of Google+, we probably had the biggest community of independent game developers around on that social network. Discussions were lengthy and interesting. You could ask for help, and help would always come. At some point, I was bitching about game design education programs (to this day there’s no such thing in Ukraine), and someone showed me a French graduate school that was free for international students. I just needed to pass the entry test. I decided to apply the following year.
So yes, in 2012 I was working on my bachelor’s thesis while studying French, preparing for my entry exam at a French grad school and working on my first commercial game. Work-life balance, you say? Never heard of it. In fact, that same year my parents got divorced, and an old friend of mine died in an accident. I will probably never forget that year because it would also be the year when I moved to a foreign country.
Since I didn’t know whether I would get into the grad school, I was also browsing game design jobs in the local game industry. Google+ came to the rescue once again. Someone from a major international studio with a Kiev office commented on one of my posts, I contacted them and eventually landed an interview. I did three interviews and got the job.
More or less at the same time I also got my results from the French grad school: I got in.
I had a few days to decide: start my career as a game designer the following week or postpone it for two years and spend those years getting better at what I did while working on personal projects. In the spirit of the indie movement that I found myself involved in, I chose the latter. Three months later, I found myself in Angoulême, France, and one of the few people in that master’s program who’d already worked on and released a commercial video game.
During my French séjour, I made a couple of games that got some press attention (although nothing major) and helped me learn a thing or two about the inner workings of the industry. Paradis Perdus, Lune, Pineapple Dreams, 74:78:68, Event and Spotlight all got featured on multiple video game websites and even in some magazines. Dédale helped me realize something: it turned out that making a game wasn’t enough; you also had to let people know about it. Otherwise, you would have a game that nobody played, and there’s nothing more frustrating in the whole world.
More to the point though, the two year game design program at ENJMIN (that’s the name of the school) let me do just what I was hoping to do: whatever I wanted. I could dedicate as much time as I wanted to making anything I wanted, and I still believe that this is the only valid approach to learning game development. Just start making stuff, and skills will come. If you finish something, you can add it to your portfolio. I kept at it for two years.
Most French universities require you to do a professional internship every year, and ENJMIN is no exception: a three-month internship after the first year, and a six-month internship after the second year of studies. If you don’t do the internship, you don’t graduate, it’s that simple. My first-year internship was at Unit 9, a UK media production company that happened to have an app and game development department. Working on a mobile runner game was fun, but pretty far from my vision of my future career in the industry.
In winter 2013-2014, there was a conference at ENJMIN, and one industry professional came to our school as they often did. He told me that he knew people who work on Rime, and that caught my attention right away. I sent him an email; he forwarded my resume to Tequila Works. I got a reply; we had a couple of phone calls, and eventually they gave me the internship. Six months later, after my internship was over, I got a real contract, which is where I am now. Game Designer is officially my profession, as in I’m getting paid for contributing to a video game.
Can you take exactly the same path and land a job in the industry? Probably not. Chances are, you come from a different background and live in a different country. On top of that, the industry itself is very different today. Way back when, people kept telling me that the only way to get a game design gig was through game testing, but that wasn’t true for me, and once I was actually refused a job as a tester because I was overqualified for it. Some people said that making mods is the way to go, but that is clearly not the most obvious path now that Unity and Unreal Engine 4 are out in the wild.
To be honest, I have no idea how you, dear reader, will break into the industry. The industry is weird and small, and doesn’t like newcomers. It likes people with experience, and to get that experience you need to have experience. It has always been this way. It will probably still be true ten years from now.
The only piece of advice I can give you that I personally believe in is this: don’t give up. Keep trying. Keep doing stuff and keep showing it to people. Keep asking for feedback and keep improving your skills. Keep applying for jobs and internships and keep teaming up with other people (un)like you to boost your collective game dev-fu. You’ll get there eventually.
Oh and one more thing: don’t listen to bitter and cynical people who sound like they have it all figured out. Nobody does, so it’s okay if you don’t either.
This short story is a prequel to the game that I’m currently working on, Event. With Emmanuel Corno, who is the other game designer on the team, we figured that it made sense for the Story section of the design document to take the form of fiction instead of your typical factual description, so today I wrote Europa-11 in an attempt to give some background to our world and characters. In a way, this short story is an experiment, as it is the first piece of fiction writing I’ve ever done for a video game that won’t appear in the video game itself, so I guess you could call Event transmedia now (haha, no fucking way). Does not contain spoilers for the game. May contain typos. Probably does. Yeah.
The world has changed: it is good now, united, peaceful. When the whole planet became a single political and economic entity, it was hard in the beginning, and restructuring took long years of strenuous labor on the part of our citizens and the Party. Finally we are able to put all of that behind us and start seeing and enjoying some tangible results of our work. The year two thousand and eleven has shown that we, the human race, are finally prepared to move forward. It is with your help, citizens, that the Party and the United Industries were able to accomplish some amazing things that will contribute to the well-being of human womyn and men today and for generations to come.
Thanks to your trust and support the Eden program colony was reopened on Mars. We finally implemented the space ed reform in schools and colleges all over the world, and our physicists synthesized a new promising model of the Big Bang, which brought us one step closer to understanding of the world’s genesis. Thanks to a recent breakthrough in ecological research and with the help of the hard-working womyn and men of the Sustainable Manufacturing Supervision Committee, we reduced the carbon footprint of the space sector by an astounding twenty-six percent.
The most important breakthroughs, as usual, came from the private sector. The ITS Corporation, founded by Ms. Anele Johnson, our distinguished citizen and pioneer civil space explorer, in collaboration with SpaceX, launched a new scientific mission to Europa with fifty-two specialists in various fields on board. If extraterrestrial life exists in the Solar system, we are going to find it. If not, this mission will significantly improve our methodology and ultimately strengthen our expertise in detecting life forms on unexplored planetary bodies.
Two thousand and eleven was truly the first year of the rest of the United Earth’s history. We all have a goal now, each and every one of us feels like they share a common purpose. Everything the United Earth does is directed toward space exploration, and our economy has reached its historic high. Our predecessors built death machines, looked for more effective ways to destroy each other and ultimately our home world. All of my energy and that of the Party have been directed toward broadening our horizons, and you can see the results of our efforts today. It is a truly exciting time in our history, and the way our culture is shaping up continues to—
Greg flipped a switch, and the screen of a giant retro television set went black with a high-pitched analogue squeak.
“Hey, I was watching that!” Stacy didn’t sound amused. Greg knew that she watched each State of the Union address religiously. First, the United States, now, the United Earth. It didn’t matter, they had to be at the HQ in twenty minutes, and they had a pretty good idea of what President Shijuyama was going to say anyhow.
“Look, we are late, and if you wanna talk to Josh, we better get going,” he said.
“This is going to be the first State of the Union that I miss.”
“You can watch it on UStream on the way to the HQ.”
They didn’t have time to argue, and Greg knew that Stacy, despite her childish attitude, was capable of understanding that.
“I’ll watch it on YouTube tonight,” Stacy groaned and slowly lifted her massive body from the couch. Judging by her shiny forehead, that took quite a bit of effort.
“I’ve got the remote.”
Greg was the first to leave the house and reach the driveway. He stopped on the sidewalk, flipped the car remote in his left hand and pressed the Call button on it.
“Agh, it’s fucking chilly,” said Stacy, catching up. Then she stopped, drew a breath and added, wistfully: “I wonder how Josh is doing out there.”
That day’s Google Car was a green Chevrolet sedan with a cheerful ITS ad on both sides. The car looked new, but Greg could already smell a mixture of piss, coffee and baby formula when he opened the door. It was amazing how similar the suburbs were to the city in that regard.
Greg got into the front left passenger’s seat, while Stacy occupied both of the back ones, assuming a half-lying position and pulling a tablet computer from the ceiling.
“International Transport Spacelines Headquarters,” said Greg, holding the Destination button on the remote control. The Google Car audio system made an electronic feedback sound that vaguely resembled a water drop, halted for a few seconds, processing the command, then asked in an unnaturally polite male voice:
“Please confirm destination: International Transport Spacelines Corporation HQ, Palo Alto, California.”
“Yes,” replied Greg, putting the remote control back into his pocket. The car made another annoying sound and started to move.
As the Google Car pulled up by the front entrance of the ITS building, Greg realized that the radio had been off all along. Not only that, but Stacy had not attempted to watch what was left of Shijuyama’s State of the Union address on the car’s tablet computer that she had grabbed at the beginning of their journey. It was either that or she had been watching it without sound, which made even less sense when you started to think about it.
When the car came to a complete halt, and the on-board sound system played its ever-annoying upbeat tune that was meant to announce the end of the trip, Greg turned toward the back seats and leaned forward in an attempt to get a better look at Stacy’s face.
She was lying on the seat, wide awake, concentrated on the car’s plastic ceiling. The tablet was back in its original slot. Greg didn’t say anything, sitting still in a taxi driver position, trying to read Stacy’s mind—a skill that he never quite learned during the eight years they’d worked together. After a while, Stacy looked at him and said:
“I am still thinking about Josh. How long has he been there?”
“I don’t know…a few months, maybe?”
“And before that, the training camp? Who are those people he’s with?”
“A few more months, and, well, half of them are ITS and SpaceX staff, the other half—”
“The question was rhetorical. I know who those fucking people are, I helped pick them myself—”
“—I just don’t understand how he could volunteer. How could it be only his decision? I know I encouraged him to follow his ambition, but—”
She didn’t finish her thought, but Greg knew what she was going to say.
“I have been getting pretty worked up about it. It’s just hard, you know? My name is still Johnson, and you know what happened to my mom when she—”
She didn’t finish her thought, but Greg knew what she was going to say.
“And now Josh goes the fuck knows where. Fucking Jupiter, of all places, seriously?”
Greg wanted to calm Stacy down, but there was no time left, and he knew that the signal from Europa was stable enough to have an actual conversation only for very limited periods of time, and only God knew when the next opportunity to talk to Josh would present itself. On the other hand, he couldn’t have Stacy scream and shout in the control room like she did the previous time. Partner at ITS or not, she still had to follow the security protocol in there, and it was Greg’s job to make sure that she did.
“Look, Stacy, we’re at the HQ, how about we—” he began.
“Do you realize how fucked up this whole situation is? Just tell me this, and we can go.”
Stacy didn’t sound hysterical, which was a relief. If he’d been in her place, Greg would probably have been beside himself. The situation was fucked up, alright.
Stacy’s mother, the infamous Anele Johnson, founder of the ITS Corporation, disappeared along with her partner and the first prototype of a Nautilus-type tourist space yacht. Some cruiser spacecraft captains claimed that they’d seen or detected the presence of the vessel in Alpha Centauri and at the far reaches of the Solar system. In some of these stories the ship sent out distress signals, in others all of its communications appeared to be dead. What all of them had in common was that they were invariably creepy as fuck and always ended with the ship disappearing without a trace. This made Stacy’s mother a ghost lady in addition to being an important historic figure.
“I understand, of course I do,” said Greg, “But who’s to say that Josh will disappear? I’m pretty sure that it would be completely impossible to make that guy just vanish. No ma’am, not with that ego.”
Stacy didn’t smile, but her eyes stopped being cloudy, and her gaze was fixed on Greg now, which was a good sign. He continued:
“Now, c’mon, let’s go inside. Promise that you won’t make a scene. It’s not a good time for that. You can be as angry at Josh as you want when he gets back. Right now he really needs our support.”
Greg helped Stacy get out of the car and, when they both were on the sidewalk, he pressed the Leave button on the remote, sending the vehicle on its way. It was kind of amazing how fast everyone got used to all things AI-controlled. At first the general public was suspicious of the new technology, just as they were suspicious of augmented reality glasses before, and of cellphones before that, but ten years barely went by before the remaining human-controlled cars, trains, boats and airplanes were voted out of existence by a Party committee as unsafe and obsolete. Then, after years of extensive testing and experiments, ITS launched the first computer-operated spaceship, making artificial intelligence a de-facto standard in the transportation industry. From that point on it was only a question of time before the public perception of the new technology caught up with the demand for it. This made Greg think of the new model of the AI module on board Europa-11, Josh’s ship. He shook his head and followed Stacy into the building.
The ITS lobby had a clean futuristic look that you would expect from a lobby of a major tech company. True, ITS was as much about tourism as it was about building spaceships, and the giant aquarium with live squid on one of the walls attested to that, but the overall Kubrick-esque vibe of the place helped sell tickets just as much. People who came there got the first taste of the space travel experience they were expecting. Comfort out of their comfort zone, as the corporate slogan went.
Apart from the front desk clerks and the customer service specialists, none of the ITS staff worked in that part of the building. It was unusual for someone from the engineering or the senior staff to enter through the front door, but Greg figured that it would take longer to give directions to the Google Car than to walk from there to the control room.
The rest of the building was all function and none of the form: white suspended ceilings, gray concrete walls and gray tiled floors. Visitors weren’t allowed to go beyond the front desk, and people who worked at ITS typically didn’t care about fancy lamps and ergonomic furniture. There were occasional colored poofs and bamboo trees in halls and corridors of the closed part of the building, but the overall atmosphere was very no nonsense.
Stacy and Greg were passing the security check downstairs when an alarm went off in the control room. This could mean a lot of things, and Greg wasn’t one to jump to conclusions, but they really had to hurry up. Besides, they were already five minutes late, and, as the ITS engineering team’s saying went, spacetime was nobody’s bitch.
“There you are, come on, we’ve got a situation,” said Ellen, chief engineer of the Europa-11 program, stepping out of the control room. She looked even more disheveled and preoccupied than usual.
“What is—” started Stacy, but didn’t finish, because they had stepped into the control room, where everything immediately became crystal clear.
The room was in absolute chaos: people were typing on their computers frantically, some were running from one workstation to another, giving and receiving orders, a few dozen phone conversations were happening at the same time. A giant computer screen on the wall showed a wireframe image of the Europa-11 spacecraft on a black background with all of the vital data regarding its systems and crew floating around. The ship didn’t look good: most of it was red, with little blue dots representing the crew members. Stacy gasped and let out a muffled cry of terror when she saw that there were only eleven of those left. The life support system was in critical condition, as reported by a giant red warning message near the bottom of the screen. All but one of the Kliper escape pods were disconnected and drifting away from the ship. None of them had any crew members inside.
“Fuck me,” said Greg once the initial shock passed, “What happened? Where’s the crew?”
“We aren’t sure yet,” said Ellen, “It looks like sabotage. Most of the crew are dead or missing. We were able to establish contact with Josh who was in the systems bay when this happened, but the connection is unstable.”
“Is he okay?” asked Stacy.
“He is, but he won’t be if he doesn’t get to the remaining Kliper. We contacted all of the spacecraft in the vicinity, but it is unclear whether they will be able to get there before the life support system goes offline.”
“Which systems are still nominal?” asked Greg.
“We were able to restore door control and heating with Josh’s help, but the AI of the ship is blocking any further intervention, invoking the Code Red security protocol,” said Ellen.
“Shit. I knew this thing required more testing,” said Greg and then bit his tongue, realizing that it was he who signed off on the security status of Europa-11 before its departure.
Before anyone could say anything else, Josh’s userpic appeared in the top right corner of the screen, and the room went completely silent within a couple of seconds. Then the audio stream loaded, and Greg started hearing his friend’s voice. The connection quality was so poor that the words were barely intelligible:
“…I…this is not…left…the real threat…Klipers…transmission…is jamming…”
While these words were coming out of the speakers, all but one of the blue dots disappeared from the plan of the ship, and the life support system warning now signaled that it was no longer operational. Josh appeared online when that happened, although the blue dot in the systems bay had already disappeared. Then a loud gasp came out of the speakers, followed by radio silence.
The room was completely still, no one moved, you could hear a pin drop. Everyone was looking at the screen, waiting for something to happen. After all, there was only so much they could do from the control center.
Greg was hypnotized by the remaining blue dot, making its way from the bridge to the remaining Kliper. It wasn’t Josh.
Josh was gone.
The realization didn’t come at once, but when it did, Greg threw a quick glance at Stacy. She was standing next to the nearest computer terminal with a headset in her hand, looking at the screen blankly. She didn’t have time to say a word into the microphone before the connection broke. She was shivering.
Over the next five minutes, the dot moved, really slowly, on the ship’s wireframe plan. The moment it reached the Kliper, the screen went black for a split second, then Europa-11 reappeared, this time without any indication of the Kliper or the remaining crew member. The wireframe of the ship was green again, as if nothing had happened. The on-board AI was broadcasting a distress signal in all directions.
Ellen broke the silence by saying in a dry, emotionless voice:
“The reactor. It’s gonna blow.”
Everyone in the room knew that those were the last ten seconds of the Europa-11 program. Someone cut the alarm off.
Stacy started to cry softly as she sat down on the floor, trembling, unable to sustain the pressure, giving in to gravity. Ellen, Greg and a few others followed her example.
Game design is often perceived as an idea-driven discipline in game development, and many people tend to fall into the trap of thinking about it in terms of pure creativity, neglecting the technical aspect. While having ideas is nice and useful, it is important to remember that everyone has them, and that an idea per se does not have any value. What counts is the implementation, and while writing a concept document or pitching a game may seem like an exciting task, it constitutes a relatively small fraction of the actual work a game designer has to do on any given project.
I am not going to go into more detail about ideas, concepts, team brainstorming and creativity, as this is a broad subject that deserves a separate post. What I would like to talk about here is the technical aspect of game design, various production techniques that can be applied in different situations and help the designer solve complex problems with less effort and more precision. The real game design work begins after the team (even if it is a team of one) has agreed on the concept, and yet I feel like we are barely scratching the surface of production techniques.
If you are anything like me, you are in constant search for game design cookbooks, opinion pieces and postmortems, and are probably familiar with the brilliant work of Jesse Schell, Brenda Romero (Brathwaite), Ian Schreiber, Daniel Cook and others. I am not going to try and rethink game design as such in a short blog article. In fact, everything that I am going to say is compatible with and relies upon these groundbreaking works from before. What I would like to do here is share some of the thoughts on the subject of methodology that I accumulated over the last few years, compiling them in a unified theory that I like to call Levels of Abstraction (LOA).
Perhaps the best analogy for LOA is a YouTube video that my teammates and I used to watch daily while working on Paradis Perdus. It’s a half-mocking that-part-of-the-internet kind of thing which demonstrates the “cuil theory” through text pieces that become more and more abstract as the number of “cuils” (or levels of abstraction from reality) increases.
In simple terms, LOA is a framework that allows the game designer to apply their analytical skills in the most useful way possible. Abstraction is important to understand and balance the game system correctly, and different situations demand different levels of it. The game designer chooses the LOA which is most suitable for them at any given moment, thus ensuring the effectiveness of their work. For example, sometimes the best way to describe a mechanic is drawing a simple sketch and talking to the concerned team members, while in other cases a spreadsheet, a text memo, a flowchart or even programming code will be the best medium.
The suitable LOA for any given situation can vary based on the team, budget, production timeline, development phase, game type and dozens of other factors. Each LOA is a tool that lets the game designer formalize and communicate a game system better, thus giving them more control over it. All in all, I am going to discuss five LOA that I have most often encountered myself while working on various solo and collaborative projects. It is also important to note that multiple LOA can (and should) be combined within the same project and often even the same mechanic.
Discussion is highly encouraged, and, if you have something to contribute to the list or have something to say about one of the LOA, don’t hesitate to drop me a line or leave a comment. This post is reference material, so I am fully prepared to update and improve it if need be.
Level 1: Empirical
This is the level of abstraction that manifests itself as an explanation in plain words. It is probably the most common way of explaining a game mechanic, often used in introductory sections of game design documents, on websites and in communication between team members.
The character cannot do anything if she does not infect a host, even move. When one of the enemies is close enough to the character, the player can press the action button, which will make the character infect the enemy and give the player some new powers based on the things the enemy can do.
Often this kind of explanation is accompanied by a simple illustration that shows how the game mechanic works, making the description clearer. These images are rarely very detailed and usually look like simple mock-ups of the game scene with some additional graphics for explanation.
LOA1 does not include any formulas or values and can be applied in situations where something has to be written down, usually because the game designer wants to remember something and come back to it later or to explain a mechanic in simple terms when no technical information is needed.
If LOA1 is used as the only way of explaining a game mechanic, it is the programmers who define how the mechanic actually works, finding the correct formulas and dummy values that the game designer comes back to later. This method of work often demands reiteration, but can be a good idea in the case of small teams (including teams of one) with team members wearing different hats and having significant control over multiple aspects of the game. In a medium to large team, however, this may become a problem, due to the time it takes to reiterate.
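To make that hand-off concrete, here is a minimal sketch of what a programmer's first pass at the infection mechanic described above might look like, with dummy values for the designer to revisit later. All names and numbers here (the classes, `INFECT_RANGE`, the power list) are my own invented illustrations, not code from any real project.

```python
# Hypothetical first-pass implementation of the LOA1 infection mechanic.
# The dummy value INFECT_RANGE is a programmer's guess, to be balanced
# by the designer later.
INFECT_RANGE = 1.5  # placeholder value

class Enemy:
    def __init__(self, powers):
        self.powers = powers  # abilities the character gains when infecting

class Character:
    def __init__(self):
        self.host = None  # the character can do nothing without a host

    def can_act(self):
        return self.host is not None

    def try_infect(self, enemy, distance):
        """Infect an enemy if the action button is pressed in range."""
        if distance <= INFECT_RANGE:
            self.host = enemy
            return enemy.powers  # player gains the host's abilities
        return []

hero = Character()
guard = Enemy(powers=["move", "open_doors"])
gained = hero.try_infect(guard, distance=1.0)
```

The point is not the code itself, but that every concrete decision in it (the range check, the exact powers granted) was made by the programmer, and the designer will iterate on those choices afterwards.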
Level 2: Schematic
This level of abstraction goes further and represents a game mechanic or a gameplay element in the form of a graph or table. Thus, the level of abstraction is higher, and the gameplay is described more formally. It is common to see the second LOA in combination with the first one as a way of formalizing the system. Some mechanics and game situations are also better represented using the second LOA and don’t need LOA1 at all.
This is a common way of representing level design on a local (mission) or global (world) level without doing any 3D modeling. It is also often used for balancing skills and actions using numeric values without changing the underlying formula.
It is generally a good idea to use LOA2 if you want to show and/or compare large arrays of data that would take too long to explain or write down empirically. Like LOA1, LOA2 does not allow for direct control of game mechanics using designer-made formulas. However, it may use the existing formulas for data representation and calculations.
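As a sketch of what LOA2 looks like in practice: the balancing data might live in a spreadsheet and be consumed as plain values, with no designer-authored formula behind it. The skill names and numbers below are invented purely for illustration.

```python
# Hypothetical LOA2 balancing table: raw values, no custom formulas.
# In production this would typically be a spreadsheet exported to CSV.
skills = {
    # name:     (damage, cooldown_s, mana_cost)
    "slash":    (12,     1.0,        0),
    "fireball": (30,     4.0,        25),
    "heal":     (-20,    6.0,        30),
}

# The designer can compare large arrays of data at a glance, using an
# existing formula only for representation (here, damage per second):
dps = {name: dmg / cd for name, (dmg, cd, _) in skills.items()}
```

This is exactly the "use existing formulas for data representation" case: the DPS calculation exists only to present the table, not to define the mechanic.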
Level 3: Maths
This is similar to LOA2 in that numeric values are used, but here the designer also creates their own formulas to represent game mechanics, making the level of abstraction even higher than before, which gives them a greater level of control and a better perspective.
This level of abstraction is something that demands extensive testing before being implemented: either on paper or using a digital prototype of the mechanic that allows the designer to modify the formula and different variables on the fly. The advantage of this method is that the game designer can achieve exactly the result they want by describing the mechanic in this way, allowing for more fine tuning and precise balancing. The downside, however, is that it may take some time before the programmer can actually start coding the mechanic, since the prototype must be tested before the implementation begins.
In general, LOA3 is good when maths matters: all kinds of logic puzzles, role-playing elements, and in general games that can be easily prototyped on paper without significant changes in their gameplay. It is probably overkill for things like jumping, movement and everything else involving velocities, accelerations and other conventional physics variables.
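A throwaway LOA3 prototype can be as simple as the designer's formula plus a loop that prints the curve, so the pacing can be eyeballed and the knobs tweaked before anything reaches the engine. The XP progression formula below is a made-up example, not one from any of my projects.

```python
# Hypothetical LOA3 prototype: a designer-authored XP curve,
# tweakable on the fly before any engine code is written.
BASE_XP = 100   # XP required for the first level
GROWTH = 1.4    # tuning knob: how steep the progression feels

def xp_for_level(level):
    """XP required to reach a given level (designer's own formula)."""
    return round(BASE_XP * GROWTH ** (level - 1))

# Print the curve so the designer can sanity-check the pacing:
for lvl in range(1, 6):
    print(lvl, xp_for_level(lvl))
```

Changing `GROWTH` and rerunning takes seconds, which is the whole value of prototyping at this level: precise balancing before a programmer touches the mechanic.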
Level 4: Algorithmic
The fourth LOA takes into account the ones that come before it. The game designer formalizes the game mechanic by writing it down in the form of a flowchart, pseudocode or simply a sequence of actions with precise instructions.
It is hard to use this LOA for balancing things because of the nature of the data representation; however, it is good for checking whether the logic of the game mechanic works and making sure that things happen in the correct order. If this is not checked early in a complex game system, things can go very wrong later, and it will cost much more to repair them when they do.
LOA4 can point to specific values and formulas or not—this is entirely up to the designer. One should take into account the reason for using LOA4. If the goal is to test the whole mechanic on paper exactly the way it is going to work once implemented, then it is generally a good idea to use the real formulas and variables. In this case, automated finite state machine tools such as Playmaker for Unity may help. On the other hand, if LOA4 is used to determine whether the logic behind a specific mechanic works and to mitigate the risk of a wrong order of actions, a simple flowchart or an algorithm sequence can be used without getting too deep into the specifics of each state.
In my experience, both small and big projects can benefit from LOA4, but the designer should use it with caution, because it generally takes a long time to formally write down an algorithm for a complex system.
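An LOA4 write-up of, say, a locked-door interaction could be a flowchart, or equally a pseudocode-style sequence like the sketch below, where the point is only to verify that things happen in the right order: no formulas, no tuned values. The state names are invented for illustration.

```python
# Hypothetical LOA4 algorithm for a locked-door interaction,
# expressed as an explicit ordered sequence of states. The goal is
# to check the logic and the order of actions, not to balance values.
def open_door(player_has_key, door_locked):
    steps = ["approach_door"]
    if door_locked:
        if not player_has_key:
            steps.append("show_locked_message")
            return steps  # interaction ends here
        steps.append("unlock_door")
    steps.append("play_open_animation")
    steps.append("door_open")
    return steps
```

Walking every branch by hand (key/no key, locked/unlocked) is exactly the early logic check the paragraph above argues for: cheap now, expensive to retrofit later.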
Level 5: Programming Code
It is often a good idea to write a mechanic (or a part of a mechanic) down in code directly, bypassing LOA4, because producing formalized algorithm structures for complex systems may take a significant amount of time, not necessarily justified by the usefulness of the final result. This is usually the case when the algorithm in question includes a lot of small and/or simple actions which could easily be represented in code, taking up less space and demanding less effort and debugging. The image below shows that sometimes one line of code can replace a whole bunch of finite state machine actions.
It is generally a good idea to write down either very specific or very isolated mechanics this way, unless the game designer codes the game on their own, in which case writing everything in code is the only option anyway. Otherwise, the utility of this approach should be considered in terms of the formalization of the mechanic and its explanation to the people who are going to implement it, namely the programmers. That is to say, you should consider it if you think that it is the easiest way to write a mechanic down.
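To illustrate the kind of compression this level offers: clamping a health value after damage would take a comparison state, a branch and a couple of assignment actions in a visual FSM tool, but collapses to a single expression in code. This is a generic sketch of my own, not code from Event.

```python
# Hypothetical LOA5 example: what would be several finite state
# machine actions (compare, branch, assign, transition) in a visual
# tool is one line of code here.
def apply_damage(health, damage, max_health=100):
    # Clamp the result to the valid [0, max_health] range in one expression.
    return max(0, min(max_health, health - damage))
```

For a mechanic this small, writing the line directly is faster to author, easier to read and trivially testable, which is exactly when bypassing LOA4 pays off.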
Short Fiction was short and it was, believe it or not, fictive. It lived under the city, swimming freely in underground canals that carried people’s waste mixed with rainwater, surviving thanks to a thin membrane it was wrapped in. Some alleged that the membrane was pure willpower held together with an ounce of sodium bicarbonate, while others claimed that it was partially bubble-gum. The truth was, as usual, somewhere in between.
Sometimes an odd shoe or a small coin would fall through the cracks of human existence, and then Short Fiction would rush to the splash site, forgetting all of its daily errands, plunging headlong into the unknown. Contrary to popular belief, baby crocodiles rarely made their way into the sewers, so the reign of curiosity over the sense of danger was overwhelming.
On one particularly rainy and cold day, Short Fiction found a boy in the sewers and ate him. The boy was thirty-six years old, had a gorgeous beard and was entangled in tumblr, tweeting recklessly, producing all kinds of weird, indescribable noises and spitting out pieces of wood, hashtags and small blue feathers. As Short Fiction found out from his driver’s license, the boy was a Steve. Not a John or a Mike, not even a Keith, but a real, living and breathing Steve.
Not that the name was a problem, but Short Fiction found itself dissatisfied with the outcome of eating Steve. You see, every now and then it had to digest things that it had swallowed, including memories, facial hair and experiences of little boys and girls it found in the sewers. None of them were completely pleasant, nor particularly repulsive, and Short Fiction found itself producing bland and discoloured copies of itself, unable to sustain the vibrant micro world of the city sewers. It was stagnating, becoming greyer and lonelier every day.
Short Fiction was going to make it right with Steve. Finding hard spots on his otherwise soft and elastic head, it carefully connected a bunch of electrodes to his brain and drained it of all of its ridiculous bullshit. When there was nothing left, Short Fiction looked down, trying to rest its eyes on the familiar moist pavement. It was then that it noticed something in the bullshit puddle on the ground. The object was small, shiny and had a fractal structure. It revolved around the center of the universe, biting off pieces of its entropy, creating true form in a seemingly irregular environment of virtual semantics.
Short Fiction put Steve down and picked the object up. Upon closer examination, the short threads sticking out of the object’s numerous edges turned out to be words. If you looked even closer, you could see logical connections between some of them. They formed clauses and sentences, referencing each other and dancing around the center of the branch they were tied to.
Short Fiction thought for a second and ate the object.
All of a sudden, nothing was there anymore: Steve was gone, the sewers were gone, and the city became little more than an odd memory. Everything was clear, but hardly simple now: words were everywhere, and the virtuality rushed into the world, splashing and ravaging, making all of the words bigger and more numerous. Short Fiction started drinking from the stream, hoping that it would end, but it kept flowing, filling Short Fiction up, making it bigger and more spherical. This went on until its membrane burst, and it instantly became submerged in reality.
Then everything stopped. Nothing was revolving and making deafening noises. Everything was beautiful, but no longer really there. Millions of transistors sang their digital song, lulling Short Fiction to sleep.
IMPORTANT: It was brought to my attention that newer versions of Unity have broken my little framework. Be advised: it was intended for the 4.* versions of the engine. The new built-in GUI in Unity accomplishes the task nicely, and, if you’re using 4.6, 5.0 or a later version of the engine, you should go with the built-in tools, which are more efficient, reliable and versatile.
Unity GUI gets a lot of criticism, and there are reasons for that: it’s not visual at all, you have to code everything, most of the time you can’t use it on mobile due to high redraw frequency and excessive drawcall use, etc. I agree with all that. However, it has a number of important advantages as well: unlike most of the external libraries and plugins from the Asset Store, it is integrated into the engine really well, and it is completely free. You can’t say that about EZ GUI or NGUI, now can you? This last reason is probably the main thing that makes people use Unity GUI, and they are right to do so. Why pay $100+ when you can get the same result for free?
I have used Unity GUI on a variety of different projects and in a variety of situations. Over the years I have dug up a bunch of really useful tricks that help me code a basic interface in a matter of minutes. It was a couple of weeks ago that I decided to gather this stuff in one class that helps me overcome the main bottleneck of Unity GUI: its inherent lack of responsiveness. The term “responsive” comes from web design, and means that the interface adapts to the screen resolution. A modest example of responsive web design is my web site. Try resizing the window of your browser and see how everything scales and moves around. When you are making a cross-platform or PC game, you want it to be able to adapt to resolution as well, and to do that, the interface elements have to stick to their part of the screen and retain their size relative to the screen size.
My class is based on the idea behind Automagic GUI Scaling in Unity3D by Simon Wittber: you choose one resolution to be the “base” resolution, defined by the WIDTH and HEIGHT constants at the beginning of the script (usually the most popular resolution among your intended audience), and then resize the GUI elements using GUI.matrix. The difference is that with Simon’s approach you end up with “offsets” at the top and bottom of the screen whenever the actual screen ratio differs from the one specified by WIDTH and HEIGHT, which we don’t want. I solve this problem by applying the resize method to different regions of the screen separately, effectively masking the annoying offsets.
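The core of the trick can be sketched as follows. This is a simplified illustration, not the actual GUISizer source: it applies one non-uniform scale to the whole screen (which stretches elements when the aspect ratio differs), whereas GUISizer applies the scaling to screen regions separately.

```csharp
using UnityEngine;

// Simplified sketch of resolution-independent legacy GUI scaling.
// WIDTH and HEIGHT are the "base" resolution all layout code targets.
public class ScaledGUI : MonoBehaviour
{
    const float WIDTH = 960f;
    const float HEIGHT = 600f;

    void OnGUI()
    {
        // Scale everything drawn afterwards from the base resolution
        // to the actual screen resolution.
        float sx = Screen.width / WIDTH;
        float sy = Screen.height / HEIGHT;
        GUI.matrix = Matrix4x4.TRS(Vector3.zero, Quaternion.identity,
                                   new Vector3(sx, sy, 1f));

        // Layout code below can now assume a 960x600 screen.
        GUI.Button(new Rect(WIDTH / 2 - 50, HEIGHT / 2 - 20, 100, 40), "Play");
    }
}
```

Because `GUI.matrix` affects every subsequent draw call in `OnGUI`, all of your existing layout code keeps working unchanged once the matrix is set.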
Now that we’re done with theory and explanations, you can get the GUISizer class here: http://goo.gl/wgdFX6. You can also download an example project here: http://goo.gl/5F5PIY, or, if you prefer, the same thing as a UnityPackage that you can import into your own project: http://goo.gl/GKczDm. The source code is also included at the bottom of this post in case the links die. If you use C#, feel free to put GUISizer.cs anywhere you want in your project. However, if UnityScript is your cup of tea, then it has to go into your Plugins folder; otherwise you won’t be able to access its methods and structures.
With that out of the way, you can start poking around in the test project. To see how it works, just try resizing your Game view window. Below is the API with explanations, as well as the source code. Feel free to ask questions or suggest additions or changes in the comments or elsewhere. You can use GUISizer in any commercial or noncommercial project as you see fit. And of course, enjoy the fully resizable Unity GUI!
P.S. As a bonus track, in the example project you will find a script that lets you position a GUI element in world coordinates (like the text label that you can see sticking to the cube in the video above).
This struct lets you create a variable of type GUIParams: a button or a label, with specific parameters that you pass to the constructor.
Draws a specific label. Takes a GUIParams struct as a parameter. You can also provide a custom style and a custom font size as a parameter. Additionally, you can provide text that will be appended to the label, which is useful for things like score and lives.
float WIDTH = 960;
The “base” width of the screen. This is the width of the screen relative to which all GUI elements will be resized. Change it directly in GUISizer script.
float HEIGHT = 600;
The “base” height of the screen. This is the height of the screen relative to which all GUI elements will be resized. Change it directly in GUISizer script.
This post was written for one simple reason: the question comes up way too often, and I keep saying the same things and listening to the same counter-arguments over and over again. On top of that, the majority of these discussions take place in discussion-unfriendly spaces such as Twitter, IRC and, of course, real life. I just want to write all of my reasoning down, so that the next time it happens, instead of engaging in a pointless debate, I can simply link to this article.
I will begin by talking about my background in programming, which, hopefully, will give you some idea of the reason why I keep saying the things I say. If you (understandably) don’t care about that, you can just skip the next few paragraphs.
I started coding when I was 12. I had no Internet connection, no background in any related fields and no particular interest in science (except maybe a bit of physics). On the other hand, by then I’d designed my first pen-and-paper and live action games, and I just wanted to make video games. As is often the case, it all began with someone telling me that you have to write code to make video games (which was true at the time, but isn’t any more, with things like GameMaker, CraftStudio and PlayMaker for Unity greatly empowering non-programmers). So I went to Kiev’s largest book market on a quest for a programming tutorial book, and I came back with one on Borland Delphi 6, which a salesman had suggested as the perfect place to begin programming. I never managed to do much with Delphi: just a couple of Windows applications which helped me solve my middle school math problems.
A couple of years later, during my senior year in middle school, I began coding in Pascal (which is very similar to Delphi in syntax). It was then that I made my first simple video game: an interactive fiction piece based on one of my naïve teenage short stories, something that today I would definitely suggest making in Twine. Then came the university, with its C++ used for Windows applications (the Borland C++ IDE was horrible!), and later LISP, assembly languages, PHP and HTML. I tried going through C++ tutorial books and developing a game using DirectX, but I found it tedious and just kind of drifted away from the idea. Instead, I learned C# and made a couple of Windows applications with it. It was then that Unity came along.
A beginner in Unity is likely to have a hard time with the vast new framework that Unity presents, and, frankly, no one needs to deal with both that and things like delegates, constructors, lambdas, events and interfaces at the same time. The extensive OOP and verbosity of C#, combined with a huge new framework to explore, is capable of making people quit game development altogether (C++ memory management combined with DirectX almost did it for me at one point). I know that for some of you the concept of a “difficult programming language” is hard to grasp, especially the knights of shaders and C++, but I assure you, some languages are more difficult to use than others.
Not only is C# more verbose, it also has a way of making you think about things that you would otherwise never have to think about. Things like this:
transform.position.x = 5; //UnityScript code
transform.position = new Vector3(5, transform.position.y, transform.position.z); // C# code. Notice the difference?
yield WaitForSeconds(2); //UnityScript code
yield return new WaitForSeconds(2); //C# code.
By the way, the yield above is inside a coroutine, which in C# has to be started explicitly with StartCoroutine, unlike in UnityScript, which does it automatically.
This is a good thing for an experienced Unity programmer working on a medium to large project, but a disaster for a beginner trying to put together their first Pong clone. Beginners don’t need to think about optimizing for this kind of overhead unless there is an actual hiccup in the game’s FPS (spoiler alert: there won’t be, not one caused by this, anyhow). This stuff is simply too much information to take in when you’re starting out with a new game engine.
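Put together, the coroutine ceremony in C# looks like this (a minimal Unity sketch; the component and values are made up):

```csharp
using UnityEngine;
using System.Collections;

public class Blinker : MonoBehaviour
{
    void Start()
    {
        // In C# a coroutine must be started explicitly;
        // UnityScript would let you simply call Blink().
        StartCoroutine(Blink());
    }

    IEnumerator Blink()
    {
        while (true)
        {
            // Toggle visibility every two seconds.
            Renderer r = GetComponent<Renderer>();
            r.enabled = !r.enabled;
            yield return new WaitForSeconds(2f); // the C# form of the yield
        }
    }
}
```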
And if that doesn’t convince you, here is a tweet from someone who works (or worked) at Unity Technologies:
@Sirithang: “@krides that is incorrect. All languages in Unity are compiled to CIL and run by the same Mono VM/JIT”
To sum up: if you are a beginner in Unity, you might want to consider using UnityScript at first, provided you will not be working on projects that risk getting really big. If you are experienced in Unity and/or your projects are big, you might want to switch to C#, because it offers more possibilities in terms of class encapsulation and optimization, making large projects more manageable. Namely, people tend to leverage the power of C# events with great success.
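As an illustration of that last point, here is a minimal sketch of C# events (the class and names are made up): subscribers react to a change without the publisher knowing anything about them.

```csharp
using System;

// Hypothetical example of C# events: the Health class publishes
// a Damaged event, and any number of listeners can subscribe
// without Health ever referencing them.
public class Health
{
    // Fired whenever damage is taken, with the damage amount.
    public event Action<int> Damaged;

    int hp = 100;
    public int Current { get { return hp; } }

    public void TakeDamage(int amount)
    {
        hp -= amount;
        var handler = Damaged;              // copy to avoid a race with unsubscription
        if (handler != null) handler(amount);
    }
}
```

A UI script can then subscribe with `health.Damaged += UpdateHealthBar;` while the Health class stays completely unaware of the UI, which is exactly the kind of encapsulation that keeps large projects manageable.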