Log #5: Don’t cry because it’s over. Smile because it happened.

And so we come to the final entry in this dev log. I just read through all of my previous entries and was absolutely amazed at how much this game has changed and how far it's come during production.

From a narrative perspective, all of the dialogue has been written and most of it has been recorded by voice actors! It's really exciting hearing the dialogue "come to life". Having run a voice-over session during my co-op, I ran ours in a similar fashion. First, I had the script written in Google Sheets:

[Screenshot of the script in Google Sheets]
Spoiler alert! Here's some of our dialogue!

As you can see at the bottom, there is a tab for each section of the game (Level 0 investigation through Level 1 interrogation). There are also tabs with each character's dialogue copied and pasted for recording purposes, so the actors don't accidentally read the wrong line during the session. We found that the easiest setup was one laptop for the actor to read from and one for us in case we needed to make any changes. Since the script is in Sheets, it updates for them instantly and they can read the new line right away.

I’ve also written dialogue that will be specific to the Level Up build, which I’ll touch upon in the production section.

From a production point of view, communication is still the most important thing in development. Our team has struggled with this a bit, since aside from sprint weeks we haven't been used to working in a team this size in our other years. To fix this, I made a daily schedule for our team to follow, and it's worked out really well.

[Screenshot of the weekly schedule]
So many tasks!

This colourful weekly schedule has daily tasks for everyone. It's important because a lot of tasks depend on other people having finished their work first. If something hasn't been completed yet, the schedule gives everyone other things to work on and promotes communication between team members.

And finally, from a design perspective….

There have been SO MANY CHANGES, and all of them have really been for the better.

While we didn’t have a hard time teaching players the controls, we did have issues finding an intuitive control scheme. After a short meeting, we found a way to accommodate and  implement a new control scheme that is both intuitive and easy to teach.

OLD SCHEME:

  • Warping – menu button
  • Grabbing – grip buttons
  • Talking/Interacting – trigger
  • Open Journal – pressing the bottom half of the touchpad
  • Flip Journal Pages – pressing the left and right sides of the touchpad

There were many problems with this scheme aside from the fact that it wasn't intuitive. The tracking on the touchpad was always a problem: if the entire surface of the player's thumb wasn't on the touchpad, the button press wouldn't register. Since they had to click a specific area of the touchpad, most players would click the edge without putting their whole thumb on its face. The grip buttons were also hard for players to get used to; most people would press the trigger even after we showed them the grip buttons in real life.

NEW SCHEME:

  • Warping – grip buttons
  • Grabbing – trigger
  • Talking – menu button
  • Open Journal – press down on touchpad
  • Flip pages – swipe left or right on touchpad

With this new scheme, the player instinctively presses the trigger to pick things up, and we've relocated warping to the grip buttons, which are easier to click and release than to click and hold. The player can now press down anywhere on the touchpad to open the journal, and swiping left and right feels much better than pressing left and right.
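
If you're curious what the swipe detection boils down to, here's a minimal sketch of the press-anywhere and swipe-to-flip logic. It assumes a touchpad you can ask "was it clicked?", "is it being touched?", and "where is the thumb?"; the three Touchpad*() methods are placeholders for whatever VR input API you're using, so this is just the idea, not our actual code.

```csharp
using UnityEngine;

// Sketch of the new journal controls: click anywhere on the touchpad to toggle
// the journal, swipe left/right across it to flip pages. The three Touchpad*()
// methods are hypothetical placeholders for the real controller API.
public class JournalTouchpadInput : MonoBehaviour
{
    public float swipeThreshold = 0.4f; // thumb travel (touchpad units, -1..1) needed to count as a swipe

    private bool journalOpen;
    private int pageIndex;
    private bool touching;
    private Vector2 touchStart, lastTouch;

    void Update()
    {
        // Clicking the touchpad anywhere toggles the journal open or closed.
        if (TouchpadClickedThisFrame())
        {
            journalOpen = !journalOpen;
            Debug.Log(journalOpen ? "Journal opened" : "Journal closed");
        }

        // Track the thumb while it rests on the pad; when it lifts off,
        // compare start and end positions to see if the player swiped.
        if (TouchpadTouched())
        {
            Vector2 pos = TouchpadPosition();
            if (!touching) { touching = true; touchStart = pos; }
            lastTouch = pos;
        }
        else if (touching)
        {
            touching = false;
            if (!journalOpen) return;

            float deltaX = lastTouch.x - touchStart.x;
            if (deltaX > swipeThreshold)
            {
                pageIndex++;
                Debug.Log("Next page: " + pageIndex);
            }
            else if (deltaX < -swipeThreshold)
            {
                pageIndex = Mathf.Max(0, pageIndex - 1);
                Debug.Log("Previous page: " + pageIndex);
            }
        }
    }

    // Placeholder input reads: swap in the real controller API here.
    bool TouchpadClickedThisFrame() { return false; }
    bool TouchpadTouched() { return false; }
    Vector2 TouchpadPosition() { return Vector2.zero; }
}
```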

You can also see that we've colour-coded the buttons. In that same short meeting, we decided that texturing the controller with coloured buttons (with influence from Google Earth VR) would help us teach the player. The buttons have symbols too: the interaction button has a dialogue bubble, warping has footsteps, and the journal has…well…a journal.

There’s still some small fixes left to do, but aside from that, the rest of the year I’m making sure everything we want gets in the game, but also going into somewhat of a QA role, because as Ron Swanson said:

[Ron Swanson quote image]

…Back to work!

 

 

Log #4: From Functional to Awesome

Back from vacation and back to work. There’s no way of easing into it. We have so much to do and so little time.

[Image of our to-do list]

Above: The truth

From a narrative perspective, there's lots of dialogue left to write. Everything needed for Level 0 (the tutorial level) is written, but after the Alpha build we felt that more dialogue was needed to immerse the player in the world we've created. The player has no prior knowledge of any of the mafia families involved in the case, so I need to enrich the backstory for the player's experience. This is fine, since we only recorded placeholder voice-overs and haven't cast proper voice actors…yet.

We've done our first round of auditions for the characters in our game, will finish the second round tomorrow, and should be able to fully cast the characters by Friday. We figured that casting now and recording sometime in February, once all the dialogue is finalized, would be the best approach, as it gives us enough time to edit and implement it all before mid-February. The plan as of now is to have the game fully beatable by February 10th (week 6) and from then on work on polish and QA. The more time spent polishing the game, the better.

From a design perspective, this game still needs a lot of work. The tutorial, while functional, is not complete. What we've found through playtesting is that it's too non-linear. The player can use any of the abilities at any time, and that can inhibit the way they learn the controls. Sometimes they'll press a button by accident, teleport right beside a wall, and get disoriented. This is especially true since they're facing Sal Lucio, who is up against a wall. We fixed this issue by moving Sal's position, but we also want to implement something in the game that directs the player to warp to a specific area. This way the player will have to learn how to aim the warp as well as which button to use.
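
As a rough idea of how that "warp here" direction could work, here's a minimal Unity sketch: a trigger volume the player is asked to warp into, which fires an event when the player's rig enters it so the tutorial can advance to the next instruction. The "Player" tag and the event wiring are assumptions for illustration, not how our tutorial is actually built.

```csharp
using UnityEngine;
using UnityEngine.Events;

// Sketch of a tutorial warp target: an invisible trigger volume the player is
// asked to warp into. When the player's rig enters it, the next tutorial step
// can be triggered. Assumes the player rig carries a collider tagged "Player".
[RequireComponent(typeof(Collider))]
public class WarpTargetZone : MonoBehaviour
{
    public UnityEvent onPlayerWarpedHere; // hook the next tutorial step up in the Inspector

    void Reset()
    {
        // Make sure the volume is a trigger, not a solid collider.
        GetComponent<Collider>().isTrigger = true;
    }

    void OnTriggerEnter(Collider other)
    {
        if (!other.CompareTag("Player")) return;

        onPlayerWarpedHere.Invoke();
        gameObject.SetActive(false); // only needs to fire once
    }
}
```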

Aside from that, there are also UI fixes we need. The panels that come up to show the player which button to press are very large, and the dialogue boxes for subtitles are large as well. Both of them clip whenever the player moves, either into other objects or into each other. We have to brainstorm a solution for that, but since it's not the final UI, it's more of a polish thing we can worry about later.

[Dog meme]

Above: Players struggling with our UI or warping into walls

Level 1 is now in the editor! The disco club, Diz-Qo, is very bare-bones right now and needs lots of assets. As producer, I have to make sure everyone stays focused and starts pumping out the assets we need in order to populate the level. There are still some design issues we're going over with the club. Since most of the evidence right now is in the backstage area, including the dead body, the player really has no reason to go upstairs aside from the general "I want to explore and throw things in VR" reason. Something to think about as we move forward and bring this level to life.

[Screenshot of Level 1 in the editor]

Above: Level 1, waiting to be brought to life.

Log #3: The Alpha Build

Since the last update, we’ve put the entire tutorial level into Unity! Both phases! It’s been hectic and a lot of work, but we’ve managed to pull it off. We decided to use Articy and a Unity plugin for our dialogue system, which has helped us a lot.

For the investigation we’ve implemented:

  • Free warping
    • the player can warp around the scene and is aware of the play space. It works really well, but some people stand still rather than walk around. To solve this, we have to put items of interest in places that require the player to bend down, such as under a table
  • Dialogue system
    • this handles NPC conversations and inner monologues for the main character. The UI is not final yet, but the dialogue boxes we have are good enough for now
  • Notebook to keep track of evidence found
    • the player can open up the notebook, add evidence to it, remove evidence from it, and flip the pages left and right (see the sketch right after this list)
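
To give a sense of what the notebook amounts to under the hood, here's a minimal sketch of the evidence journal as a data structure: a list of evidence entries plus a current page index, with add, remove, and page-flip operations. The class and field names are made up for illustration; this isn't our actual implementation.

```csharp
using System.Collections.Generic;
using UnityEngine;

// Sketch of the evidence notebook: a list of evidence entries plus a current
// page index. Add/Remove cover picking up and discarding evidence;
// NextPage/PreviousPage back the page-flip input. Illustrative only.
public class EvidenceNotebook : MonoBehaviour
{
    [System.Serializable]
    public class EvidenceEntry
    {
        public string id;          // e.g. "backstage_napkin" (hypothetical)
        public string description; // text shown on the notebook page
    }

    public int entriesPerPage = 4;

    private readonly List<EvidenceEntry> entries = new List<EvidenceEntry>();
    private int currentPage;

    public void Add(EvidenceEntry entry)
    {
        if (!entries.Exists(e => e.id == entry.id)) // no duplicate evidence
            entries.Add(entry);
    }

    public void Remove(string id)
    {
        entries.RemoveAll(e => e.id == id);
        currentPage = Mathf.Min(currentPage, PageCount - 1);
    }

    public int PageCount
    {
        get { return Mathf.Max(1, Mathf.CeilToInt(entries.Count / (float)entriesPerPage)); }
    }

    public void NextPage()     { currentPage = Mathf.Min(currentPage + 1, PageCount - 1); }
    public void PreviousPage() { currentPage = Mathf.Max(currentPage - 1, 0); }

    // Entries visible on the page the notebook is currently open to.
    public List<EvidenceEntry> CurrentPageEntries()
    {
        int start = currentPage * entriesPerPage;
        int count = Mathf.Min(entriesPerPage, entries.Count - start);
        return count > 0 ? entries.GetRange(start, count) : new List<EvidenceEntry>();
    }
}
```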

For the interrogation we’ve implemented:

  • Dialogue trees
    • again, using Articy was very helpful for this
  • Presenting evidence
    • the player can present any evidence when prompted, adding lots of freedom to the interrogation

We had a control scheme that made sense to us for the investigation phase, just to get things working and functional. However, we found that the controls were not very user friendly. Errors with the dialogue system were frequent (even within the dev team), and having the trigger used for multiple things would sometimes make the player drop items they were holding or do other things they didn't want to do. Since then, we've remapped the controls. The rate of error is way down, but the biggest problem is that we're now using the grip buttons to grab items, since the trigger is used for other things. The trigger is more intuitive because players with prior console experience are familiar with triggers on controllers, and no other controller has grip buttons like the VIVE does. It's unfortunate, because the grip buttons are a really cool feature, but no one knows to use them. We are currently working on implementing a tutorial with dialogue, UI, and voice-overs to teach the player what the grip buttons are and how to use them. That's the biggest challenge. We hope to have a build ready to test by next week to see how players respond to the tutorial.

In order for this alpha build to be feature complete, we need a couple more things in the game. Aside from the tutorial, we also need to put all the dialogue into the game. This is easy to do, but time consuming. I have to do a pass of all the dialogue I've written so far and add much more; the interrogation currently in the game is very basic and more of a placeholder dialogue tree. We also need to work on the flow of the game. Right now, the two phases work, but there is nothing to tell the player when they're done with the investigation and when they can go to the interrogation. We plan on having a "crime desk" phase in between, prompting the player to rest if they need a break from VR and to review evidence and character profiles. This crime desk will be like the police station, with Jack Conway's office being accessible. We would put the tutorial diagrams (which also appear in the tutorial level) on a corkboard on the wall, making the images look like part of the scene.
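
Since the flow is really just a few phases in a fixed order, it can be modelled as a tiny state machine. Here's a hedged sketch of an Investigation → Crime Desk → Interrogation flow per case; the class and method names are illustrative rather than taken from our project.

```csharp
using UnityEngine;

// Sketch of the intended game flow: each case runs Investigation -> Crime Desk
// -> Interrogation, with the crime desk doubling as a rest/review stop.
// How each phase is actually loaded (scenes, dialogue, etc.) is out of scope.
public class CaseFlow : MonoBehaviour
{
    public enum Phase { Investigation, CrimeDesk, Interrogation, CaseClosed }

    public Phase CurrentPhase { get; private set; }

    void Awake()
    {
        CurrentPhase = Phase.Investigation;
    }

    // Called when the player has gathered enough evidence (or chooses to stop).
    public void FinishInvestigation()
    {
        if (CurrentPhase == Phase.Investigation)
            Enter(Phase.CrimeDesk);
    }

    // Called from the crime desk once the player is ready to confront the suspect.
    public void StartInterrogation()
    {
        if (CurrentPhase == Phase.CrimeDesk)
            Enter(Phase.Interrogation);
    }

    // Called when the interrogation resolves.
    public void FinishInterrogation()
    {
        if (CurrentPhase == Phase.Interrogation)
            Enter(Phase.CaseClosed);
    }

    private void Enter(Phase next)
    {
        CurrentPhase = next;
        Debug.Log("Entering phase: " + next);
        // Scene loading, UI prompts, and "take a break from VR" messaging
        // would hook in here.
    }
}
```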

We tested an earlier build of the game at Gamma Space TO and got some great feedback from a couple of game devs working in VR and the head of the studio, Henry Farber. He genuinely loved the game, thought the mechanics were all fleshed out really well, and loved the direction we were heading with the UI. He said combining diegetic and non-diegetic objects was clever and something he would love to see in a VR game, and he asked that we send new builds when we can, which is great since we'll want more feedback!

Originally, we thought we could have level 0 (the tutorial level) and level 1 in the game by the end of this semester, but we decided level 0 is more important because there is more work to do teaching the player what to do in the game.

There’s so much work left to do in the next two weeks. We’ll see how it goes!

Log #2: Investigating Investigations

So we're a quarter of the way through capstone, and since the last post we've been working on prototypes. As mentioned before, we have two phases to prototype: the INVESTIGATION and the INTERROGATION (words so similar that everyone on our team seems to be talking about one but saying the other out loud. It's not confusing at all…)

We've spent a lot of time prototyping the mechanics, especially for the investigations. We've INVESTIGATED the best ways to do teleportation from quadrant to quadrant (hence the title of this log post) and found two ways of doing it. As the lead designer, I've tried to spend a lot of time in the editor making sure that the teleportation feels nice and doesn't cause motion sickness. So far, so good. The only thing we have to worry about now is how it feels once the scenes are populated with evidence and red herrings. Designing around this teleportation mechanic can be limiting, but I'm confident we'll do a great job.

I believe we're in a good position right now: we have a lot of the mechanics working and assets in development, but we have to start prototyping the tutorial level to see how it flows. From there we'll find more problems to solve, which is a good thing. There's lots of pressure on us as a team, but we're in pretty good shape.

["This is fine" meme]
Above: An accurate depiction of our current state

 

One thing I've done as producer (that I may not have mentioned in the previous post) is make a bunch of spreadsheets to track assets, especially art assets and voice-over lines. These have been very useful for everyone to see how far along we are in our asset production pipeline. Here's a screenshot of one of the lists:

[Screenshot of the asset tracking spreadsheet]

I've colour-coded a lot of things and made a lot of lists, since people don't like typing the same thing over and over. Each artist gets a different colour when they type their name into a cell (e.g. Jostein types his name into a cell in the "Modeler" column and that cell turns green). This is helpful because they can scan for the colour rather than read the text, and it gives them a more accurate picture of their workload. I've done the same thing for the "Priority" column, so high-priority items turn red and low-priority ones turn green. There are also lists for each model's status in the pipeline, such as not started, in progress, and to be textured.

We’ve also made ourselves a Trello board which is basically our digital Kanban board, but we’ll also be using it for bug reports in the future. Below are pictures of the Trello board:

[Screenshot of the Trello board]
Here are our tasks/backlogs

We've split the backlogs so everyone has a discipline-specific one to look through instead of one long one that would require more scrolling. Each task has a description, and some have checklists for the stages of the task (e.g. all of the objects to model in a level).

[Screenshot of the Kanban columns]
Here are the columns for Kanban

We've also added a column for "prototype questions", with a card for each of the different things we plan on prototyping (movement, dialogue, interrogation system, sound, etc.).

So that covers production and design stuff…now writing! I've written dialogue for the tutorial level's investigation and interrogation phases. The investigation is much easier to write, because it's basically an intro, dialogue that plays when something is picked up, and an outro. Writing for the interrogation was more difficult because it's non-linear: the player has a choice of what to say. The tutorial interrogation was easier because we limit the player's choices (if they choose the one they're not supposed to, dialogue tells them they're choosing the wrong one and forces them to pick the right one), but Level 1 will be more difficult and will require more of my time. However, I found that using Twine simplified things, since I can write outcomes for each decision. I used Twine for the prototype of the tutorial interrogation as well:

[Screenshot of the Twine prototype]

Up next as far as writing goes is finalizing the dialogue for the tutorial level and finishing the level 1 dialogue for the investigation and interrogation phases.

Overall, we’re all being productive, and I still have faith that this game is going to be great! More faith in this team/game than I had in the Blue Jays’ playoff hopes. They sucked. Pitching was great though. Time to cheer on the HABS!

[Sad Blue Jays gif]

Deep Dive and Annotated Bibliography

Deep Dive Statement:

Many games that have been well received by players and critics alike are non-linear or branching games with fantastic character development. Whether it's the main character or NPCs, there are reasons players fall in love with the stories and arcs those characters provide. The purpose of this paper is to find those reasons, using various games and articles as examples, and to try to find the formula for success. This will help me define myself as a game designer with narrative design capability, since my end goal in the games industry is to be a designer who also writes for the games I design. Some of the games I may reference are the Mass Effect series, Deus Ex: Human Revolution, Dishonored, Fallout 4, Dragon Age: Origins, and The Witcher series.

 

Annotated Bibliography:

Freed, A. M. (2014, March 12). Designing Stories for Nonlinear Game Segments (or “Civic Planning for Side Quest City”). Retrieved October 13, 2016, from http://www.alexanderfreed.com/2014/03/12/designing-stories-for-nonlinear-game-segments-or-civic-planning-for-side-quest-city/

This article describes designing stories for non-linear games. It has different sections that explain what a game story needs to have from a design perspective. It’s very useful as it has a lot of techniques and suggestions on how and when to add character arcs and development into the plot of the game, such as side quests.

 

Freed, A. M. (2013, March 21). Developing Meaningful Player Character Arcs in Branching Narrative. Retrieved October 13, 2016, from http://www.gamasutra.com/view/feature/188950/developing_meaningful_player_.php

This article is about developing character arcs in branching narrative. It is very helpful because it describes the fundamentals of character arcs and how to use them well in a branching story. The author of this article (and the one above) has previously worked for BioWare, and has lots of experience in this.

Log #1: And so, Detective VR begins…

I'm so excited to be working on this game for capstone! This post is going to be more about production and design; future posts will be more about narrative and design, since my roles on this game are producer, game designer, and writer.

Backstory:

I've always loved the detective/murder-mystery genre, and since first year I've been thinking about possible games I could make for capstone. I knew I wanted to do something in either this genre or the superhero genre, since that's popular right now and will continue to be popular for many reasons. However, during my co-op at Minority Media Inc. in Montreal, I worked on a VR game called Time Machine VR, and while there I thought that investigating a crime scene in first person on the VIVE would be really cool and that lots of people would like that kind of game. Thus, the idea for Detective VR (working title) was born!

[Image of Gene Wilder]

Above: Me when I thought of this idea, also Gene Wilder in Young Frankenstein (R.I.P.)

Production/Design:

Since the idea was mine and people seem to like me for some reason (I'm still not sure why), the team kind of accepted me as the producer. I don't think we even had a formal discussion or vote or anything. Since I'm also the lead designer, I've been doing most of the documentation for the game, with help from other members when needed. Joe is kind of an assistant producer, since he's mostly been the one helping me write it.

I think the charter and the game mechanics feel documents really helped with both the production and design of this game. Originally, the plan was to have a full game: one tutorial plus five cases (or "episodes") to solve. This was because we wanted an overarching plot with a twist midway through the last case that would shock the player so much that they'd be like this guy:

[Shocked reaction gif]

We then cut it down to the tutorial plus three episodes, eliminating some filler so that the overarching plot stayed intact but with less character development. From a coding standpoint, this made sense because the player would be using the same mechanics in each case, just in a different environment. However, the game would have less polish on the art side. After a very long and drawn-out 15-minute conversation, we decided it would be best to cut the game down to the tutorial and one episode. This way the artists have lots of time to work on assets and to populate the environment more, so the player feels like part of the world we're creating. From a narrative perspective, it gets rid of the overarching plot, but I can develop the characters more through added dialogue in the tutorial and the first episode. If we're all really happy with the end product and want to continue developing this into a full game after capstone, we'll add a "to be continued…" or something at the end of the first episode. This also makes more sense for showcasing the game, since no one would have time to play the full game if the plot pushed it to close to an hour of gameplay.

Designing this game has been really interesting so far. Aside from my work on Time Machine, none of us has experience developing in VR, so we're all pretty new to this. I started at Minority when they were almost two years into development, so even I haven't done any VR design from the ground up. What I've always liked about the detective genre is that there seem to be two phases: investigation and interrogation. This is true in video games like the Ace Attorney series and L.A. Noire and in many TV shows and movies like CSI and The Usual Suspects. Well, The Usual Suspects had more interrogation than investigation, but I digress…

We've decided to have investigation and interrogation phases for each case in our game. The big question for us to answer before the greenlight presentation was "Why VR?". The investigation already made sense in VR: the player would walk around and interact with objects at a 1:1 ratio with real life and feel immersed. The interrogation, however, was at first designed as just a dialogue tree where you choose from three responses: passive, aggressive, and silence. Choosing dialogue from a tree doesn't use the VR technology to its full potential, so we decided that gestures made with the motion controllers will be used to select dialogue. As of right now, the passive choice is picking up a coffee mug and the silence choice is writing down fake notes on a notepad. We wanted the aggressive choice to be slamming your hand on the table, but since there would be no table in real life, players would feel disconnected as their in-game hand hit the table and their real hand didn't hit anything. Ipso facto, that's the answer to the "Why VR?" question: motion controls.
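
To make the prop-based responses a bit more concrete, here's a rough sketch of how they could be wired up: while the suspect is waiting for an answer, each grabbable response prop (the coffee mug, the notepad, and so on) reports which response it maps to when the player uses it. Everything here is illustrative and assumes some interaction system calls OnUsedByPlayer() on the prop; it's not our actual interrogation code.

```csharp
using UnityEngine;

// Sketch of gesture/prop-driven dialogue choices: instead of pointing at a menu,
// the player answers by interacting with a prop (picking up the coffee mug is
// the passive response, scribbling on the notepad is silence, etc.).
public class InterrogationResponses : MonoBehaviour
{
    public enum Response { Passive, Aggressive, Silence }

    private bool promptActive;

    // Called when the suspect finishes a line and is waiting for the player.
    public void BeginPrompt()
    {
        promptActive = true;
        Debug.Log("Waiting for the player to answer with a prop...");
    }

    // Called by a response prop when the player uses it.
    public void OnPropUsed(Response response)
    {
        if (!promptActive) return; // props do nothing outside a prompt
        promptActive = false;
        Debug.Log("Player chose the " + response + " response");
        // The dialogue tree would branch on 'response' here.
    }
}

// Attach to a grabbable prop and point it at the InterrogationResponses component.
// Hook OnUsedByPlayer() up to whatever "grabbed"/"used" event the interaction
// system fires for this object.
public class ResponseProp : MonoBehaviour
{
    public InterrogationResponses interrogation;
    public InterrogationResponses.Response response;

    public void OnUsedByPlayer()
    {
        interrogation.OnPropUsed(response);
    }
}
```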

Moving forward, we're going to have to test, test again, and test some more to see what feels right for the player and what works for the game. There's a lot of work to be done, but I'm really excited!