Art, Virtual Reality, Resources

VR Painting 101: How Do I Become a VR Artist?

This is the question I get asked the most by fellow artists. I know being a VR painter can sound scary because it involves a PC, hardware you're not familiar with, and hella wires. But it's not that hard! This post is for artists who are not as tech savvy but want to learn more about painting and creating in VR!

I'll walk you through the things you need to know:
  • the terminology so you can understand what people are saying,
  • the tech involved,
  • the creativity apps for VR, and
  • my favorite VR artists you must follow!

To answer your question: you just jump in and get painting! 


How Did You Get Started? 

I was exploring VR in early 2016, going to tech conventions and conferences, trying out every demo I could. I spoke extensively with every person I met who worked in the VR industry and asked for their advice. I didn't know of anyone else painting in VR professionally at that time (the Vive and Tilt Brush didn't officially release until Apr 2016). I knew I was made for this field. I dished out the money to build a new computer and got my hands on an HTC Vive! From there, I spent hours and hours in Tilt Brush, painting my days away. The rest is history! 

In short, I had the painting + comfort in tech to get started at the time that I did. I consider myself incredibly lucky with the timing of opportunities. 


VR Terminology, Part 1

  • Virtual Reality 
    This is when you are completely immersed in VR.
    Think: 0% opacity of the real world. 
    High-end headsets: Oculus Rift, HTC Vive, PlayStation VR  

Job Simulator, an example of a virtual reality experience. One of my personal favorites! Notice how you don't see any of the real world. You are fully immersed.

  • Augmented Reality (sometimes called "Mixed Reality")
    This is when some aspect of the real world is visible (like Pokemon Go!).
    The AR content may or may not be integrated with the real world.
    Popular AR devices: Microsoft Hololens, Meta Glasses, your smart phones/tablets!

    NOTE: This is sometimes called 'Mixed Reality'. The nature of emerging fields is that the terminology isn't set in stone yet so things will be in flux in the coming years. 

An example of augmented reality, Holodog, a weekend Hackathon project I created with two other designers/developers. Left is my view using the Hololens — notice the digital dog is overlaid on top of the real life world (and doesn't exist IRL). Only I can see him in this scenario, using the Hololens. (Read more about our hackathon experience here.)

  • Errr..... Mixed Reality
    Combination of both real world + VR/AR
    Often used for film shoots or demo videos — videos that showcase content or the use of immersive media. 

    NOTE: This is sometimes used synonymously with 'Augmented Reality', even though they mean different things. Again, the terminology of emerging fields isn't set in stone yet, so things will be in flux in the coming years. 

This is an example of what a "Mixed Reality" shoot looks like. On the LEFT: This is what I look like during the shoot. I'm in a green screen room, I have a VR headset on. There is a 3rd-person camera recording me. On the RIGHT: there are apps that take the 3rd-person camera content + my VR content and outputs this view of me & my VR creation. (These photos are from the Google Research blog in the 'Headset Removal' experiment.)

  • "XR" or "xR"
    This is often used to categorize all the "R"s — virtual, augmented, mixed, etc. Some people are using this as an umbrella term for all the immersive techs.

Equipment You'll Need

  • Headset ("HMD," Head-Mounted Display)
    For the creative apps, you'll have to get either the Oculus Rift or HTC Vive. You can do a search online to see what the differences between the Rift and Vive are.

    I have both and switch back and forth. I like the ease of tracking on the Vive. I like the ergonomics of the Rift. Neither is perfect, as early tech goes. 
     
  • A VR-ready PC Computer
    The computer has to have a top-notch video card. This is imperative! The best one on the market right now is the GTX 1080 Ti. I tend to invest in the best computer parts available at the moment so the machine lasts as long as possible. You can always upgrade parts as you go, too. Your video card must be able to render at 90 frames per second. If it doesn't, you will get lag while in VR and you will likely get nauseous!

    TIP: You can find already-built computers labeled "VR Ready." Do a search on the video card it comes with and make sure it's all good.  
      
  • Physical space in your studio or room to do some VR!

VR Terminology, Part 2

Tech Key Words

  • Frames per second
    I mentioned this in the computer notes above. This is how many images your video card renders each second — for VR, it needs to hit 90 fps.
     
  • 3dof vs 6dof ("Degrees of Freedom")
    This pertains to how much of your movement gets tracked. In 360 videos, you can look up, tilt, and look around you — this is 3dof and you are limited to standing in one place for this.

    In 6dof, you can do those things PLUS you can move forward/backwards, left/right, diagonally in space. 
     
  • Haptics
    This is when your sense of touch is engaged — think vibrations or force feedback in your controllers!

Content Vocabulary

  • Agency
    How much control the user has in the experience

     
  • Presence
    The feeling of "this is sooo real!!!", like you are actually there in that VR world
     
  • "Experiences"
    A lot of people call VR experiences 'games'. This is not always the case. It's often a hybrid of an interactive game, a film, or something else. I've found that "experience" can be a broad, encompassing word.

Creative VR Apps

There are a few out there, but I'm mostly going to mention the four popular ones:

Google Apps

  • Tilt Brush — This is the one you probably saw Glen Keane using! It's a lot of people's first VR experience, and it's the one I often use to introduce newcomers to VR, too. It's the most intuitive, quickest way to prototype / ideate / play and doodle in VR! 

  • Blocks — This is used to create low-poly 3D models within VR. With a few primitive shapes to build from, you can modify vertices, extrude faces, scale things huge or tiny. This is a superb app to use to quickly create assets for VR because it is low-poly.
    (Being mindful of poly count is super important for extensive VR creations that will be turned into films/interactive experiences/games/etc.)

Oculus Apps

  • Quill — You might have heard of or seen clips from Dear Angelica. A lot of illustrators prefer Quill's brushes and functionalities. You probably have also seen a lot of amazing videos by the great Goro Fujita! New for early-2018: there's an animation feature now! 

  • Medium — This is a 3D modeling tool in VR. Think: sculpting with clay in VR! The extent of how you can model with Medium is incredibly impressive. I've seen very intricate and complex models molded in Medium and then 3D printed out. So beautiful!

If you'd like to explore more VR creative apps, 3Donimus compiled a really comprehensive list.


My Favorite VR Artists!

Liz Edwards
@lizaledwards
https://lizedwards.artstation.com/

Danny Bittman
@DannyBittman
http://dannybittman.com/

Vlad "VR Human" Ilic
@vr_human
http://www.vr-human.com/

Micah404
@micahnotfound
http://www.art404.com/

Sutu
@thenawlz
http://www.sutueatsflies.com/

Anna "Anna Dream Brush" Zhilyaeva 
@AnnaDreamBrush
https://www.annadreambrush.com/

3Donimus
@3Donimus
https://www.youtube.com/3donimus

Steve Teeps
@Steveteeps
http://www.steveteeps.com/

Naam
@_naam
http://sketchfab.com/naam


Final thoughts...

I know this is a lot and it can be intimidating! Hopefully this helped ease some of your anxieties. It might seem like we know what we're doing in this space, but it is honestly very experimental and hands-on.

So get a VR rig, grab your controllers, and start swinging some VR paint! I can't wait to see what you make!

Art, Virtual Reality

VR Watercolor Plein Air Painting: I Painted With My Eyes in VR and My Hands in IRL

Liz Edwards has been plein air painting in Fallout 4 VR! It's so inspiring to see her — well — kill off a bunch of enemies in order to just sit quietly and paint in peace. I was super excited to see her take VR art to this next level! I coordinated some time to meet with her in VR (somewhere I wouldn't immediately die on the spot lol) so we could have a paint session together.

We met in a multi-player VR gallery and settled into one of Danny Bittman's paintings. I chose a view where I sat in Danny's painting while looking out into the VR gallery space. 

Instead of digitally painting with Photoshop, I wanted to try traditional painting while in VR. 

Materials needed:

  • VR Headset (I used HTC Vive)
  • VR-ready desktop computer
  • Steam & SteamVR
  • A VR environment or place to go to.
    BONUS if you go to a place with multi-player, then you can plein air with friends!
  • OVRdrop
  • Your medium / weapon of choice. Mine was traditional watercolors, which included:
    • water cup
    • Schmincke watercolor paints
    • Pencil for sketching
    • Brush
    • Pen for any additional inking

 

The Setup:

This is a really odd thing. I'll try my best to describe it lol


You will look really cool...

  1. Get all your VR stuff running.
  2. Load up the environment you'd like to sit in.
  3. Take off your VR headset.
  4. Launch OVRdrop.
  5. Set the OVRdrop setting to display your Vive camera.
  6. Set up your physical painting space with paints, paper, water in front of you. In the event of your water spilling (loool), make sure it doesn't knock onto any electrical stuff!!
  7. Put your headset back on.
  8. Adjust your OVRdrop window so it's a 'window' positioned where your paints are. 
  9. Notice how your hand-eye coordination will be a little bit off.
  10. Attempt to paint.
  11. Question why you are even doing this. Like, really. Why. 
  12. Contemplate the state of humanity while you are in a machine.
  13. Continue and finish painting without destroying your eyeballs.

 

Things I learned from watercoloring while in VR:

  • It’s HELLA weird to notice when your hand-eye coordination is off. In this case, it was my EYES (the Vive camera) placement that was off! It’s really uncanny. 

  • It’s also a very strange feeling to be painting with my hands as seen through a monitor. It felt like this:


Cross-dimensional arm magic! ✨

  • The saturation + vividness of the Vive camera was really off. There would have been very little chance of getting the exact colors right. 
  • It’s REALLY a strange idea that things were CLEARER and CRISPER in the VR space than IRL. SEEING the real life thru a blurred filter was really odd. My eyes were definitely happier to ‘rest’ in the VR space than staring thru the pixelated display of IRL.
  • This would be a VERY interesting exercise in getting rough design ideas down. It would be challenging (and probably HORRIBLE for your eyes) to try to do a lot of detailing. This would be a great exercise in value-grouping, learning how to paint loose to get the idea down. 

 

How might this be used?

Well, Liz Edwards has been going into Fallout 4 VR and plein air painting with OVRdrop > Photoshop. I think she may have painted in Google Earth as well. (The OVRdrop window would be wayyy clearer with Photoshop, rather than showing the out-facing Vive camera like I used.) 

Using Photoshop, artists could hop into Google Earth VR, sit at the top of the Eiffel Tower, and get a pretty realistic understanding of the perspective from up there. Or maybe sit on the Seine River, looking up at Notre Dame. 

With “VR+Traditional Plein Air Painting” (I mean, what do I even call this? Trans-dimensional painting? Multi-reality painting??!), perhaps when the camera and resolution get better (Vive Pro?) it’ll lend itself to some more interesting paintings. There is still the absence of real LIGHT as it affects color and shadows as in nature. But perhaps this and the Photoshop method can lend itself to blocking in roughs. At the very least, it allows the artist to *feel* the environment, mass, form, and depth in a 3D space. 

 

My First "Multi-reality"(??) painting! 

I'll likely try a few more experiments! Here's my first VR plein air watercolor piece, plus the full YouTube video of me and Liz discussing the weirdness of all this:


Pretty strange, huh?

Thanks to Colin Northway and the MOR team; to Danny Bittman for letting us plein air in your piece; to Anand Duncan for letting us plein air in your absolutely gorgeous VR dresses; and to Liz Edwards for being a splendid paint buddy! 
 

Art, Virtual Reality

Mixed Reality Headset 'Removal' with Google Research

Soon after the launch of the Tilt Brush Artist in Residence program, fellow VR artist Steve Teeps and I were asked by the Google Research and Daydream Labs team to participate in some brand spankin' new technology for mixed reality! It's best if you see it here, as explained by Tom Small and Avneesh Sud.

Here's a visual breakdown of what all the different phases look like:

  1. Green Screen Video: this is what it looks like to people on the outside when I'm painting in VR.
  2. Virtual Environment: I can set the VR camera to show my painting and where I am in that space (the glowy Vive controllers), but this still doesn't effectively depict what is going on.
  3. Traditional Mixed Reality Output: this shows my VR painting and me in the shot. But the VR headset is isolating and blocks the human connection aspects of this.
  4. Mixed Reality + Headset Removal: the Google Research team overlaid my eyes on top of the headset! You can SEE me again! :D


You can read all about this on Google's blog here:

Google Research and Daydream Labs: Seeing eye to eye in mixed reality

For more specifics of how Avneesh Sud and Christian Frueh pulled off this magic, read all about it on the Google Research Blog here: Headset “Removal” for Virtual and Mixed Reality (The above images and video originated from these Google blog posts.)

Thanks so much again, Google, for asking me to help out! I had a blast with the YouTube and Research teams! A few friends have joked that Teeps and I have literally become the face of the face of VR. hahaha

Behind the Scenes!


Sharing with my 96-year-old Grandma ^-^

Even more crazy is that my dad found me in the Chinese newspaper! I didn't believe him at first, thinking he was talking about the general 'you', perhaps speaking about any VR-related stuff.

Nope. That's me! This was the moment my immigrant parents finally realized that what I'm doing is actually legitimate (because the Chinese newspaper is, like, the bible, amirite?). We showed my 96-year-old grandma the newspaper article. This is also the moment when she realized that I'm doing something big, too.


Grandma is the sweetest. She smiles, cackles her signature laugh, and gives me a thumbs up. And what was the first thing she had to say after seeing all this in the newspaper?

"Oh, you didn't wear sleeves!" 

<3

Art, Virtual Reality

Tilt Brush Butterfly Metamorphosis

When I had the honor of joining Tilt Brush for their Artist in Residence program, I knew I had to make something different, and that this would be one of my biggest VR pieces yet.

Designing My Problem:

  • I had to design something within Tilt Brush.
  • I wanted to create something that other people weren't doing.
  • I wanted to create a piece that utilizes the Tilt Brush playback feature.
  • I wanted to create a meaningful piece that was more than just a pretty picture; I wanted people to experience both the technology and something magical. It should be something anyone can view and understand. It needed to touch the viewer's heart.

Searching and Working Towards My Solution:

  • Everyone making pieces in Tilt Brush was creating really jaw-dropping finished pieces. The finished pieces were the main focus of the art. I needed to do something different. I needed to create a piece that focused on the PROCESS rather than the finish. That is, the process and growth of the piece needed to be much more interesting than just viewing it as one static image.
  • It needed to be a narrative in some way. It didn't need to be a crazy complex thing. It just needed to show something GROW. Like a normal storytelling narrative, it needed a beginning, a middle, an end. Super basic, super fundamental for anyone to understand. No learning or backstory necessary.
  • It needed to feel like the piece was appearing out of thin air, it needed to be MAGIC. So I planned it out, the painting process / choreography. It needed to be seamless. It needed to LOOK effortless. It's like hiding the strings to a magic trick — I had to conceal my secret for how I did it. People WANT to believe in magic. People WANT to be dazzled and amazed. So I played on this.
  • Lastly, it needed a personal message. As an illustrator/image maker, I know the best ones are when I draw from my own experience. I went thru a lot of reformative changes in the last few years, a LOT of letting go, a LOT of reshaping. Yes seeing a butterfly emerge is cool to see, but it needed the personal touch to really wrap it all together to show the PURPOSE of the piece. This is the part that touches people's heart. It's not just another pretty painting — there is MEANING behind the content, the process, the message that makes it ALL beautiful in one cohesive piece.
  • The butterfly metamorphosis was the best process that fit ALL of these. It's simple. Everyone knows it. Everyone has gone thru changes, or they will at some point.

How Did I Paint It?

  • I spent several days planning out the composition, the look and feel, and how I would paint it all in order. I can't overstate the amount of planning necessary.
  • A few trial runs followed.
  • Then one go at the final piece!
  • The Tilt Brush playback feature shows every stroke in the order of how you paint the piece. I had to paint each phase step by step — this was indeed a narrative and performative piece!

More:

My piece is also featured in the Tilt Brush showcase (in the app) and on the Tilt Brush Artist in Residence page.

You can read more about the Tilt Brush AiR program's inception here in this NYT article.

Augmented Reality

HoloDog Development! SF VR Hackathon

A month ago, I attended my first VR Hackathon, not knowing what to expect, nor where I would fit in. I consider myself new to Unity. I can draw really well. I can dev. And I sure as hell can learn really fast!

I teamed up with two lovely ladies, Katie Hughes and Nidhi Reddy. We spent the first night listing out all of our ideas of what we could create in the next 48 hours. There was a competition component to the Hackathon, but honestly, we just wanted to LEARN and head towards our goals!

With amazing camaraderie, communication, playing on our strengths, and respect for each others' process, we were able to go from knowing minimal Unity to building our first run of HoloDog!

Katie was our idea guru. She is FULL of ideas, and we were able to create a compelling reason for creating this (to be unveiled in the future)! Katie was our strategist, figuring out how the UX of the HoloDog would play out in future iterations.

Nidhi is our super amazeballs AR developer. She learned how to do the speech integration with the HoloLens in a day, and endlessly troubleshot the deployment and user interaction portions of development. She handled all things HoloLens.

I played the role of "time keeper" to make sure we were hitting our goals, checking in appropriately throughout the weekend, maintaining good communication with each other. I was also the "dog wrangler," handling all things pertaining to dog interactions, from the model to developing the trigger responses within Unity. I was responsible for all things related to the actual HoloDog.

Here's a little bit more about our process and how we built our beloved "Buster, the HoloDog." :)

omgwtfamidoinghalp

I'm in charge of the dog. 

Okay.

What does that even mean. Where do I begin. What. How do I make a virtual dog come to life. What. Halp.


Step Zero: Practice being an optimistic learner. Understand the Basic Workings of 3D Animation

Well, me being my "I wanna make everything from scratch" self, I wanted to see if I could rig and animate the dog myself. Yes, in less than 24 hours. That's how optimistic I am. So I called up my 3D technical artist friend for advice on where to start. He pointed me to this Lynda.com Maya tutorial on how to rig and animate a dog.

Mmmhmm. So, 10AM on Saturday of our weekend Hackathon, I jumped into learning Maya basics (from UI to functionality), and then onto rigging. I spent a few hours fully immersing myself in learning this very crucial part about asset making.

Step One: Find an asset.

So then I realized I wouldn't have enough time (I know, I know. I can try, right??) to learn everything and also to naturally animate a quadruped. Around lunch time, I revised my plan and looked into existing dog assets on the Unity store. For the goal of having a working prototype by Sunday noon, we found this really great 3D dog pack by Nitacawo. It had sit, trot, barking, etc animations. It was perfect! We began building with our lovely beagle, Buster! :D

Step Two: Plopping Buster into Unity

  1. Download Unity. It's free, and almost all AR/VR things are built with Unity. It's really critical software to learn if you want to dev for AR/VR! You'll have to learn some of the basics of Unity. Luckily, they have documented really helpful tutorials. If you're getting started with Unity, this should help: Interface & Essentials. From there, maybe do the Roll-a-Ball tutorial to learn the game engine interface in application.
  2. Create a New Project. Mine is called "DogTest" hahah
  3. Save your New Scene. I saved it as "DogScene"
  4. Import the Dog Pack Asset into your project. In Unity, go to your Project window. In the "Assets" folder, create a new folder called "Models". Drop the 3D Dog Pack into the "Models" folder by dragging from your Finder/Explorer window. You can also import by going to menu Assets > Import New Asset. Make sure you keep your 3D models in the "Models" folder to keep things organized and happy. :)
  5. Drop in Buster! From the Assets > Models folder, open up the DogPack. Open up one of the dog folders. I selected "Beagle" for our model HoloDog. The first thing you'll see in that folder is named "Beagle". Drag and drop that into your Unity scene. Ta-dah! You should see your dog there now! You should see a GameObject listed as "Beagle" in the Hierarchy window now, too. (You'll see some child-elements associated to the "Beagle" GameObject. One is called 'Beagle' and the other is called 'Bip003'. We won't be touching those, so no need to worry about them. We'll only be working with the parent "Beagle" GameObject.) Fantabulous! Time to give him some behaviors!

Step Three: Attach Behaviors / Animator Controller to Buster

Here we're going to create the different animation states for the dog, associating the animations to the states we create.

  1. Create a new Animator Controller. In your Unity Project window, in the Assets folder, create a new folder called "Animations". Create a new Animator Controller by right-clicking in your new "Animations" folder and selecting "Create" > "Animator Controller" from the popup menu. You can also find it by going to your Project tab and looking for the "Create" button. Name this "BeagleAC" and double-click on the file. You should be in the "Animator" tab now.
  2. Associate the new Animator Controller to Buster. In your Scene window, click on Buster the dog. Do you see the Component called "Animator" in the Inspector window? There is no controller associated to the GameObject yet. From your Assets window, drag and drop "BeagleAC" into the Inspector > Component > Animator > Controller field. This will link the new animator controller to the GameObject.
  3. Create the Parameters for the animations. In the "Parameters" window, look for the little plus (+) symbol. Click on it and select "Trigger". Name this trigger "Sit". Create two more triggers: "Up" and "Speak".
  4. Drop in the "Sit" animation clip. From your Project Assets window, go to Models > DogPack > Beagle and look for the file called "Beagle@BeagleIdle". Drag and drop this into the "Animator" window. You should see it show up, and it'll be highlighted orange. This means it will be the default state when the game starts. Then, drop in the file called "Beagle@BeagleIdleToSit". This is the animation for a standing dog sitting down.
  5. Create a Transition from IDLE --> SIT. Right-click on the orange BeagleIdle bubble. Select "Make Transition" and an arrow will show up. Click on the "BeagleIdleToSit" bubble. Then click on the arrow itself. The "Conditions" window associated to the transition will be blank. Click on the (+) button to create a new condition. Select "Sit".

Yay! The animation part is set up and ready to go! Time to type up the trigger script so that the dog knows WHEN to Sit.

Step Four: Write the Trigger Code Behaviors

  1. Create a New Script. In your Inspector window, click on "Add Component" > New Script and name it "BeagleScript". Click on "Create and Add".
  2. Open the New Script. Double-click this newly created script to open up your script editor.
  3. Name the Animator variable and give "Sit" an id. We give our Animator variable the name "anim" to contain our Animator component, and we hash the parameter string "Sit" into an id so it can be called upon in the code.
  4. Reference the Animator Object. In the Start function, we point "anim" at the Animator component on the GameObject.
  5. Write the trigger code. In the Update function, when the "Space" key is pressed, the IDLE -> SIT animation is triggered. Putting it all together, the script looks like this:

     using UnityEngine;
     using System.Collections;

     public class BeagleScript : MonoBehaviour {

         Animator anim; // the "anim" variable will contain our Animator component
         int sitHash = Animator.StringToHash("Sit"); // gives an id to the parameter "Sit" so it can be called upon
         int upHash = Animator.StringToHash("Up");

         // Use this for initialization
         void Start () {
             anim = GetComponent<Animator> (); // references this GameObject's Animator component
         }

         // Update is called once per frame
         void Update () {
             if (Input.GetKeyDown (KeyCode.Space)) {
                 // when the "Space" key is pressed, trigger the animator parameter called "Sit"
                 anim.SetTrigger (sitHash);
             }
         }
     }

  6. Save the script.

Step Five: Testing!

  1. Return to Unity.
  2. Press "Play" to run the game.
  3. Press the Space key on your keyboard. Does Buster sit??! Does it look something like THIS?!

Step Six: Pass it to Nidhi!

I repeated the above steps to associate "Up" and "Speak" and "Play Dead" to Buster. I wanted to make sure all of the script and Unity components were all in order before passing it off to Nidhi, who tackled all of the HoloLens and speech-recognizing portions of this project. Wizard! She'll be posting her portion / experience from this Hackathon soon -- I'll be sure to link to her when she has it up! :)
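For anyone following along, here's roughly what the script might look like after wiring up all the triggers — a sketch rather than our exact hackathon code (the "PlayDead" parameter name and the test key bindings are stand-ins I'm choosing for illustration). Each trigger assumes a matching Trigger parameter and transition in the Animator Controller, just like "Sit" in Step Three:

```csharp
using UnityEngine;

public class BeagleScript : MonoBehaviour {

    Animator anim; // will contain our Animator component

    // hash each trigger parameter string once so it can be fired by id
    int sitHash = Animator.StringToHash("Sit");
    int upHash = Animator.StringToHash("Up");
    int speakHash = Animator.StringToHash("Speak");
    int playDeadHash = Animator.StringToHash("PlayDead"); // hypothetical parameter name

    void Start () {
        anim = GetComponent<Animator> (); // references this GameObject's Animator component
    }

    void Update () {
        // keyboard triggers for desktop testing; a speech handler (like Nidhi's
        // HoloLens integration) could call the public methods below instead
        if (Input.GetKeyDown (KeyCode.Space)) Sit ();
        if (Input.GetKeyDown (KeyCode.U)) Up ();
        if (Input.GetKeyDown (KeyCode.S)) Speak ();
        if (Input.GetKeyDown (KeyCode.D)) PlayDead ();
    }

    // public methods so other scripts (e.g. speech recognition) can trigger the same animations
    public void Sit ()      { anim.SetTrigger (sitHash); }
    public void Up ()       { anim.SetTrigger (upHash); }
    public void Speak ()    { anim.SetTrigger (speakHash); }
    public void PlayDead () { anim.SetTrigger (playDeadHash); }
}
```

Exposing each trigger as a public method keeps the keyboard input and the speech input from duplicating the SetTrigger calls.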

This is a three-part roundtable blog post with Katie and Nidhi! Read all about Katie (UX and event roundup) and Nidhi's (HoloLens development) experiences!