Sunday, June 11, 2017

Raspberry pi beginnings: Hardware, soldering, and installation

Right out of the box, I realized I had made a big error in understanding what all I needed for this project, since I was working with the Raspberry Pi Zero.  The first problem was that the mini HDMI and micro USB ports required cables we had overlooked.  The battery handled the micro USB power cable, so that was not a problem, but we made sure to go out and get a mini HDMI cable and a micro USB adapter.  This was honestly a huge oversight.  When working with hardware this small, it's important to know exactly what you need, not just for the Pi, but to develop on it.  In the meantime, I ordered a breadboard with some jumper wires to begin laying out the circuits for when we solder the rest of the Pi.  After getting these pieces, my partner Cameron Schwach and I were ready to begin soldering the female header pins.

Luckily for me, Cameron had ordered a soldering iron of his own before even coming to FIEA.  We heated the iron to 350°C and tested on a spare header pin before soldering the actual Pi.
One other thing we realized was that with a Pi this size, it is hard to get the solder onto the pins accurately.  Cameron realized we needed the flat-head tip for the soldering iron.  The correct technique is to hold the iron directly on the pin and then touch the solder to the hot tip.  This makes the solder flow toward the pin and makes it much easier to place.  The important thing to keep in mind is that the solder needs to form a bond between each pin and its gold-ringed hole on the Pi.

One word of caution: the iron is very hot.  A slight burn may not hurt that much, but you risk dropping the iron and damaging what you are working on, so it is wise to wear a glove on the hand holding the soldering iron.  Also, to keep the Raspberry Pi from moving around while you solder, we added sticky tack to the bottom side and stuck it onto a piece of cardboard to avoid damaging the table we were working on.

The soldering went very smoothly.  We successfully soldered the female header pins to the bottom of the Pi; the finished product is below:
With the header pins in place, I was ready to begin programming the pi to get some input from the buttons I had purchased.  However, the first step was to get the development software onto the pi.
I went with Raspbian, which you can install using NOOBS.  I followed the instructions on the Raspberry Pi website.

Following the instructions was easy.  I placed the OS on a micro SD card, put the card in the Pi, and booted it up.

When booting up the Pi for the first time, you will be greeted with a screen to select the operating system to install.  I selected the Raspbian option and booted to the main screen.

One annoying thing before going forward: the Pi can only take one USB input at a time.  That said, you will be swapping between the keyboard and mouse frequently, so be prepared for that.

Now, one of the great things about the Raspberry Pi Zero is that it comes with Bluetooth and Wi-Fi on the device.  One thing to remember, though, is that you need to install blueman in order to use Bluetooth.  Install it with the following command:

sudo apt-get install blueman

When prompted with (Y/N), type Y and hit Enter.

This should handle installation of the Bluetooth functionality.  However, also make sure that the rest of your software is up to date.  You can open the Pi's configuration tool by entering the following command:

sudo raspi-config

This will open the config options on the Pi, and you may see this screen:

From here, you can do many things, such as change your password to something you'll remember.  While we are here, though, go to the Interfacing Options to enable both the serial and SSH options.  These will be important for the connection steps I will cover in the next post.

Lastly, update the Pi using one of the menu options and then reboot it.

Bluetooth connection:
Now comes the cool part.  If you click the Bluetooth symbol at the top of the screen (two icons over from the volume), you will see the option to make the device discoverable.  Do this, then go back to your computer and scan for Bluetooth devices to add.

Note: your computer must have Bluetooth in order to scan for the Pi.

Once you find your device, click to add it.  You will then be prompted with a screen asking if the number shown on the Pi is correct.  Click yes, and the Pi will then say you have a request to join another device; accept it.  The first time, the Pi may report an error and say you disconnected.  Nothing is wrong, you just need to try again.  Sometimes the device will also give you an error after this step and wind up paired anyway.  Regardless, do not worry if it fails to pair the first time; subsequent attempts should go smoothly.

Alright, you should have your device paired and be ready, like me, to move on to the next step: programming a serial port to begin sending data over Bluetooth to your computer.  However, that is where this post ends.  We will cover serial data and buttons in the next post.

More cool things to come!

Wednesday, June 7, 2017

Tracker Puck Struggles

I have begun using the tracker puck and have integrated it into Unreal.  The process of installation and integration was very smooth.  In order to track the puck and attach it to an item, one must use the following node in blueprints.

One problem, though: the puck does not simply get assigned to the third tracked item in Unreal the way I expected.  So I needed to access the arrays of tracked objects by type.  I discovered that the tracker puck is reported as an invalid-type device and is the first object in that array.  Once I grabbed the location of this item, it was easy to attach a static mesh to the puck and begin moving the ocarina inside the room.

However, the tracker puck has some sort of positional offset relative to items in the room.  I still need to figure this one out.  For now, I have stuck with an offset of (-100, -50, 0); the Z axis seems to be fine.  I have noticed that this same offset occurs with the hands and the camera if the ocarina is assigned to any of them.  Clearly, Unreal must have some fix in its base VR blueprints that I need to dig for.  However, I needed to stay on schedule and focus on other things in order to hit my marks for this week.  I will cover them in another post.

Before I go, I will also urge caution to tracker puck users.  I was dissatisfied that the tracker puck is not treated as a controller by Unreal.  Part of this was because I could not attach the puck to a motion controller like the hands on the Unreal VR pawn.  HTC has a fix on their site: a download for SteamVR that treats the tracker puck as a controller.  This can be found in the link here:

Now, if you want the puck treated as a controller, this tool will get that done.  However, once it is converted, the puck cannot be changed back; the tool does not support that.  I reached out to HTC and did not get a good answer to this problem, but I will be asking the devs on their blogs.

That's all for the tracker puck.  More cool things to come!

Wednesday, May 31, 2017

Tracker Puck

The tracker puck is the first piece of the controller to arrive!  HTC has released a tracker puck that can make other objects trackable in VR.  It's really cool.  The photos online do not do it justice: this thing is small, compact, and so incredibly light.  I'm very surprised.

This makes things way easier than originally thought.  We had originally opened up a Vive controller and were going to use the tracking hardware inside to turn the Pi into a Vive controller.

If the tracker puck proves not to have the functionality we are looking for, we will try to use the pieces from the controller instead.  I'm really excited to begin working on this project.  The rest of the pieces should arrive later this week.

More cool things to come!

Personal Programming Project

Today marks the starting point of my next project, which I will be making for the Programming 3 course.  I will be creating an object that you can interact with in the Zelda room I programmed last semester.  My group is turning the room into an escape room with several Zelda-themed puzzles.  To take this project even further, I am creating an ocarina controller with a Raspberry Pi that will connect to the computer over Bluetooth.  We will also use a Vive tracker puck so the item is visible both in VR and out.

My presentation for the proposal went very well and I was funded for my project.

Here are the supplies I will be using and the links to them:

Raspberry Pi Zero: $9.95
Infrared sensor: $1.76 each
6 buttons from an N64 controller
Tactile button pack (x20): $2.50
Header pins: $0.95
Battery pack (5 volt): $14.95
Vive tracker puck: $115

The first step will be to solder the buttons on and begin programming the Pi.  All of the pieces will go inside a 3D-printed ocarina case.  Stick with this blog, because part of the project requirements is that I update it frequently about my progress.
Lots of cool things to come.

Featured on the news

One of my projects, A Heart of Tin, was recently featured on the local news.  My project lead, Hannah Pogue, and I were interviewed and got to show the game to the reporters.  This was the first time I've ever been on the news, so it was really cool.  It's nice to have a project I worked really hard on get some recognition.  Here is the link to the story.

Lots of cool things to come

Saturday, May 6, 2017

Half Done

Well, I am now halfway through my career at The Florida Interactive Entertainment Academy.  Time really seems to fly by here.  I've had a great opportunity to work with all of the talented people here, I have so many great projects to show for it, and I have overcome challenges I never thought I'd face.  This semester, I created a game to help families deal with depression for a class called Gamelab.  The game was made for the HTC Vive and has some high-quality art in it.  Video below:

Another project I had a lot of fun creating was Excommunicated.  This was made for an assignment named The Gauntlet: design and create a game in 6 weeks.  However, some teams, including my own, were only as big as 2 people.  I had a lot of fun working with my friend and colleague Jerrick Flores to design a multiplayer experience in VR.  Nobody in the cohort had done this before, so it was an uphill battle learning networking with 2 Vives.  In the end, we pulled it off and had a great game to show for it.  Here is a video of that.

I also got to finish a side project that I scraped together in my limited free time: my own personal take on ILMxLAB's Trials on Tatooine.  Grudge of the Mandalorian is a Star Wars experience.  I was completely in charge of this project and ran a team of 10 artists to make it as authentic as possible.  I learned so much about the Unreal engine on this project, and I even edited UE4 source code to remove those pesky black bars that Unreal renders VR games with.  Here is a video:

Anyway, it looks like I'll be in the market for a job this semester, so the great search now begins.  I'll also be working on my VR capstone, "The Draft".
More cool stuff to come!

Tuesday, May 2, 2017


So yesterday, one of the artists in the cohort asked me to add some interactivity to the Zelda VR scene he made for his final.  This was a nice change of pace since I had just finished working on my own finals.  It was quick, easy, and fun.  I was only asked to make items able to be picked up; instead, I added a little more.  Bombs that you pick up light their fuses and will not explode until you throw them, similar to the Zelda games.  I added sound to rupees when you pick them up.  I made the sword and shield snap to the hand positions Link would normally hold them in.  I also made Navi shout phrases from the game if the player looks at her.  Lastly, I made the pots destructible, because you can't have a Zelda game without smashing some pots.
Anyway, gotta get footage from my other projects to update this site.
More cool things to come!

Wednesday, March 29, 2017

Unreal Source Code

So today I dug into the Unreal source code.  I discovered that there is a way to remove the black bars that frame the screen when someone is playing a VR game made in Unreal.
Following this link will take you to where I discovered my answer.
This was a success: the new packages I export from my version of the engine no longer have those black bars.  Hopefully Unreal responds to this support ticket soon.  I think this is a feature that every VR dev needs; when you want to show off VR to a group of people, only one person can see the whole picture.  I do understand that this may impact the performance of the computer, but it should be an option for demo purposes.

This was a cool little experiment, since I had never built Unreal from source nor edited its code.

More cool stuff to come!

Wednesday, March 8, 2017

GDC 2017

Last week I took my first trip to San Francisco for my first ever GDC.  It was an amazing experience.  I was able to chat with so many fellow devs from companies that I love.  One of my favorite interactions was with the people from Audiokinetic, who helped point me in the right direction on some audio issues I've been trying to tackle.  There was so much VR to look at.  I finally got to try PlayStation VR and check out Robo Recall with the Oculus Touch.

Ultimately, I was a little let down by the tracking of the Oculus Touch.  I ducked several times in game and found that the headset lost track of my hands multiple times, which affected my speed and score.  It was a little frustrating, and a bit behind where the HTC Vive is in terms of tracking.  Perhaps it was due to how many sensors there were in the room I was in?

The PlayStation VR was great as well.  I got to try a different flavor of VR experience than I normally see.  I played a Sims-esque game whose name I cannot seem to remember.  Anyway, it was a very fun experience, bar needing to pick up individual resources when there was an abundance of them.  Lots of great reference material for future VR designs.

I rounded off my trip with a tour of Lucasfilm.  It was an incredible tour and experience, with a look at ILMX Labs.  Very exciting stuff, although I'm not allowed to talk about what I saw.  I'm happy to get back to FIEA and my current projects, which both happen to be VR.  There is one new project on the horizon that I'm very excited about; I'll be sure to talk about it in future posts.

Lots of cool things to come!

Wednesday, February 1, 2017

Slicing Meshes in Unreal

Hey there!  So, for my last Tech Design assignment, I wanted to cut something in half.  For this assignment, we were supposed to create an experience using Blueprints and then provide 5 enhancements.  Wanting to cut something, I figured I would try to recreate aspects of the plasma cutter from one of my favorite games, Dead Space, which places a very deep emphasis on dismemberment with its weapons.  First, I looked into getting the aiming sight's laser beam right.  I took a material and attached it to a particle system I created, then raycast from the gun, setting the particle's start point at the gun and its end point wherever the ray collides.  As you can see, this works in the above gif, where only one of the lasers is not colliding with the shelf.  The blueprints for the laser are below.  Note that it takes an input, so I am not duplicating this code three times.

Now, onto the really interesting challenge.  The Unreal engine has a component called a Procedural Mesh.  In order to slice a static mesh, a procedural mesh must be a child component of that actor.  One important thing to remember with this component is to disable "Use Complex as Simple Collision" in its options; leaving it on prevents the whole thing from working.  Next, make sure the parent is not handling any of the collision, since the procedural mesh takes the hit and splits into what become two separate meshes.  If you want to continue slicing those meshes, the constructor of the newly spawned meshes needs to give them a procedural mesh component and spawn them with collision enabled.  Finally, I make a collision trigger the slice: I fire a bullet out of the gun, and when the bullet begins overlapping the component, it calls the slice method on the procedural mesh, which spawns the new mesh after performing the slice on the old one.  This slice can be seen in the gif below.

Another fun thing I did here: I made these boxes the enemies in the game.  Once a box has been sliced twice, the enemy dies and becomes inactive on the board.  If the player comes in contact with one of these boxes while the enemy is still active, it kills the player.  However, making these boxes enemies posed a new challenge.  Once I attached the procedural mesh as a child component of a static mesh, which was itself a child component of a character actor, the slicing no longer worked.  I wanted so badly for the enemies to have AI and chase the player when they got close.  Through debugging, I realized the collisions were not triggering correctly.  I got collision working, but even though the slice function was being called, the box remained in one piece.  I eventually came up with a solution: make the character class its own actor, then make a sliceable mesh actor a child of the character actor.  The parent then moves the sliceable box when the AI chases the player.  It took a very long time to discover this, and I would like to continue researching it in the future; I want to eventually dismember character AI with more complex models than cubes.  The slice blueprint is below.

This was a very cool project where I got to flex my muscles as a coder to achieve some designs that interested me.  I got to write C++ classes that communicate with blueprints in Unreal to keep track of my ammo, which is shown on a widget at the bottom of the screen in the gifs above.  I also got to integrate Wwise into this project for some dynamic audio during the enemy chase moments; Wwise is such a cool program to work with.  This project was a ton of fun to pull off and show to my peers.  That's all for today.  Thanks for reading!  More cool things to come!

Tuesday, January 31, 2017

Motion Capture

This week we finally had our first MoCap session for our capstone, The Draft.  It was really cool, and we had some great talent to work with.  I was in charge of the session and took the actors through the motions we wanted.  We were in the studio for 4 hours and took over 100 shots covering about 45 animations, recording 3 takes of each to make sure we were thorough.  This will make the battle animations look really realistic.  Our artists worked hard, and we already have 3 of the animations in engine to show at tomorrow's capstone presentation.  Here are the animations!

That's all for now.  More great things to come!

Saturday, January 21, 2017

Template classes

Today, I finished my first template class in C++.  The requirement for the assignment was to write a list class to be used in future projects.  This was a big challenge and something new to me.  In the end, what helped the most was first writing the class for a primitive type (int) and then converting it into a template that accepts any type.  This project was also my first time writing unit tests in C++.  I had only ever written JUnit tests in Java, so I was unsure where to begin.  It turned out to be surprisingly easier than I thought, although one must write a ton of code to make sure all of the bases are covered when unit testing.  It was cool to get back into testing since it has been a while.  More great things to come!


This past weekend, I spent several hours at a booth promoting FIEA at Otronicon, a very cool science convention that displays the latest technologies such as virtual reality.  My game VRChaeology was selected to be shown at the convention, and I was there to run the VR station for the rest of the weekend.  My team spent time over break preparing our game for the convention at the request of the faculty.  The game was a great hit!  We had people lined up for hours to play.  What was especially rewarding was seeing tiny children, many 6 or younger, have a fun first VR experience.  That was simply incredible.  Seeing your hard work pay off by having people enjoy your game is indescribable.  This was also a great playtest session with the public, and I recorded a great amount of data on how the game could be made easier and more intuitive.  My team plans to further improve VRChaeology for future conventions and perhaps a Steam release.  More to come!

Wwise education

Wow, it's been a while since I posted here.  I came back to school a couple of weeks ago, and things have been ramping up for capstone and other projects.  My break was very successful: I got to investigate the amazing program known as Wwise, the industry standard for creating dynamic sound for games.  Even the amazingly popular Overwatch uses it.  I spent the break learning how to pull off many cool audio behaviors for my future games.  This will be extremely useful in my capstone game, and the team was very grateful for it.  Yesterday I even held a training session for students at FIEA on how to create a few events in Wwise and then use them in the Unreal Engine.  The class went over very well, and many teams are now using Wwise in their own capstone projects.  Lots of cool stuff to come!