In this video I experiment with spatial audio for VR in Unity: I added sound to the clock, the turning pages, and the opening glass door.
When I was a little boy I was fascinated whenever the lip sync was out of time in movies, and I would constantly try to duplicate it. I created an out-of-sync audio file in Audacity and added it to the dinner animation. I also added a tiny particle system to the radio to highlight the most powerful audio source in the scene.
I had so much fun doing this with one of my favorite works. Goodbye for now, "The Bittersweet". Thank you for reading.
When working with Google Cardboard you always need to take into consideration the raw processing power of mobile devices and be wise about how to distribute tasks. In this blog I will discuss, section by section, the challenges I faced these past two weeks.
The main screen was not a challenge, but it was an interesting starting point for understanding the Google Cardboard hierarchy when it comes to UI. For example, the Google Cardboard gaze input has two ways of interacting with game objects: one determines whether it is interacting with a graphic element (a button), while the other uses a raycaster to check whether the object you are interacting with is an interactable 3D model.
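The second kind of interaction can be sketched as a small Unity script. This is a minimal illustration, assuming the Google VR SDK's physics raycaster is on the camera and the target object has a collider; the class name and log message here are my own, not from the project:

```csharp
using UnityEngine;
using UnityEngine.EventSystems;

// Sketch: a 3D model that reacts when the gaze pointer's physics
// raycast hits it. UI buttons go through the graphic raycaster on the
// canvas instead; 3D objects like this one need a collider.
public class GazeInteractable : MonoBehaviour, IPointerClickHandler
{
    public void OnPointerClick(PointerEventData eventData)
    {
        // Hypothetical reaction; in the app this would open a section.
        Debug.Log("Gaze click on " + gameObject.name);
    }
}
```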
Still Images section:
At first my approach in this section was to create one scene per sphere and change scenes by clicking the menu; however, this forces you to wait longer to jump from one scene to the next, and I was not comfortable with those long loading times on mobile.
My second approach did not work any better: this time I decided to create six bubbles and switch cameras. This was a mess; loading times were huge and it was harder to organize all my items and make sense of the scene. Then I tried changing the material every time, but the shader that flips the vertex normals of my spheres did not accept a change in the material's color (or something like that... I am still confused).
So finally I returned to the idea of creating the spheres and having the camera change position with a script, and I discovered that it had not worked before because, before moving the Google Cardboard camera, you need to place it inside an empty object so it works like a tripod (yeah!... still confused about that too... sigh!)
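The "tripod" trick comes from the fact that the Cardboard SDK overwrites the camera's own transform with head tracking every frame, so scripts move an empty parent object instead. A minimal sketch of the idea, with illustrative names (the anchor array and method are assumptions, not my actual script):

```csharp
using UnityEngine;

// Attach to the empty parent ("tripod") of the Cardboard camera.
// Moving this parent carries the head-tracked camera along with it.
public class CameraRig : MonoBehaviour
{
    public Transform[] sphereAnchors; // one anchor at the center of each photo sphere

    public void JumpToSphere(int index)
    {
        // Teleport the rig; head tracking keeps working on the child camera.
        transform.position = sphereAnchors[index].position;
    }
}
```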
The Video section:
The video had to be stitched in the proprietary software provided by Samsung, and since that software was a bit simple, I had to take the video somewhere else (Unity) to reduce the resolution and make other changes here and there. It took me about a week to understand why the video was playing on my PC but not on my mobile; the answer was that the movie texture mesh does not get along with Google Cardboard, so you need to make some changes before it runs properly.
The activity section:
And finally, this last week has been all about organizing assets for the animated section, so I can give them the lowest polygon count possible and they can be used on a Google Cardboard device.
First I learned how to construct the bones and build controllers to rig and skin a hand in 3ds Max, and I learned techniques for taking 3D models I found online and cleaning them up with different tools for better performance. Next week we will talk about the animation.
In this log I would like to take the opportunity to talk about the initial design choices, show some screens, and comment on them.
The first environment I tested was the cube. It has a room cube, baked lights, and a Player element (with the Google VR interactive elements for Cardboard and Daydream), as well as 4 interactable game objects with their respective canvas and UI elements: (1) "Meet us", with a movie logo, which gives access to the 360 movies; (2) "See the lab", which gives access to the 360 lab tour; (3) "Activity", with a thunder logo, which gives access to the 3D interactive animations.
My general feeling was that, even though this was a nice environment, it felt a bit different from what I had envisioned for this project.
I thought a sphere would probably be more pleasant to work with, so I switched to a sphere and used a shader that maps the material onto the inside of the sphere game object rather than the outside, and added a material relevant to the theme. I kept and modified the "Player" object from Google VR as well as the interactive SDK elements; it was easier than remaking everything from scratch.
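For anyone curious how a material ends up on the inside of a sphere, here is one common mesh-side alternative to the shader approach: flip the normals and reverse the triangle winding at startup. This is a generic sketch of that technique, not the shader I actually used:

```csharp
using UnityEngine;

// Sketch: make a sphere's material render on the inside by inverting
// the mesh normals and reversing the triangle winding order.
// (A shader can do the same thing with "Cull Front".)
public class InvertSphere : MonoBehaviour
{
    void Start()
    {
        Mesh mesh = GetComponent<MeshFilter>().mesh;

        // Point every normal inward instead of outward.
        Vector3[] normals = mesh.normals;
        for (int i = 0; i < normals.Length; i++)
            normals[i] = -normals[i];
        mesh.normals = normals;

        // Swap two indices per triangle to reverse the winding,
        // so back-face culling keeps the inside faces visible.
        int[] tris = mesh.triangles;
        for (int i = 0; i < tris.Length; i += 3)
        {
            int tmp = tris[i];
            tris[i] = tris[i + 1];
            tris[i + 1] = tmp;
        }
        mesh.triangles = tris;
    }
}
```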
I was not happy with the color choices, so I tried again. This time I selected as a background an electrophoresis gel result that I found online under Creative Commons; I think these results, when colored, are quite kinetic and beautiful. After that I collected color samples and used the same colors for the UI, added an ambient tune and a rotating script, and voila! The sphere rotates on the Y axis. I think it looks great!
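The rotating script is about as small as Unity scripts get. A minimal sketch, with an assumed rotation speed (mine may differ):

```csharp
using UnityEngine;

// Sketch: spin the background sphere slowly around the world Y axis.
// Time.deltaTime keeps the speed frame-rate independent.
public class SlowRotate : MonoBehaviour
{
    public float degreesPerSecond = 2f; // assumed value, tune to taste

    void Update()
    {
        transform.Rotate(0f, degreesPerSecond * Time.deltaTime, 0f, Space.World);
    }
}
```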
I also stitched some 360 pictures, added them to the center of my sphere, and added a "Go back to main menu" UI. I think the images look great on the inside of the sphere. I had some issues with the video and the orientation of the images, but I will talk about them in the next blog. Thank you for reading.
This week I had the pleasure of visiting the teaching lab at the School of Life Sciences of Glasgow University, where for the first time I used the Samsung Gear 360 camera to record the videos and 360 stills I will use to construct the application.
One of the biggest challenges I faced that day was communicating the ideas I had for this shoot and getting everyone on the same page; that is a real skill that only experience can give, as directing a piece and being in charge of the equipment at the same time can be challenging. I prepared scripts and pseudo-scripts for the students in advance. It was loads of hard work, but I feel it paid off in the end.
I was quite lucky to have such a group of talented students to help me put my vision together, as well as the lecturers who helped me set everything up (thank you, ladies, you are the best!). I will discuss in the next blog how I will implement these images in Unity. Thank you for reading.
Hey, so this time everything has worked within reason! I finally decided to test on an SDK1 + Leap Motion v2 setup and walked around uni doing my testing. I did 21 tests and the users were quite happy. https://www.youtube.com/watch?v=GP89FuFbnIA
This week has been exciting: as part of my research in VR interaction I went to VRUK, an event with two full days of VR demos and talks. I also agreed on a structure for next week with my tutor.
I did try a couple of demos; in this picture I am trying the Oculus with the "Wizdish", and the fantastic View-Master by Mattel. I even told the guy at the coffee shop my name was "VR" so he would write it on my cup.