The immersive Lab – Log Four

Technical challenges so far…

When working with Google Cardboard you always need to take into consideration the raw processing power of mobile devices, and be wise about how you distribute tasks. In this blog I will discuss, section by section, the challenges I have faced these past two weeks.

  • Main Screen:
I added an electrophoresis gel result as the sphere material; it looks great!

The main screen was not a challenge, but it was an interesting starting point for understanding the Google Cardboard hierarchy when it comes to UI. For example, Google Cardboard's gaze input has two ways of interacting with game objects: one determines whether you are interacting with a graphic element (a button), while the other uses a raycaster to check whether the object you are looking at is an interactable 3D model.
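The second case can be sketched in a few lines of Unity C#. This is a minimal, hypothetical example (the tag name and distance are my own assumptions, not from the actual project): it casts a ray along the camera's gaze each frame and checks whether it hits a 3D object marked as interactable.

```csharp
using UnityEngine;

// Hypothetical sketch: attach to the camera. Each frame it casts a ray
// along the gaze direction and reports interactable 3D objects it hits.
public class GazeRaycaster : MonoBehaviour
{
    public float maxDistance = 10f;

    void Update()
    {
        Ray gaze = new Ray(transform.position, transform.forward);
        RaycastHit hit;
        if (Physics.Raycast(gaze, out hit, maxDistance))
        {
            // "Interactable" is an assumed tag for this sketch.
            if (hit.collider.CompareTag("Interactable"))
            {
                Debug.Log("Gazing at " + hit.collider.name);
            }
        }
    }
}
```

The graphic (button) case, by contrast, goes through Unity's event system and a graphic raycaster on the canvas rather than through physics.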

  • Still Images section:


At first my approach in this section was to create one scene per sphere and change scenes by clicking the menu; however, this makes you wait longer to jump from one scene to the next, and I wasn't comfortable with the long loading times on mobile.

My second approach did not work any better. This time I decided to create six bubbles and switch cameras… This was a mess: loading times were huge and it was harder to organize all my items and make sense of the scene. Then I tried changing the material each time, but the shader that changes the vertex normals of my spheres did not accept a change in the colour of the material (or something like that… I am still confused).

So finally I returned to the idea of creating all the spheres and having the camera change position with a script, and discovered why it had not worked before: to move the Google Cardboard camera you need to place it inside an empty object, so it works like a tripod (yeah… still confused about that too… sigh!).
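The "tripod" idea can be sketched like this, assuming the Cardboard camera is a child of an empty rig object (the names here are illustrative, not from the project): instead of moving the camera directly, you move the empty parent.

```csharp
using UnityEngine;

// Hypothetical sketch: the Cardboard camera sits inside an empty "rig"
// object; moving the rig (the tripod) repositions the camera safely.
public class CameraRigMover : MonoBehaviour
{
    // Assign the empty parent that contains the Cardboard camera.
    public Transform cameraRig;

    // Teleport the rig to the centre of the chosen sphere.
    public void MoveToSphere(Transform sphere)
    {
        cameraRig.position = sphere.position;
    }
}
```

The SDK controls the camera's own transform for head tracking, which is why moving the camera directly tends not to work.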

  • The Video section:


The video had to be stitched in the proprietary software provided by Samsung, and since that software is a bit simple I had to take the video somewhere else (Unity) to reduce the resolution and make other changes here and there. It took me about a week to understand why the video was playing on my PC but not on my mobile, and the answer was that the movie texture mesh doesn't get along with Google Cardboard, so you need to make some changes before it runs properly.


  • The Activity section:

And finally, this last week has been all about organizing assets for the animated section, so that they have the lowest polygon count possible and can be used on a Google Cardboard device.


First I learned how to construct the bones and build controllers to rig and skin a hand in 3ds Max, and I learned techniques for taking 3D models I found online and cleaning them up with different tools for better performance. Next week we will talk about the animation.

The immersive Lab – Log Three

Early design choices…

In this log I would like to take the opportunity to talk about the initial design choices and show some screens and comment on them.

The first environment I tested was the cube. It has a room cube, baked lights, a Player element (with Google VR interactive elements for Cardboard and Daydream), as well as four interactable game objects with their respective canvas and UI elements: (1) "Meet us", with a movie logo, which gives access to the 360 movies; (2) "See the lab", which gives access to the 360 lab tour; (3) "Activity", with a thunder logo, which gives access to the 3D interactive animations.

My general feeling was that, even though this was a nice environment, it felt a bit different from what I had envisioned for this project.


I thought a sphere would probably be more pleasant to work with, so I switched to a sphere and used a shader that maps the material onto the inside of the sphere game object rather than the outside, and added a material relevant to the theme. I kept and modified the "Player" object from Google VR as well as the interactive SDK elements; it was easier than remaking everything from scratch.
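The shader I used flips the vertex normals; a related, simpler technique is to cull front faces instead of back faces, so the material is drawn on the inside of the sphere. A minimal, hypothetical ShaderLab sketch of that idea (the shader name is made up, and this uses the old fixed-function style of the era):

```shaderlab
// Hypothetical sketch: render the inside of a sphere by culling
// front faces ("Cull Front") so the texture is visible from within.
Shader "Custom/InsideSphere"
{
    Properties
    {
        _MainTex ("Texture", 2D) = "white" {}
    }
    SubShader
    {
        Pass
        {
            Cull Front   // draw the inward-facing side of the sphere
            SetTexture [_MainTex] { combine texture }
        }
    }
}
```

Note that with front-face culling the texture appears mirrored from inside, which is one reason a normal-flipping shader can be preferable.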


I was not happy with the colour choices, so I tried again. This time I selected as a background an electrophoresis gel result that I found online under Creative Commons; I think these results, when coloured, are quite kinetic and beautiful. After that I proceeded to collect colour samples and used the same colours for the UI, added an ambient tune and a rotating script, and voilà! The sphere rotates on the Y axis; I think it looks great!
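A rotating script like the one mentioned can be as small as this. This is a minimal sketch, not the project's actual script; the speed value is an assumption.

```csharp
using UnityEngine;

// Hypothetical sketch of a rotation script: spins the object slowly
// around the Y axis, frame-rate independent via Time.deltaTime.
public class SphereRotator : MonoBehaviour
{
    public float degreesPerSecond = 5f;

    void Update()
    {
        transform.Rotate(0f, degreesPerSecond * Time.deltaTime, 0f);
    }
}
```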


I also stitched some 360 pictures, added them to the centre of my sphere, and added a "Go back to main menu" UI element. I think the images look great on the inside of the sphere. I had some issues with the video and with the orientation of the images, but I will talk about those in the next blog. Thank you for reading…


Astronomy is Awesome! (2017)

3ds Max Modelling and Animation for Unity

I created these 3D models as my first 3ds Max models and animation. Legos are awesome and I have always loved them, so I was trying to be a bit surreal and find an imaginative way to say you can go to the stars, and to excite people about science! It was really fun to model, and I was really delighted to discover these new skills. Please watch the video of the full animation below!


“The Bittersweet” – Log Fourteen

This is the poster presented at the “2nd Virtual Social Interaction Workshop” on the 12th–13th of July, 2016.

Regarding the idea of bringing a life perspective like that of an autistic person: placing a (non-autistic) person as the protagonist in an IVR medium, using hand tracking, music, and the framework of a hidden illness, has stimulated the imagination and emotions of the users, as is evident in their written feedback.

A situation that is both challenging and, in this case, full of beauty has been, to our understanding, a very welcome experience for everyone who has tested the IVR installation; influenced by this, we could say that the product we have created fulfils the original requirement.

Regarding the specifications for navigation using the Leap Motion, the results have been somewhat positive, but we know there are still improvements to make. The software should probably be upgraded to a newer version of Leap Motion: instead of V2 (the core assets we are currently using) we could upgrade to “Orion” and experiment a bit more for swifter motion.

In any case, we consider that other features, such as the UX of the Oculus Rift implementation and the scenes' mechanics and effects, work well, making this product function correctly according to the initial specifications in the design.

Taking all these elements into account, we conclude that this virtual reality installation has been a success in bringing another perception to neurotypical people and in opening a window into what can be done in IVR (immersive virtual reality).

“The Bittersweet” – Log Twelve

Learning to deal with challenges… and not to die trying.

User testing is one of the primordial boxes to tick when it comes to software development, and dealing with technical difficulties can be any developer's worst nightmare. So when you are rushing to get a proper user test done because you also need to study for an exam, what could go wrong?

Last week, a week in advance, I set up my user tests and asked for a computer that could run the Oculus SDK 2 plus Leap Motion. Instead, I was given a computer where I had to install the Oculus runtime and Leap Motion V2, update Unity, and then wait for someone with admin rights to update the graphics drivers, losing precious time, apparently because Windows 10 had just been installed.

It did not occur to anyone that the machine should have been made “Oculus ready”, but on top of that I was given a machine with multiple graphics cards, and, to my knowledge, the Oculus will not work on a machine with multiple graphics cards unless one is mapped specifically to the Rift.

I even tried running an old runtime on my Mac to see if I could get the DK2 to work, as some YouTube videos suggested… but nothing.

In times like this, when you are being tested by life, the best way out is to smile… and so I did. I smiled so hard that it was weird, but by doing this I found great kindness within myself: the kindness you feel when you see an overworked man who is sick of apologising, the same kindness my test subjects showed me when they turned up, I couldn't show them my software, and they were still excited because it was VR!

And life goes on… see you guys next week.

“The Bittersweet” – Log Ten


This blog post is dedicated solely to portraying the testing of what I have done so far. Yesterday I tested the transition from Mac to PC in Unity, transferring my scenes from one machine to the other, as my current Mac is not able to handle the graphics properly without giving me errors; so I took the assets over and tested them with the Leap Motion and the Oculus.

I was a bit scared because the Leap Motion core assets on the store are deprecated, so I was not sure they would work on the next machine, but after some tweaking they did.

Another week done and two more weeks to go. HOORAY!

The Music Therapy Gloves – Log Three

I have tried a couple of times to make the LilyPad Arduino board work, but somehow there are a couple of things I am not getting my head around just yet, so I have asked a very smart lady to help me out (Miss Perla, thanks!!!), and thank you Phoenix for giving me the idea of the conductive fabric and the inspiration to build my gloves. Amazing computing ladies, xoxo.

Well, at the beginning of the week I wanted to create a set of gloves that, using touch sensors, could help people in therapy practise the appropriate hand gestures for grabbing things, as this is something that separates us from other animals.

Today I have decided to work on this prototype.


“The Bittersweet” – Log Nine

In this blog I would like to speak about my progress so far on the final project in general and about what tools I am using to create these three scenes. For clarity, I will divide this into parts.


As explained in Log Two, my focus of interest in this narrative is to talk about these three elements commonly found among people on the autism spectrum:


  • Difficulty with social communication
  • Difficulty with social interaction
  • Difficulty with social imagination


And so the navigation methods found in every single scene are tightly related to the story I would like to tell. I can confirm that I am using Leap Motion gesture-based input to submerge the user in this story, but as the story progresses, the one gesture the user is given to open doors and continue the journey will gradually, scene by scene, cease to be.

Scene 1: I am using nested animations and gesture-based recognition to allow the user to continue. In this scene the user has full control but is forced to move only forward.

Scene 2: The user is allowed to navigate the scene by gazing with the Oculus Rift and can use their hands to open the door.


Scene 3: In this scene the player is on a rollercoaster where they basically cannot control what happens.


“The Bittersweet” – Log Eight

In this blog I would like to talk about the programs I am using to create or tweak some of my assets.

1) CrazyBump

This great program does the heavy lifting of creating realistic surfaces, making your scenes more interesting. This is what I am using as the basis for my textures.


2) ClipGrab

One of the first challenges I faced was that I needed animated movies on the television and on some walls; unfortunately, that feature doesn't exist in the free version of Unity. There are a couple of ways to go about creating animated textures in Unity. I am using (a) ClipGrab to download a movie from YouTube, then (b) After Effects to break the movie into frames, and (c) Automator to name the frames by number, so they can be used as a texture sequence to create the animation. Nice!
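Once the frames are numbered, a small script can cycle them on a material to fake video playback. This is a hypothetical sketch (class name and frame rate are my own choices, not from the project):

```csharp
using UnityEngine;

// Hypothetical sketch: cycles through numbered frame images as the
// material's main texture, imitating video playback in free Unity.
public class AnimatedTexture : MonoBehaviour
{
    public Texture2D[] frames;       // assign the numbered frames, in order
    public float framesPerSecond = 24f;

    private Renderer rend;

    void Start()
    {
        rend = GetComponent<Renderer>();
    }

    void Update()
    {
        if (frames.Length == 0) return;
        int index = (int)(Time.time * framesPerSecond) % frames.Length;
        rend.material.mainTexture = frames[index];
    }
}
```

Attach it to the television or wall object and drag the frame sequence into the `frames` array in the Inspector.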

3) Photoshop and Illustrator (Adobe):

These two very versatile pieces of software have helped me construct the storyboard and tweak some of the assets.

Till next week.