Thanks to all the sci-fi movies from the 80s and the 90s, we all knew that sooner or later AR and VR technologies would emerge to take up a sizeable chunk of the digital world. The films, however, did not prepare us for how quickly these technologies would pick up speed and go on to become a well-defined and progressive product niche.
Unlike VR where the user is entirely engulfed in the virtual experience, the AR experience seamlessly blends the physical world with digital input and alters our perception of the surrounding environment. In this sense, one doesn’t lose touch with the real world when interacting with AR tech, but rather receives an enriched experience of it.
What is Unity?
One of the reasons for the industry’s rapid expansion is the abundance of AR/VR content-creation apps, which give designers and programmers plenty of room to play around with their creativity.
Unity is the most popular AR/VR development platform on the market and has enabled companies such as HTC, Samsung, Apple and Microsoft to make huge strides in the field in the span of just a few years.
The platform is loaded with engines and features that make creating impressive VR/AR experiences easy and accessible for creators from around the globe. Given our innate love for design and digital products, we at MetaStudio are really keen on applying our expertise in this field, and Unity is certainly one of our main partners in this effort.
How do you build in Unity?
Rendering of objects in Unity is done with materials, shaders and textures.
Materials define how a surface should be rendered: which textures, tiling settings, colour tints, and so on are to be used.
Shaders are small scripts that contain the calculations and algorithms for determining the colour of each rendered pixel, based on the lighting input and the material configuration. Unity supports three different types of shaders: surface shaders, vertex shaders and fragment shaders.
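To make this concrete, here is a minimal surface shader sketch (the shader and property names are our own illustrative choices, not taken from a real project). The Properties block is what a material exposes in the Inspector, in this case a texture and a colour tint, and the surf function is where the surface colour is decided:

```
Shader "Examples/TintedDiffuse"
{
    // Properties are what a material exposes in the Inspector:
    // a texture with its tiling/offset settings and a colour tint.
    Properties
    {
        _MainTex ("Albedo Texture", 2D) = "white" {}
        _Tint ("Colour Tint", Color) = (1,1,1,1)
    }
    SubShader
    {
        Tags { "RenderType"="Opaque" }

        CGPROGRAM
        // A surface shader: Unity generates the underlying
        // vertex/fragment code and the lighting for us.
        #pragma surface surf Standard

        sampler2D _MainTex;
        fixed4 _Tint;

        struct Input
        {
            float2 uv_MainTex; // UVs with the material's tiling/offset applied
        };

        void surf (Input IN, inout SurfaceOutputStandard o)
        {
            // Sample the texture and apply the colour tint.
            fixed4 c = tex2D(_MainTex, IN.uv_MainTex) * _Tint;
            o.Albedo = c.rgb;
        }
        ENDCG
    }
    Fallback "Diffuse"
}
```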
You can code multiple SubShader sections, and Unity will go through them in order until it finds one that is compatible with the user’s GPU. This comes in handy when targeting different platforms, since you can fit different versions of the same shader into a single file.
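As a sketch of how that can look (again with illustrative names), a single file could hold a full-featured SubShader for capable hardware, a cheaper one for weaker mobile GPUs and a Fallback as a last resort:

```
Shader "Examples/MultiSubShader"
{
    Properties
    {
        _MainTex ("Texture", 2D) = "white" {}
    }

    // Unity uses the first SubShader the GPU can run.
    // High-end path: full PBR lighting.
    SubShader
    {
        LOD 300
        CGPROGRAM
        #pragma surface surf Standard
        sampler2D _MainTex;
        struct Input { float2 uv_MainTex; };
        void surf (Input IN, inout SurfaceOutputStandard o)
        {
            o.Albedo = tex2D(_MainTex, IN.uv_MainTex).rgb;
        }
        ENDCG
    }

    // Cheaper path: simple Lambert lighting for weaker mobile GPUs.
    SubShader
    {
        LOD 100
        CGPROGRAM
        #pragma surface surf Lambert
        sampler2D _MainTex;
        struct Input { float2 uv_MainTex; };
        void surf (Input IN, inout SurfaceOutput o)
        {
            o.Albedo = tex2D(_MainTex, IN.uv_MainTex).rgb;
        }
        ENDCG
    }

    // Used if neither SubShader is supported on the device.
    Fallback "Diffuse"
}
```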
Unity comes with a built-in Standard Shader, which can be customized extensively and is used to render real-world objects such as stone, wood, glass, plastic and metal. However, a custom shader may be a better choice when you need to render more demanding effects, such as liquids, foliage, refractive glass or particle effects.
It looks real, but it’s 3D
Making thorough use of Unity’s diverse features, we created an interactive AR try-on app concept for mobile devices. The application lets the user try a specific watch model on their wrist in an incredibly realistic and dynamic way.
The app lets you rotate your hand and change your view of the device, so you get an immersive, life-like picture of exactly how the timepiece would look on your wrist. This means you can try on a watch and make sure it’s the right one for you without having to visit a physical shop, which is especially valuable for online shoppers.
What is more, you can customize the watch’s materials and looks directly from the app and see in real time how those changes match your own style and vision.
How did we do it?
Instead of relying on the Standard shader, we tailored a custom one and applied a handful of technical tweaks. The aim was to make reality and virtuality communicate with each other and come across to the user as a single, coherent package.
The custom shader adds extra realism to the materials of the 3D watch model while preserving all the usual Unity physically based rendering (PBR) characteristics, except for the reflections of the surrounding area.
Our custom shader keeps all the default PBR parameters but computes its own reflections. The feeds from the front and back cameras of the user’s device come in as plain textures, and the reflection rendered on the watch is a mixture of both images.
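To illustrate just the blending idea (the reflection mapping itself is sketched further below), here is a minimal, hypothetical surface shader with one texture slot per camera; in the actual app, the C# side assigns each device camera feed, for example a WebCamTexture, to its slot:

```
Shader "Examples/BlendTwoCameraFeeds"
{
    // Hypothetical property names; the real shader's inputs differ.
    Properties
    {
        _FrontCamTex ("Front Camera (startup snapshot)", 2D) = "black" {}
        _BackCamTex  ("Back Camera (live feed)", 2D) = "black" {}
        _Mix ("Front/Back Mix", Range(0,1)) = 0.5
    }
    SubShader
    {
        Tags { "RenderType"="Opaque" }
        CGPROGRAM
        #pragma surface surf Standard
        sampler2D _FrontCamTex;
        sampler2D _BackCamTex;
        half _Mix;
        struct Input { float2 uv_FrontCamTex; };
        void surf (Input IN, inout SurfaceOutputStandard o)
        {
            // Both feeds arrive as plain textures; the output
            // is a weighted mixture of the two images.
            fixed3 front = tex2D(_FrontCamTex, IN.uv_FrontCamTex).rgb;
            fixed3 back  = tex2D(_BackCamTex,  IN.uv_FrontCamTex).rgb;
            o.Albedo = lerp(front, back, _Mix);
        }
        ENDCG
    }
    Fallback "Diffuse"
}
```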
The end result: you see a reflection of yourself, shot with the front camera once at startup, blended with the live video stream from the back camera acting as the environment, with each frame reflecting off the watch’s surface. It works just as neatly as it sounds.
We didn’t use Unity’s default methods for modifying the environment at runtime, as they are too slow for most consumer devices out there. Instead, we used the spherical reflection mapping algorithm and calculated the reflection vector in camera (eye) space. Unity facilitates such a mixture by letting a single shader use both vertex and surface shader functions.
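Below is a sketch of how the pieces can fit together, assuming the classic sphere-mapping formula (all names are our illustrative placeholders, not the production shader’s). The vertex function computes the reflection vector in eye space and turns it into a sphere-map UV, and the surface function keeps the usual PBR inputs while adding the blended camera reflection through the Emission channel:

```
Shader "Examples/SphereMappedCameraReflection"
{
    Properties
    {
        _MainTex ("Albedo", 2D) = "white" {}
        _Glossiness ("Smoothness", Range(0,1)) = 0.8
        _Metallic ("Metallic", Range(0,1)) = 0.9
        _FrontCamTex ("Front Camera (startup snapshot)", 2D) = "black" {}
        _BackCamTex  ("Back Camera (live feed)", 2D) = "black" {}
        _ReflStrength ("Reflection Strength", Range(0,1)) = 0.4
    }
    SubShader
    {
        Tags { "RenderType"="Opaque" }

        CGPROGRAM
        // A surface shader with a custom vertex function: the vertex
        // stage computes the sphere-map UV, the surface stage keeps
        // the standard PBR inputs and adds the camera reflection.
        #pragma surface surf Standard vertex:vert
        #include "UnityCG.cginc"

        sampler2D _MainTex;
        sampler2D _FrontCamTex;
        sampler2D _BackCamTex;
        half _Glossiness;
        half _Metallic;
        half _ReflStrength;

        struct Input
        {
            float2 uv_MainTex;
            float2 sphereUV; // sphere-map coordinate from the vertex stage
        };

        void vert (inout appdata_full v, out Input o)
        {
            UNITY_INITIALIZE_OUTPUT(Input, o);

            // Position and normal in camera (eye) space.
            float3 viewPos = UnityObjectToViewPos(v.vertex);
            float3 viewN   = normalize(mul((float3x3)UNITY_MATRIX_IT_MV, v.normal));

            // Reflect the view direction about the normal, then apply
            // the classic sphere-mapping formula:
            //   m  = 2 * sqrt(rx^2 + ry^2 + (rz + 1)^2)
            //   uv = r.xy / m + 0.5
            float3 r = reflect(normalize(viewPos), viewN);
            float  m = 2.0 * sqrt(r.x * r.x + r.y * r.y + (r.z + 1.0) * (r.z + 1.0));
            o.sphereUV = r.xy / m + 0.5;
        }

        void surf (Input IN, inout SurfaceOutputStandard o)
        {
            // The standard PBR inputs stay untouched...
            o.Albedo = tex2D(_MainTex, IN.uv_MainTex).rgb;
            o.Metallic = _Metallic;
            o.Smoothness = _Glossiness;

            // ...while the environment reflection is replaced by an
            // equal blend of the two camera feeds, added via Emission.
            fixed3 front = tex2D(_FrontCamTex, IN.sphereUV).rgb;
            fixed3 back  = tex2D(_BackCamTex,  IN.sphereUV).rgb;
            o.Emission = lerp(front, back, 0.5) * _ReflStrength;
        }
        ENDCG
    }
    Fallback "Diffuse"
}
```

Doing the reflection maths per vertex rather than per pixel is part of what keeps an approach like this fast enough for ordinary consumer phones.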
For the cylindrical image tracking, and for placing the object firmly on top of the tracked wristband, we used the Vuforia AR engine, which integrates directly with Unity.