M8 Competition Launch Event: Augmented Reality Interactions (1/3)
PART 1 OF 3: Wheel Animation
In the wake of the BMW Group + QUT Design Academy launch event for the M8 Competition Coupe, I wanted to dedicate some time to highlighting the different interactive experiences our team developed to showcase the technologies we work with. The M8 Competition Coupe functions both as a research tool and as BMW's leading supercar, sporting the very best of the company's design language. Our team capitalised on the opportunity to accentuate the vehicle's design features, selecting the wheels, exterior surfacing and engine. This blog post focuses on the production work behind the M8 wheel's AR installation: the modelling process behind the asset, the texturing workflow, and the animation, before highlighting the tools used to bring the wheel to life.
Reference
Every good asset begins with a database of references; we started by scouring the internet for images of both the unique BMW M8 wheel design and the brake caliper. In addition to the photos, we used the Academy's M8 vehicle as a reference for scale and for texture details. The first stage, spearheaded by my colleague Tim, involved using Blender as a poly-modelling platform to ensure the wheel distributed light along the required contours while keeping the asset size as optimised as possible. When both the wheel and the brake caliper were finished, they were combined into a single assembly and moved to the next stage of delivery.
Texturing
Our workflows diverged after the modelling phase was complete. One variation of the asset had each part individually textured for 2D rendering and animation, while a second model (a complete assembly) was textured with a single 2K image set for AR and VR. We took this approach to limit the amount of data needed to visualise the asset so that the installation would run smoothly on a mobile device. The image below shows a preliminary texture setup from early in the texturing phase. More information on our Substance Painter workflow can be found here: Substance Painter Decal Projection — BMW Group + QUT Design Academy
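To see why sharing a single 2K image set matters on mobile, here is a rough back-of-the-envelope sketch. The map count and part count below are illustrative assumptions, not figures from the project:

```python
# Rough VRAM estimate for an uncompressed PBR texture set.
# Assumed set (illustrative, not from the project): base colour, normal,
# metallic, roughness and ambient occlusion -- five 2048x2048 maps at
# 4 bytes per pixel (RGBA8), ignoring mipmaps and GPU compression.

def texture_set_bytes(resolution: int, map_count: int, bytes_per_pixel: int = 4) -> int:
    """Return the uncompressed size in bytes of `map_count` square maps."""
    return resolution * resolution * bytes_per_pixel * map_count

single_2k_set = texture_set_bytes(2048, 5)      # one shared set for the whole assembly
per_part_sets = texture_set_bytes(2048, 5) * 6  # e.g. six individually textured parts

print(f"Shared 2K set: {single_2k_set / 2**20:.0f} MiB")
print(f"Six per-part sets: {per_part_sets / 2**20:.0f} MiB")
```

Even this crude estimate shows an order-of-magnitude difference, which is why the AR/VR variant collapses everything onto one shared image set.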
Animation
The next phase is one of my favourites. The main objective behind the motion design was to create storyboards and plan animations that catch viewers off guard or start a conversation. To capture this sense of awe, cylindrical rings with panels were added to establish a sci-fi aesthetic. The wheel, in its resting state, assumes a reserved visual stance to blend in with the realistic model (as an overlay); later, it transforms to surprise viewers as it is brought to life as an Iron Man-esque floating model. In doing so, we tried to celebrate the application and possibilities of augmented reality. The cylindrical rings also add visual flair, guide the exploded motion, and provide focused dynamic lighting to the more obscure areas of the asset.
You’ll notice that the animation opens with a three-stage motion. We achieved this through three empties, each one parented to the last. Parent-child relationships function similarly to constraints: children follow their parents but are free to move around without influencing them. We used this behaviour to separate the motion into three major parent groups and animated them individually (almost like three acts) to create contrast between each phase of the movement.
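As a conceptual illustration of how such a chain behaves (a pure-Python sketch, not our actual Blender setup), each "empty" resolves its world position by passing its local offset through every ancestor's transform, so animating a parent carries its children along while a child's own motion leaves the parent untouched:

```python
# Minimal 2D model of parent-child transform chains, in the spirit of
# Blender's empties. Names and numbers here are illustrative only.
import math

class Empty:
    def __init__(self, local=(0.0, 0.0), angle=0.0, parent=None):
        self.local = local    # offset relative to the parent
        self.angle = angle    # local rotation in radians
        self.parent = parent

    def world_angle(self):
        # Rotations accumulate down the chain.
        return self.angle + (self.parent.world_angle() if self.parent else 0.0)

    def world_position(self):
        x, y = self.local
        if self.parent is None:
            return (x, y)
        px, py = self.parent.world_position()
        a = self.parent.world_angle()
        # Rotate the local offset by the parent's accumulated rotation,
        # then translate by the parent's world position.
        return (px + x * math.cos(a) - y * math.sin(a),
                py + x * math.sin(a) + y * math.cos(a))

# Three empties, each parented to the last -- one per motion phase.
phase1 = Empty(local=(0.0, 0.0))
phase2 = Empty(local=(1.0, 0.0), parent=phase1)
phase3 = Empty(local=(1.0, 0.0), parent=phase2)

# Rotating only phase1 by 90 degrees swings both children around it...
phase1.angle = math.pi / 2
print(phase3.world_position())
# ...but moving phase3 never disturbs its parents.
phase3.local = (2.0, 0.0)
print(phase1.world_position())
```

Animating each empty on its own keyframes is what lets the three "acts" read as distinct phases while still composing into one continuous motion.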
Vuforia, Unity and AR
The last phase of development was to establish a platform for viewing the animated assembly on a mobile device. We chose to work with Vuforia and Android because of the accessibility of the kit and Unity’s access to micro-level animation, material and lighting controls. After the initial export trials had established a robust workflow, we created reference tracking images using the image generator JS Placement. JS Placement is a great resource for creating sci-fi height maps and generally adding detail to surfaces; its algorithm allowed us to create three distinct images while still maintaining a coherent overarching visual style. Another important aspect of this phase was aligning the CGI model with the physical car, months before our Beta Lab space was furnished. We accomplished this by using a circular ottoman in the office as a stand-in for our wheel, and setting markers around the room to mark out the dimensions of the vehicle and their relationship to the various AR installations.
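At its core, that alignment step reduces to a simple ratio between a physical measurement and the model's size in scene units. A minimal sketch, using illustrative numbers rather than the project's real measurements:

```python
# Hypothetical alignment helper: derive the uniform scale that makes a
# virtual model match its real-world reference object. The 0.72 m wheel
# diameter below is an assumed example value, not a measured figure.

def alignment_scale(physical_size_m: float, model_size_units: float) -> float:
    """Uniform scale so the model's reference dimension matches reality."""
    return physical_size_m / model_size_units

# e.g. a 0.72 m wheel-and-tyre diameter against a model authored at 1.0 units
scale = alignment_scale(0.72, 1.0)
print(f"Apply uniform scale: {scale:.2f}")
```

Once one reference object (here, the wheel standing in for the ottoman) is scaled correctly, the surrounding markers can be laid out at real-world distances and everything else inherits a consistent scale.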
Below are some behind-the-scenes shots of our team testing the wheel AR installation before the launch event. This project was a great learning experience, and I'm looking forward to applying what we learned to future projects. If you’ve reached the end of this article, I hope you’ve found new insight into our workflows; my wish is that our projects inspire new ideas. Keep an eye out for parts 2 and 3, which break down the other AR experiences.
Thanks for reading,
Epifanio Pereira
Junior Design Associate