Recently we did a Techrally day at one of our clients, Intergamma. The client provided a couple of subjects of their interest, from voice search to automated classification. With a team of 4, we decided to build an augmented reality mobile app which shows DIY assembly instructions to help a customer 'on the spot'. Did we succeed? Read on...
The setup

After chatting with the Intergamma product owner, we came up with the following user journey:
- A customer buys a product, for instance a wall power socket
- She opens the proof-of-concept app when she wants to know more about assembling the product
- In the vicinity of the old socket, she can start the augmented projection of assembly instructions
- Several animated steps show the teardown of the old socket and assembly of the new one
- The customer succeeds in performing the steps thanks to the augmented instructions
From this journey we distilled a set of technical requirements:
- Easy importing and manipulation of 3D objects and animations
- Tracking a location on a wall is a necessity
- The app needs 2D overlays for textual explanation
- Cross-platform development is a plus (iOS and Android from one code base)
Development progress

Two team members got busy with drawing and animating 3D wall sockets in Blender. One of us got a little distracted and looked at the Vuforia library as an alternative AR library. Needless to say, it delivers on its promise, but we finally settled on ARCore for the proof-of-concept. It's good to know there are more libraries out there that offer solid cross-platform AR support.

As first-time users of the Unity development environment, we were a bit overwhelmed by all the options of the tool. You start off with a Project that contains your app's assets. Assets can be 'prefabs' (3D models with textures and animations), 2D images, C# scripts, configurations and so on. A scene is a hierarchy of instances of those scripts, prefabs and other things like overlays. The scripts can then refer to these instances and interact with the scene hierarchy. For instance, a script can move a prefab around in 3D space or show an overlay.

Of course we skipped reading the docs and found ourselves stuck. You need to make 'magic' connections between configuration fields and instances; we only heard later that you do this by dragging and dropping them in the editor. Apart from the sometimes unintuitive interface choices, the Unity environment gave us a head start. Installing the app on a connected Android phone went without pain. Creating a couple of UI texts to display was plain easy. Writing small update scripts in the included C# editor was OK; autocomplete helped us out.

The AR library comes with a Unity plugin that lets you create a database of augmented image markers. We first thought the plugin did not work, as it kept 'analysing' an image. Then we found in the log view that the analysis failed due to low contrast and repetitiveness in the image. Changing the marker image solved that.

[caption id="attachment_25978" align="aligncenter" width="164"] Our final marker image[/caption]
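To give an impression of how such a script interacts with the scene, here is a minimal sketch (the class and field names are hypothetical, not from our actual project). The public fields show up in the Unity Inspector, where you wire them up with exactly the drag-and-drop 'magic' connections mentioned above:

```csharp
using UnityEngine;
using UnityEngine.UI;

// Hypothetical illustration: rotates an assembly-step prefab instance
// and shows a 2D text overlay with the current instruction.
public class AssemblyStepController : MonoBehaviour
{
    // Assigned in the Inspector by dragging scene objects onto these fields.
    public GameObject stepPrefabInstance; // 3D model of the current step
    public Text instructionOverlay;       // 2D overlay for textual explanation

    void Start()
    {
        instructionOverlay.text = "Step 1: remove the old socket cover";
    }

    void Update()
    {
        // Slowly rotate the model so the customer can see it from all sides.
        stepPrefabInstance.transform.Rotate(0f, 30f * Time.deltaTime, 0f);
    }
}
```

Forgetting to drag an object onto one of these fields leaves it null, which is exactly the kind of silent misconfiguration that had us stuck before reading the docs.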
Demo time

All pieces connected in time for the presentation deadline. Luckily, the demo went smoothly! Our main takeaways:
- (basic) augmented reality effects are in reach for your average developer thanks to the current AR libraries
- creating the assets for your app requires a lot of time and specialised expertise; don't underestimate the effort
- a development environment such as Unity's can give your project a kickstart