Google has announced ARCore, a new software development kit (SDK) that was made available for download on 28 August 2017.
ARCore brings AR capabilities to existing and future Android phones. The release marks a subtle shift in Google's AR strategy, which formally began with the launch of Project Tango in 2014.
While Tango worked by loading users' smartphones up with custom sensors, ARCore uses software to add AR functionality, in a similar way to Apple's ARKit (even down to their similar names).
The release is not considered a replacement for Tango; rather, Google is allowing the two distinct approaches to bringing AR to the mainstream to compete against each other. But while Tango was limited in the number of devices it could run on and had specific hardware requirements, ARCore is designed to bring that functionality to as many devices as possible.
How does it work?
In a blog post, vice president of Android Engineering Dave Burke outlines how ARCore works. The software works with Java/OpenGL, Unity and Unreal, and has three main areas of focus. The first is motion tracking: ARCore uses the phone's camera to pick out feature points within a space and combines them with IMU sensor data to determine the position and orientation of the phone as it moves through the environment. This ensures virtual objects remain accurately placed.
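The idea behind visual-inertial tracking can be sketched in a few lines. The class below is a conceptual illustration only, not the ARCore API: a yaw angle stands in for IMU-integrated orientation, and a camera-frame displacement stands in for the motion measured from tracked feature points; rotating that displacement into the world frame keeps the pose, and hence any anchored virtual object, consistent.

```java
// Conceptual sketch of visual-inertial motion tracking. Names and
// structure are illustrative, NOT the ARCore API. Orientation (a single
// yaw angle, as if integrated from the IMU gyroscope) rotates a
// camera-frame displacement (as if measured from tracked feature points)
// into the world frame, updating the phone's pose.
public class PoseTracker {
    double x, z;      // world-frame position (metres)
    double yawRad;    // orientation around the vertical axis

    // One tracking update: apply the IMU-derived rotation, then move by
    // the visually measured camera-frame displacement.
    public void update(double deltaYawRad, double forward, double right) {
        yawRad += deltaYawRad;
        x += forward * Math.sin(yawRad) + right * Math.cos(yawRad);
        z += forward * Math.cos(yawRad) - right * Math.sin(yawRad);
    }

    public static void main(String[] args) {
        PoseTracker t = new PoseTracker();
        t.update(0.0, 1.0, 0.0);          // walk 1 m forward
        t.update(Math.PI / 2, 1.0, 0.0);  // turn 90 degrees, 1 m forward
        System.out.printf(java.util.Locale.US,
                "x=%.2f z=%.2f%n", t.x, t.z);  // x=1.00 z=1.00
    }
}
```

In the real system the rotation is a full 3D quaternion and the displacement comes from triangulating feature points across frames, but the principle — fuse inertial orientation with visual displacement to maintain a world-frame pose — is the same.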
The system is also capable of detecting horizontal surfaces so that objects deliberately placed on the floor or a surface act in a way that is consistent with their environment.
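One simple way to find such a surface (purely illustrative, not how ARCore is implemented) is to bin the heights of tracked feature points: a height bin with many supporting points suggests a horizontal plane such as a floor or tabletop.

```java
import java.util.HashMap;
import java.util.Map;

// Conceptual sketch of horizontal plane detection (NOT the ARCore API):
// feature-point heights that agree within a tolerance are binned
// together; a bin with enough support is reported as a horizontal
// surface at that height.
public class PlaneFinder {
    // Returns the height of the best-supported horizontal plane, or NaN
    // if no bin gathers at least minPoints feature points.
    public static double findPlaneHeight(double[] heights, double binSize, int minPoints) {
        Map<Long, Integer> bins = new HashMap<>();
        for (double h : heights) {
            bins.merge(Math.round(h / binSize), 1, Integer::sum);
        }
        long bestBin = 0;
        int bestCount = 0;
        for (Map.Entry<Long, Integer> e : bins.entrySet()) {
            if (e.getValue() > bestCount) {
                bestCount = e.getValue();
                bestBin = e.getKey();
            }
        }
        return bestCount >= minPoints ? bestBin * binSize : Double.NaN;
    }

    public static void main(String[] args) {
        // Five points cluster near height 0.0 (the floor); two are noise.
        double[] pts = {0.01, -0.02, 0.00, 0.02, -0.01, 0.74, 1.30};
        System.out.println(findPlaneHeight(pts, 0.05, 4)); // 0.0
    }
}
```

Once a plane is known, a virtual object "placed on the floor" is simply rendered at that height, which is what makes it appear consistent with the environment.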
The third area of focus is 'light estimation', where the system observes the ambient light of the environment and allows developers to light virtual objects in the same way. This increases the realism of virtual objects.
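The principle can be reduced to a small sketch (again illustrative, not the ARCore API): estimate an ambient intensity from the brightness of the camera image, then scale a virtual object's base colour by it, so objects rendered in a dim room look dim.

```java
// Conceptual sketch of light estimation (NOT the ARCore API): the mean
// brightness of the camera image stands in for the ambient light
// estimate, and a virtual object's base colour is scaled by it so the
// render matches the room's lighting.
public class LightEstimator {
    // Average pixel intensity of a greyscale camera frame, in [0, 1].
    public static double ambientIntensity(double[] pixels) {
        double sum = 0;
        for (double p : pixels) sum += p;
        return sum / pixels.length;
    }

    // Shade a virtual object's RGB albedo by the ambient estimate.
    public static double[] shade(double[] albedo, double intensity) {
        return new double[] {
            albedo[0] * intensity,
            albedo[1] * intensity,
            albedo[2] * intensity
        };
    }

    public static void main(String[] args) {
        double ambient = ambientIntensity(new double[] {0.2, 0.4, 0.6});
        double[] lit = shade(new double[] {1.0, 0.5, 0.0}, ambient);
        System.out.printf(java.util.Locale.US,
                "r=%.2f g=%.2f b=%.2f%n", lit[0], lit[1], lit[2]);
    }
}
```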
The release currently supports only two devices, the Google Pixel and the Samsung Galaxy S8, but Burke states that Google is targeting 100 million devices and is working with Samsung, Huawei, LG and ASUS to bring ARCore to other devices in the near future.