
Augmented Reality development on iOS vs Android

It is one of the defining battles of our era: Google versus Apple, Android versus iOS, and now one more front has opened - ARCore versus ARKit. Both companies are racing for the higher ground in the competition to control Augmented Reality.

ARKit vs ARCore

Before going into an in-depth comparison, it is worth understanding each AR framework on its own.

What is ARCore

ARCore is Google's framework for building Augmented Reality apps on Android. It lets developers create AR experiences for devices running the Android OS and distribute them on the Play Store.

What sort of apps can be built with ARCore

The most common type of AR app built with ARCore is games, where Augmented Reality is a natural fit. However, many other kinds of applications can make use of AR as well, from shopping and geo-location apps to social media and travel applications. At present there are not many examples of Android apps built with ARCore because the framework is relatively new.

What is ARKit

ARKit is Apple's framework for developing Augmented Reality apps for iOS. At WWDC 2018, Apple introduced a number of changes and improvements to ARKit with version 2.0.

What sort of apps can be built with ARKit

There are a lot of apps on the App Store that use ARKit. One of the most famous AR apps available on the App Store is Pokemon Go. Another well-known app is ModiFace 3D, which lets you see how you would look with a new shade of hair color.

ARCore vs ARKit

Obviously, Google's ARCore is restricted to Android, whereas Apple's ARKit is confined to iOS devices. This means the two will coexist for as long as demand for Android and iOS phones does.

With over two billion devices, Android is the biggest mobile platform on the planet. While Android has the larger share of the overall market, Apple has considerably tighter control over its devices, which means there is little to separate the two here.

Apple does not have to depend on OEMs, and it avoids the problems of a fragmented Android market and a supply chain with many players. That said, Google is working with phone manufacturers to ensure that the majority of recent devices meet the criteria these applications need to run at the expected quality and performance levels.

Google states that ARCore is built on three fundamentals: motion tracking, environmental understanding, and light estimation. It is also developing supplementary applications and services, such as Blocks and Tilt Brush, to ease the workload for developers.

Apple, in turn, offers the TrueDepth camera, scene understanding, visual-inertial odometry, and light estimation, as well as high-performance rendering and hardware.
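
To make those building blocks concrete, here is a minimal Swift sketch of a view controller that runs an ARKit world-tracking session with plane detection and light estimation switched on. It assumes a standard ARKit/SceneKit setup; the class name ARDemoViewController is purely illustrative and not part of Apple's API.

```swift
import UIKit
import ARKit

// A minimal sketch: run an ARKit world-tracking session with plane detection
// and light estimation enabled. "ARDemoViewController" is an illustrative name.
class ARDemoViewController: UIViewController {

    // ARSCNView pairs an ARSession with SceneKit rendering.
    let sceneView = ARSCNView()

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.frame = view.bounds
        sceneView.autoresizingMask = [.flexibleWidth, .flexibleHeight]
        view.addSubview(sceneView)
    }

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)

        // World tracking is ARKit's visual-inertial odometry (VIO) based tracking.
        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = [.horizontal, .vertical]  // scene understanding
        configuration.isLightEstimationEnabled = true            // light estimation

        sceneView.session.run(configuration)
    }

    override func viewWillDisappear(_ animated: Bool) {
        super.viewWillDisappear(animated)
        // Pause the session so the camera and sensors stop when the view goes away.
        sceneView.session.pause()
    }
}
```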

Still, it is clear that with these two toolkits, Augmented Reality technology is shifting from hype into an age of practical use. Both Google and Apple are taking it seriously, which should spur further AR discovery, innovation, and competition going forward.

ARKit

Launch: September 2017

Compatible devices: any iOS device with an A9, A10, or A11 Bionic processor. This covers iPhone X, iPhone 8 and 8 Plus, iPhone 7 and 7 Plus, iPhone 6s and 6s Plus, all iPad Pro models, and the 9.7-inch iPad launched in 2017.

Key features:

1. Tracking: employs visual-inertial odometry (VIO) to keep virtual and real spaces in agreement, combining motion-sensor data with computer-vision analysis of the surroundings.

2. Rendering: straightforward integration with SpriteKit, SceneKit, and Metal, with additional support for Unity and Unreal Engine.

3. Scene understanding and light estimation: detects real-world surfaces such as floors, tables, and walls, and uses the iPhone's camera sensor to estimate the total amount of light in a scene, then applies matching lighting and shading to virtual objects (see the Swift sketch after this comparison).

ARCore

Launch: March 2018

Compatible devices: select Android phones running Android 7.0 or later.

Key features:

1. Motion tracking: uses the phone's camera to track feature points, combined with IMU sensor data, to determine the device's position and orientation so that virtual objects stay precisely placed.

2. Environmental understanding: detects horizontal surfaces using the same feature points it relies on for motion tracking.

3. Light estimation: perceives the ambient light in the environment and lights virtual objects to match their surroundings, making them look considerably more realistic.
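
As a companion to the ARKit feature list above, here is a hedged Swift sketch of how scene understanding and light estimation surface in ARKit's SceneKit delegate callbacks. The class name PlaneVisualizer is a hypothetical helper introduced only for illustration.

```swift
import UIKit
import ARKit
import SceneKit

// A sketch of reacting to detected planes and reading the light estimate.
// "PlaneVisualizer" is a hypothetical helper class, not an ARKit type.
class PlaneVisualizer: NSObject, ARSCNViewDelegate {

    // Called when ARKit adds an anchor for a newly detected surface.
    func renderer(_ renderer: SCNSceneRenderer,
                  didAdd node: SCNNode,
                  for anchor: ARAnchor) {
        guard let planeAnchor = anchor as? ARPlaneAnchor else { return }

        // Visualize the detected plane as a translucent overlay.
        let extent = planeAnchor.extent
        let plane = SCNPlane(width: CGFloat(extent.x), height: CGFloat(extent.z))
        plane.firstMaterial?.diffuse.contents = UIColor.cyan.withAlphaComponent(0.3)

        let planeNode = SCNNode(geometry: plane)
        planeNode.eulerAngles.x = -.pi / 2  // SCNPlane is vertical by default
        node.addChildNode(planeNode)
    }

    // Called every frame; a convenient place to read the current light estimate.
    func renderer(_ renderer: SCNSceneRenderer, updateAtTime time: TimeInterval) {
        guard let sceneView = renderer as? ARSCNView,
              let estimate = sceneView.session.currentFrame?.lightEstimate else { return }

        // ambientIntensity is in lumens; around 1000 corresponds to neutral lighting.
        sceneView.scene.lightingEnvironment.intensity = estimate.ambientIntensity / 1000
    }
}
```

Assigning an instance of such a class as the ARSCNView's delegate is enough to highlight detected planes and light virtual content to roughly match the room.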

In a nutshell, the arrival of the two frameworks has come as a breath of fresh air for developers: each ships with features tailored to its particular OS, making it easier to build AR mobile applications that take full advantage of the hardware on board Android and Apple devices.

Ask us for demos of AR apps we have developed, or get yours developed today.
