RealityKit, Reality Composer & ARKit 3

Posted by:
Lizy Lal

July 8, 2019

At their 2019 Worldwide Developers Conference (WWDC), Apple announced an exciting new 3D framework and a set of tools for building and integrating augmented reality experiences: RealityKit, Reality Composer and ARKit 3.

At the time of writing, RealityKit, Reality Composer and ARKit 3 are available in the beta versions of iOS 13 and macOS Catalina, with release planned for September 2019.

This announcement builds on the AR Quick Look and USDZ file format support that Apple introduced with iOS 12 last year.


What is RealityKit?

RealityKit is a brand new Swift framework bundled with Xcode 11 for creating the next generation of mobile AR experiences. It is built from the ground up on the power of ARKit 3 and allows you to augment the real world with virtual 3D objects, scenes, animations and audio, and to add events that enable user-driven interactions. This is a really exciting development, making AR experiences easier to create, more interactive and more engaging.

Exciting new features:

  • Physics simulations model complex interactions between virtual objects and the real world they are integrated into. RealityKit provides a collision detection system that supports several different proxy shapes and simulates rigid body dynamics.
  • The spatial audio capability within RealityKit understands the 3D space and dynamically plays audio content as objects are interacted with. It makes far-away virtual objects sound far away and vice versa, which gives the impression the object is in the real world! (A sketch of attaching spatial audio to an entity appears below this list.)
  • Networking support is built in, enabling entire scenes to be synchronised across multiple devices, including a shared representation of real-world data. This allows multiple people to engage and interact with the AR experience at the same time, which is particularly useful in an educational setting.
  • RealityKit uses an entity-component system to represent object data.
    • The Anchor entity is often the root entity of the experience and specifies what object in the real world you would like to anchor your AR content to, e.g. an image or an object. It could also anchor to a body, for example, which opens up opportunities to teach medical or nursing students about injuries and disease represented on real-life people.
    • The Model entity is the workhorse of the experience and specifies what content you want displayed. Entities are made up of components: components describe the behaviours and data that can be added to individual entities and can be re-used between all objects in the scene. The Model entity comprises Model, Physics and Collision components. You can load a USDZ or Reality file directly into a Model entity and add that entity as a child of the Anchor entity (see the first sketch below this list).
      • Meshes define the geometric structure of the Model entity.
      • Materials define how a model should look. They provide the look and feel of the object and how it interacts with the lighting around it. RealityKit uses the light from the real world to simulate what the object will look like; this is called physically based rendering. The OcclusionMaterial is a new material that hides whatever virtual content it covers, so virtual objects can appear to move in front of or behind real-world objects.

RealityKit Entity-Components
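To make this concrete, here is a minimal sketch of the pattern described above: load a USDZ file into a Model entity, give it collision shapes for the physics system, and add it as a child of an Anchor entity. It assumes an ARView is already on screen and that a file named toy_robot.usdz is bundled with the app (both hypothetical names).

```swift
import RealityKit

// Minimal sketch: anchor a USDZ model to a horizontal plane.
// Assumes an ARView is already set up and "toy_robot.usdz" is a
// hypothetical asset bundled with the app.
func placeModel(in arView: ARView) {
    // The Anchor entity is the root; here it targets any horizontal plane.
    let anchor = AnchorEntity(plane: .horizontal)

    // Load the USDZ file directly into a Model entity.
    if let model = try? Entity.loadModel(named: "toy_robot") {
        // Generate collision shapes so the model can take part in
        // physics simulation and hit-testing.
        model.generateCollisionShapes(recursive: true)

        // Add the Model entity as a child of the Anchor entity.
        anchor.addChild(model)
    }

    // Adding the anchor to the scene places the content in the world.
    arView.scene.addAnchor(anchor)
}
```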
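Spatial audio follows the same entity-centric pattern: a sound resource is attached to an entity, and RealityKit pans and attenuates it according to where that entity sits in 3D space. A hedged sketch, assuming a bundled audio file named engine_hum.mp3 (hypothetical):

```swift
import RealityKit

// Attach a looping sound to an entity; RealityKit spatialises it
// at the entity's position. "engine_hum.mp3" is a hypothetical asset.
func attachSpatialAudio(to entity: Entity) {
    if let sound = try? AudioFileResource.load(named: "engine_hum.mp3",
                                               shouldLoop: true) {
        // The sound appears to come from wherever the entity is and
        // grows quieter as the entity moves away from the listener.
        entity.playAudio(sound)
    }
}
```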

 

What is Reality Composer?

Reality Composer is a tool that enables developers to build interactive 3D content for AR. It lets you add behaviours and simple interactions to your scene that help bring the AR experience to life.

A new Reality Composer project comes pre-configured with a basic scene layout and default content, which makes it simple for beginners to get started building an AR experience. Virtual objects can be added to the scene either from Reality Composer's object library or from an externally created USDZ file; they can then be animated, or made interactive by adding behaviour actions triggered on tap, move and other events. Reality Composer also lets you easily preview the content in AR while you build the project.

Reality Composer is available on macOS Catalina in Xcode 11 and as an app on iOS 13 for iPhone and iPad. When an app is built via Xcode, the Reality Composer project is compiled and optimised into a Reality file, and a Swift API object is automatically generated for it; this enables adding further code and behaviour to the content created in Reality Composer. A Reality file contains all the data required for rendering and simulating the AR experience: it is optimised for use by RealityKit and contains all the relevant scenes, entities and components needed to render the experience. A Reality file can be referenced in an application or via AR Quick Look using ARKit 3.
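As a sketch of what the generated API looks like: Xcode's AR app template bundles a Reality Composer project named Experience.rcproject containing a scene called Box, and generates an Experience.loadBox() method for it. The names below come from that template; your own project and scene names will differ.

```swift
import UIKit
import RealityKit

class ViewController: UIViewController {
    @IBOutlet var arView: ARView!

    override func viewDidLoad() {
        super.viewDidLoad()

        // Load the "Box" scene from the Swift API that Xcode generated
        // for Experience.rcproject (names from the AR app template).
        if let boxScene = try? Experience.loadBox() {
            // The scene carries its own anchoring information,
            // so appending it is enough to place the content.
            arView.scene.anchors.append(boxScene)
        }
    }
}
```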

 

What are the new features in ARKit 3?

ARKit 3 provides an innovative foundation for RealityKit. With all the changes in ARKit 3, augmented reality experiences will look better and feel more natural. New features include people occlusion, motion capture, collaborative sessions, simultaneous use of the front and back cameras, tracking of multiple faces and more.

  • Using computer vision, ARKit 3 understands the position of people in the scene. Knowing where a person is allows the system to correctly situate virtual objects relative to real people, rendering those objects in front of or behind a person depending upon which is closer to the camera. This enables a new feature called people occlusion, which effectively lets the person in the real world look immersed in the scene; in prior versions of ARKit, virtual objects would always show ‘on top’ of anyone in the scene. The feature uses machine learning to segment the scene, identifying people and estimating a depth for each object, and then reassembles the scene with each object occluded according to its position. It runs on the Apple Neural Engine in the A12 Bionic chip and is therefore only available on the latest A12-based iPhone and iPad Pro models. This new feature creates exciting prospects for fun, engaging, immersive AR experiences. (A configuration sketch follows this list.)

People Occlusion example

  • Using the new real-time motion capture feature you can now track body movement and convert that movement into the animation of a 2D or 3D avatar in the scene. It uses a new type of anchor called ARBodyAnchor, which contains a 3D skeleton representation and can easily be created and added to a scene in Reality Composer (see the motion capture sketch after this list).
  • World tracking can be done with the back camera whilst simultaneously face tracking with the TrueDepth front camera. This means your face can now interact with the scene being viewed by the back camera, and up to three faces can be tracked at any one time. This also uses machine learning on the A12 chip, so it is only available on A12-based iPhone and iPad Pro devices.
  • ARKit 3 enables collaborative sessions, so AR experiences can be shared across multiple devices.
  • ARKit 3 also cleverly matches camera motion blur and graininess on virtual objects, so they look and behave as if they were really being viewed by a physical camera in the real world.
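As referenced in the first bullet above, people occlusion is enabled by opting in to a frame semantic on the session configuration. A minimal sketch, including the runtime availability check the feature requires because it relies on the A12's Neural Engine:

```swift
import ARKit

// Opt in to people occlusion on a world-tracking session.
let configuration = ARWorldTrackingConfiguration()
if ARWorldTrackingConfiguration.supportsFrameSemantics(.personSegmentationWithDepth) {
    // Segment people in the camera feed and estimate their depth so
    // virtual content can be drawn behind them when they are closer.
    configuration.frameSemantics.insert(.personSegmentationWithDepth)
}
// arView.session.run(configuration)  // run on your ARView's session
```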
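Motion capture follows a similar opt-in pattern: run a body-tracking configuration and attach a rigged character to a body-anchored entity. A hedged sketch; the asset name biped_robot is hypothetical, and the USDZ must contain a skeleton compatible with ARKit's body tracking:

```swift
import RealityKit
import ARKit

// Drive a rigged 3D avatar from a tracked person's movement.
func startMotionCapture(on arView: ARView) {
    // Body tracking requires an A12-class device.
    arView.session.run(ARBodyTrackingConfiguration())

    // Anchor content to the tracked body (backed by an ARBodyAnchor).
    let bodyAnchor = AnchorEntity(.body)

    // "biped_robot" is a hypothetical skeleton-compatible USDZ asset.
    if let avatar = try? Entity.loadBodyTracked(named: "biped_robot") {
        bodyAnchor.addChild(avatar)  // the avatar now mirrors the person
    }

    arView.scene.addAnchor(bodyAnchor)
}
```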

References:

https://developer.apple.com/augmented-reality/

https://developer.apple.com/documentation/realitykit

