At this week’s big Apple Event, the tech company dropped a variety of new products, including new iMacs, an iPad Pro …and a purple iPhone. One thing you won’t find in the announcements, however, is any mention of ARKit, the company’s augmented reality platform for iOS.

Does that mean Apple didn’t make any spatial computing announcements? No—you just have to know where to look.

Here’s what you need to know from the event.

[Image: The Precision Finding feature enabled by AirTags]

AirTags mean spatial information

During the event, Apple featured a new product called AirTags. These are small, coin-shaped beacons that you can attach to an object like a keychain, an iPhone, a camera bag, or anything else that you’d like to track in the Find My app. (Except for children or pets, apparently.)

At first glance, they seem like nothing more than a beacon for finding your stuff. But they have important implications for spatial computing.

Why is that? The tags include a U1 chip, which enables them to communicate and share spatial information with any other Apple device that also has a U1 chip.

Here’s how it works: You open the Find My app on a U1-equipped iPhone (iPhone 11 or later), and then access the Precision Finding feature. The phone uses its U1 chip to locate the AirTag, then tells you the direction of the lost item, as well as its distance from your current location.

In short, the AirTag offers a precise way to locate items in space. This is spatial computing at its most practical, and an important building block for future AR experiences and products.
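AirTag ranging itself isn’t exposed to third-party apps, but Apple’s NearbyInteraction framework (introduced with iOS 14) gives a sense of what U1-powered distance and direction data looks like to developers. Here’s a minimal sketch of ranging between two U1-equipped iPhones; exchanging discovery tokens over your own networking layer (MultipeerConnectivity, a server, etc.) is assumed:

```swift
import NearbyInteraction
import simd

// Minimal sketch: U1-based ranging between two nearby iPhones using
// Apple's NearbyInteraction framework. How the discovery tokens get
// exchanged (MultipeerConnectivity, your own server, etc.) is up to you.
class RangingController: NSObject, NISessionDelegate {
    let session = NISession()

    override init() {
        super.init()
        session.delegate = self
    }

    // Send `session.discoveryToken` to the peer, and call this once
    // the peer's token arrives back.
    func startRanging(with peerToken: NIDiscoveryToken) {
        let config = NINearbyPeerConfiguration(peerToken: peerToken)
        session.run(config)
    }

    // The U1 chip delivers fresh measurements several times per second.
    func session(_ session: NISession, didUpdate nearbyObjects: [NINearbyObject]) {
        guard let object = nearbyObjects.first else { return }
        if let distance = object.distance {
            print("Distance: \(distance) meters")
        }
        if let direction = object.direction {
            // A unit vector pointing toward the peer in the device's
            // coordinate space -- the kind of data an on-screen arrow needs.
            print("Direction: \(direction)")
        }
    }
}
```

That distance-and-direction pair is the same kind of data that drives Precision Finding’s on-screen arrow.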

Extra credit: How does a U1 chip work?

U1 chips use a technology called ultra-wideband to enable devices to “talk” to each other. This is a short-range wireless communication protocol that uses high-frequency radio pulses to provide spatial and directional data. It also relies on time of flight (the same principle that makes lidar possible) to determine the distance between objects. Crucially, ultra-wideband locates objects in space far more precisely than GPS or Bluetooth can.
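To make the time-of-flight idea concrete: radio pulses travel at the speed of light, so halving the measured round-trip time gives the one-way distance. A toy Swift sketch of the arithmetic (not any actual chip API):

```swift
// Toy illustration of time-of-flight ranging -- not a real chip API.
// A UWB pulse travels at the speed of light, so half the round-trip
// time tells you how far away the other device is.
let speedOfLight = 299_792_458.0  // meters per second

func distance(forRoundTripTime seconds: Double) -> Double {
    (speedOfLight * seconds) / 2.0
}

// A round trip of about 33 nanoseconds puts the object ~5 meters away.
print(distance(forRoundTripTime: 33.3e-9))  // ≈ 4.99 meters
```

Because UWB pulses are only nanoseconds long, the chip can timestamp them very precisely; that timing precision is what Bluetooth’s signal-strength-based estimates can’t match.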

Apple is not the only company using ultra-wideband. Samsung has built the technology into its devices as well, including the Galaxy Note20 Ultra 5G.

[Image: The new iPad Pro]

A more powerful iPad Pro means better spatial computing

If you look more closely at the new iPad Pro, you’ll see Apple laying more groundwork for AR and spatial computing. Among its numerous upgrades, the device features a lidar sensor and a new 12MP “Ultra Wide” camera.

These sensors considerably improve the device’s spatial awareness. The lidar adds depth-sensing capabilities, and the wider camera makes it possible for the device to “see” more in its field of view. That means it can better track your position in your environment, a necessary step for better AR experiences.

Beyond the cameras, the device includes Apple’s much-vaunted M1 chip, which significantly increases its computing power. The iPad Pro will be faster at processing the spatial information its array of sensors gathers about the real world, and more capable of rendering AR visuals as needed. In other words, an improvement all around.
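For developers, those lidar capabilities surface in ARKit as scene reconstruction and per-frame scene depth. Here’s a minimal configuration sketch, assuming a lidar-equipped device like the new iPad Pro:

```swift
import ARKit

// Minimal sketch: turning on the lidar-backed features of an ARKit
// session. Both options require a lidar-equipped device.
let configuration = ARWorldTrackingConfiguration()

// Build a live 3D mesh of the surroundings from the lidar sensor.
if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
    configuration.sceneReconstruction = .mesh
}

// Also request a per-frame depth map alongside the camera image.
if ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) {
    configuration.frameSemantics.insert(.sceneDepth)
}

// Then run it on an existing AR view's session:
// arView.session.run(configuration)
```

The runtime capability checks matter: on devices without lidar, they keep the app from requesting options the session can’t deliver.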

Lidar, lidar, lidar!

Apple may not have made a show of its AR technology at the event, but the company did demo a number of new AR experiences built with the iPad’s new lidar sensor.

To see more, check out the Twitter thread below: