Apple’s iOS 11 is coming to compatible devices starting September 19th, which means that if you own a recent Apple iPhone or iPad, you may have an augmented reality-capable device in your hands before the new iPhone line even launches.
Apple’s big iPhone 8/8 Plus/X unveiling this year promised a bevy of augmented reality news, thanks to the release of ARKit earlier this summer. As a tool that lets developers build AR games and apps for what Apple says will amount to “hundreds of millions of iPhones and iPads,” ARKit had our hopes high for a slew of app announcements.
While we only saw four AR apps revealed on stage demonstrating the phones’ AR capabilities, Apple has said in the past that it’s working with Pokémon GO creator Niantic, IKEA, and Lego, among others, to bring AR apps to the App Store. To that end, starting this month any iPhone, iPad, or iPod touch that can upgrade to iOS 11 will be able to get in on the action, which the company says will let you do things like “redecorate your home, explore a city you’ve never visited, or even try on a new tattoo.”
Apple is advertising the new iPhone line as custom designed “for the ultimate augmented reality experience,” featuring specially calibrated cameras, a nearly bezel-free screen, and the new A11 Bionic processor that drives the devices’ room- and face-mapping power. That may not be enough to justify the $1,000 iPhone X upgrade for many, but if you’re looking for the most capable AR phone out there, you can bet it’s going to be an Apple product until other manufacturers get in the game.
Google also recently released its own AR developer kit, ARCore, which aims to bring similar AR abilities to “100 million [Android] devices at the end of preview.” Google is working with Samsung, Huawei, LG, ASUS, and unnamed others to accomplish it, making AR the next battleground for the competing brands.
Reports indicate the new iPhones set to be revealed by Apple next week could function as best-in-class augmented reality devices.
Apple revealed its ARKit platform earlier this year for developing augmented reality apps on hundreds of millions of its existing handheld devices. Since the release we’ve seen a series of impressive applications previewed on the Internet. Videos capture the immersive potential of ARKit applications incredibly well, even if the apps may be less impressive when viewed first-hand, since they are delivered on a phone screen rather than through glasses placed directly in front of the eyes.
The technical difficulty of developing fashionable AR glasses people want to wear all day is extraordinary, and we’d be extremely shocked to see Apple release something along these lines in 2017. Instead, we expect the next iPhone to debut with new outward-facing cameras that improve upon the ARKit functionality Apple already debuted earlier this year, according to a report by Bloomberg.
The device is also expected to include a front-facing 3D camera that, while reportedly intended for payments or security, might also have AR applications. Let me preface this by saying we’ve heard no solid reports that Apple plans to include the following functionality in the new devices, but I certainly hope it comes to pass. Technology Apple acquired in 2015 could make FaceTime video calls much more fun: that year, Apple bought a company called Faceshift, whose technology captured people’s facial expressions and transferred them in real time onto a myriad of cartoon characters. It is feasible this device could let people optionally hold an entire video chat as animated cartoon characters, with all their expressions captured.
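To make the speculation concrete: if Apple exposed the front 3D camera to developers through ARKit’s face-tracking APIs, driving a cartoon character from live expressions might look roughly like this sketch. The `ARFaceTrackingConfiguration` and `ARFaceAnchor` names follow ARKit’s face-tracking API; the `CartoonRig` avatar is purely hypothetical.

```swift
import ARKit

// Hypothetical cartoon rig driven by facial expression coefficients.
protocol CartoonRig {
    func setMouthOpen(_ amount: Float)
    func setLeftEyelid(_ amount: Float)
}

class FaceDriver: NSObject, ARSessionDelegate {
    let session = ARSession()
    var avatar: CartoonRig?

    func start() {
        // Face tracking requires hardware with a front-facing depth camera.
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let face as ARFaceAnchor in anchors {
            // Blend shapes are 0...1 coefficients for expressions like jawOpen.
            let jawOpen = face.blendShapes[.jawOpen]?.floatValue ?? 0
            let blinkLeft = face.blendShapes[.eyeBlinkLeft]?.floatValue ?? 0
            avatar?.setMouthOpen(jawOpen)
            avatar?.setLeftEyelid(blinkLeft)
        }
    }
}
```

The interesting part is that the heavy lifting (landmark detection, expression solving) would all happen in the system frameworks; the app only maps coefficients onto its character.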
Lastly, reports indicate Apple is paying Samsung a fortune for the OLED displays used in the new iPhone — contributing to the expected high cost of the device while handing Apple’s chief rival a major slice of the profits that have driven Apple’s extraordinary growth over the last decade. Apple is said to be sourcing the component exclusively from Samsung, which puts OLED screens in all of its own phones compatible with its Gear VR headset. Samsung seems to have a lock on the market for these displays; reports indicate Apple is considering an investment in LG to spur that company’s efforts so that it, too, can produce these components at the scale iPhone production requires.

While Microsoft’s upcoming headsets are expected to use LCD screens with “impulse backlighting,” the vast majority of existing VR headsets have relied on Samsung’s OLED display technology to function. That said, we saw an excellent preview of a VR headset based on LG’s screens earlier this year.

The takeaway here is that, with OLED screens finally powering the iPhone after years without, it might be possible to stick the new iPhone into a VR headset similar to the Gear VR and have a higher quality experience than has been possible with Apple devices in the past. Developers using ARKit have already tested inside-out tracking with earlier iPhones and found it to do a surprisingly solid job. With a higher quality display, developers might be able to prepare more enjoyable VR apps for release on Apple’s App Store.
Over the past few weeks I’ve been steeping myself in the developer and investor community that is quickly sprouting up around ARKit.
There are a variety of reasons that people are excited about the possibilities but it’s safe to say that the number one positive that’s shared by everyone is the sheer scale of possible customers that will be able to experience augmented reality on day one of iOS 11. Hundreds of millions of potential users before the year is out is a potent pitch.
I’ve seen some very cool things from one- and two-person teams, and I’ve seen big corporate developers flex their muscle and get pumped about how capable AR can be.
At a round-robin demo event yesterday with a bunch of developers of AR apps and features, I got a nice cross-section of possible AR applications. Though all of them were essentially consumer focused, there was an encouraging breadth to their approaches and some interesting overall lessons that will be useful for developers and entrepreneurs looking to leverage ARKit on iOS.
Let me blast through some impressions first and then I’ll note a few things.
IKEA Place
What it does: Allows you to place actual-size replicas of IKEA sofas and armchairs in your house. 2,000 items will be available at launch.
How it works: You tap on a catalog that lets you search and select items. You tap once to have an item hover over your floor, rotate it with a finger, and tap again to place it. The colors and textures are accurately represented, and these are fully re-worked 3D models based on the scans IKEA uses for its catalogs. It looks and works great, just as you’d expect. IKEA Leader of Digital Transformation Michael Valdsgaard says it took them about seven weeks, beginning slightly before Apple’s announcement of ARKit, to implement the mode. It will be exclusive to iOS for now because iOS is the largest single target of AR-capable devices. I asked Valdsgaard how long it took to get a first version up and running, and he said just a couple of weeks. This has been a holy grail for furniture and home goods manufacturers and sales apps for what seems like forever, and it’s here.
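For developers curious how fast a first version can come together, the tap-to-place interaction described above maps closely onto ARKit’s built-in hit-testing. A minimal sketch, with a placeholder box standing in for a real furniture model:

```swift
import ARKit
import SceneKit

class PlacementViewController: UIViewController {
    @IBOutlet var sceneView: ARSCNView!

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.addGestureRecognizer(
            UITapGestureRecognizer(target: self, action: #selector(handleTap(_:))))
    }

    @objc func handleTap(_ gesture: UITapGestureRecognizer) {
        let point = gesture.location(in: sceneView)
        // Hit-test the tap against planes ARKit has already detected.
        guard let hit = sceneView.hitTest(point, types: .existingPlaneUsingExtent).first
            else { return }
        // Placeholder geometry standing in for a downloaded 3D furniture model.
        let chair = SCNNode(geometry: SCNBox(width: 0.5, height: 0.9,
                                             length: 0.5, chamferRadius: 0))
        let t = hit.worldTransform.columns.3
        chair.position = SCNVector3(t.x, t.y, t.z)
        sceneView.scene.rootNode.addChildNode(chair)
    }
}
```

The production work, as the IKEA timeline suggests, is mostly in the models and catalog plumbing, not the AR placement itself.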
Food Network In The Kitchen
What it does: Lets you place and decorate virtual desserts like cupcakes. Allows you to access the recipe for the base dessert.
How it works: You drop a dessert onto a surface and are provided with a bunch of options that let you decorate a cupcake. A couple of things about this demo: First, it worked just fine and was very cute. A little animated whale and some googly eyes topping a cupcake, which you can then share, is fine. However, it also demonstrates how some apps will treat AR as a “fun extra” (the button is literally labeled “Fun”) rather than as integral to the experience. This is to be expected in any first wave of a new technology, but examples like KabaQ show that there are other opportunities in food.
GIPHY World
What it does: Allows you to place gifs in 3D space, share videos of them, or even share the whole 3D scene in AR with friends who have the app. They can then add, remix, and re-share new instances of the scene. As many people as you want can collaborate on the space.
How it works: You drop gifs into the world in the exact position you want them. A curated and trending mix of gifs with built-in transparency is the default, but you can also flip over to place any old gif on the platform. Every scene gets a unique URL that can be remixed and added to by the people you share it with, effectively creating a shared gif space that can be ping-ponged around. The placement of gifs felt very logical and straightforward, but the ability to “paint” with gifs and then share whole scenes collaboratively was a pleasant surprise. One impressive example was leaving a pathway to a “message” that a friend could follow when you shared the scene with them. Ralph Bishop, GIPHY’s head of design, says the app will be free like their other apps but will have branded partners providing some content. GIPHY has something interesting going on here with a social AR experience. It’s early days, but this seems promising.
Arise
What it does: It’s a game from Climax Studios that places a (scalable) 3D world full of crumbling ruins onto your tabletop, which you help your character navigate without any traditional controls.
How it works: You look through your device like a viewport and align the perspective of the various pathways to allow your character to progress. There are no on-screen controls at all, which is a very interesting trend. Climax CEO Simon Gardner says what made translating the game into AR attractive to the studio (which has been around for 30 years) was the potentially huge install base of ARKit. They’re able to target hundreds of millions of potential customers by implementing a new technology, rather than the typical scenario where you start at effectively zero. The experience was also highly mobile, requiring you to move around the scenes to complete them. Some AR experiences may well be limited in their use or adoption because many people use their phones in places where they are required to be stationary.
The Very Hungry Caterpillar AR
What it does: Translates the incredibly popular children’s book into AR.
How it works: The story unfolds by launching the app and simply pointing at objects in the scene. We saw just a small portion of the app that had apples being coaxed from a tree and the caterpillar scooching its way through them to grow larger. This was my favorite demo of the day, largely because it was cute, clever and just interactive enough for the age level it is targeting. It’s also another ‘zero controls’ example, which is wonderful for small children. Touch Press CEO Barry O’Neill says that they’ve seen some very interesting behavior from kids using the app including getting right down at eye level with the tiny caterpillar — which meant that they really had to up-res the textures and models to keep them looking great. Now that ARKit enables capturing any plane and remembering where objects are (even if you move 30-50 feet away and come back), storytelling in AR is finally moving beyond marker-based book enhancements. Any surface is a book and can tell a story.
The Walking Dead: Our World
What it does: It’s a location-aware shooter that has you turning in place to mow down zombies with various weaponry.
How it works: The scene I saw looked pretty solid, with high-resolution zombies coming at you from all angles, forcing you to move and rotate to dodge and fend them off. You progress by “rescuing” survivors from the show, who provide you with unique additional capabilities. Environmental enhancements like virtual “sewers” that walkers can crawl up out of give each scene a unique feel. It looked fast and smooth on a demo iPad. AMC and Next Games collaborated on this title. There were some additional fun features, like the ability to call up various poses on a survivor like Michonne and stand next to them to take a selfie, which felt super cool. The best kinds of IP-based games and apps will focus on unlocking these kinds of “bring that world into your world” experiences rather than cookie-cutter gameplay.
Some interesting notes:
Every app had its own unique take on the ‘scanning’ process that allows ARKit to perform plane detection before it can begin placing objects in the world: basically a few seconds where you’re encouraged to move the phone around a bit to find flat surfaces and record enough points of data to place and track objects. It’s not onerous and never took more than a few seconds, but it is something users will have to be educated on. IKEA’s conversational interface prompted people to “scan” the room; The Walking Dead suggested that you “search for a survivor”; and Food Network’s app went with a “move around!” badge. Everyone will have to think about how to prompt and encourage this behavior to make ARKit work properly.
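In code, that scanning step is essentially a single configuration flag plus a delegate callback, which is the natural place to dismiss whatever “move your phone around” prompt an app chooses. A rough sketch, assuming a SceneKit-backed view and a simple prompt label:

```swift
import ARKit

class ScanPromptViewController: UIViewController, ARSCNViewDelegate {
    @IBOutlet var sceneView: ARSCNView!
    @IBOutlet var promptLabel: UILabel!   // e.g. "Move around!" or "Scan the room"

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        sceneView.delegate = self
        let config = ARWorldTrackingConfiguration()
        config.planeDetection = .horizontal   // iOS 11 ARKit detects horizontal planes
        sceneView.session.run(config)
        promptLabel.isHidden = false          // show the scanning prompt
    }

    // Fires once ARKit has gathered enough feature points to anchor a plane.
    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        guard anchor is ARPlaneAnchor else { return }
        DispatchQueue.main.async { self.promptLabel.isHidden = true }
    }
}
```

The differentiation between apps, as noted above, is entirely in the prompt copy and presentation, not the underlying detection.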
Aside from the apps that are about placing objects directly into the scene, there is a focus on little-to-no on-screen controls. For Arise, your perspective is the control, allowing you to get an alignment that worked to progress the character. There are no buttons or dials on the screen by design.
The Very Hungry Caterpillar’s control methodology was based on focus. The act of pointing at an object and leaving your gaze on it caused the story to progress and actions to be taken (knocking fruit out of a tree for the caterpillar to munch on or encouraging it to go to sleep on a stump). Most of the other apps relied on something as simple as a single tap for most actions. I think this control-free or control-light paradigm will be widespread. It will require some rethinking for many apps being translated.
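A gaze-style control like the Caterpillar’s can be approximated by hit-testing from the center of the screen each frame and accumulating dwell time. This is a hypothetical sketch of the pattern — the one-second threshold and the `trigger` hook are assumptions, not Touch Press’s actual implementation:

```swift
import ARKit
import SceneKit

class GazeController: NSObject, ARSCNViewDelegate {
    let sceneView: ARSCNView
    var focusedNode: SCNNode?
    var focusStart: TimeInterval = 0
    let dwellSeconds: TimeInterval = 1.0   // assumed dwell threshold

    init(sceneView: ARSCNView) {
        self.sceneView = sceneView
        super.init()
        sceneView.delegate = self
    }

    func renderer(_ renderer: SCNSceneRenderer, updateAtTime time: TimeInterval) {
        DispatchQueue.main.async {
            let center = CGPoint(x: self.sceneView.bounds.midX,
                                 y: self.sceneView.bounds.midY)
            // SceneKit hit test against the virtual objects under the screen center.
            let node = self.sceneView.hitTest(center, options: nil).first?.node
            if node !== self.focusedNode {
                self.focusedNode = node       // gaze moved; restart the dwell timer
                self.focusStart = time
            } else if let node = node, time - self.focusStart > self.dwellSeconds {
                self.focusStart = time        // reset so the action fires once per dwell
                self.trigger(node)
            }
        }
    }

    func trigger(_ node: SCNNode) {
        // Advance whatever story beat is tied to this object,
        // e.g. knock fruit from a tree for the caterpillar.
    }
}
```

Because the device itself is the pointer, this pattern needs no on-screen UI at all, which is exactly why it suits young children.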
Development times were incredibly short, all things considered. Some of the apps I saw were created or translated to ARKit nearly wholesale within 7-10 weeks. For asset-heavy apps like games this will obviously be a tougher ramp, but not if you already have the assets. GIPHY World, for instance, places a bunch of curated gifs that look great floating in the world at your fingertips, but you can easily drop regular gifs in there from their millions of options.
Models that Touch Press used for its previous Caterpillar app had to be upscaled quite a bit in complexity and detail, because they fully expect children to experience them at distances as close as inches. IKEA also had to update its models and textures. But given that the onramp is measured in days or weeks instead of months, I’d expect to see a large number of apps supporting ARKit experiences at the launch of iOS 11 in September, with a bunch to follow quickly.
A preview of the first wave of AR apps coming to iPhones
September 7th, 2017 · leeyang