Apple’s iOS 11 is coming to compatible devices starting September 19th, which means that if you own a recent Apple iPhone or iPad, you may have an augmented reality-capable device in your hands before the new iPhone line even launches.
Apple’s big iPhone 8/8 Plus/X unveiling this year promised a bevy of information surrounding augmented reality, thanks to the release earlier this summer of ARKit. With a tool that lets developers make AR games and apps on what Apple says will amount to “hundreds of millions of iPhones and iPads,” we had our hopes pretty high for a slew of app announcements.
While we only saw four AR apps revealed on stage demonstrating the phone’s AR capabilities, Apple has said in the past that it’s working with Pokémon GO creator Niantic, IKEA, and Lego, among others, to bring AR apps to the App Store. To that end, starting this month any iPhone, iPad or iPod touch that can upgrade to iOS 11 will be able to get in on the action, which the company says will let you do things like “redecorate your home, explore a city you’ve never visited, or even try on a new tattoo.”
Apple is advertising the new iPhone line as custom designed “for the ultimate augmented reality experience,” featuring specially calibrated cameras, a screen low on bezels, and the new A11 Bionic processor that drives the room- and face-mapping power of the new devices. That may not be enough for many to justify the $1,000 iPhone X upgrade, but if you’re looking for the most capable AR phone out there, you can bet it’s going to be an Apple product until other manufacturers get in the game.
Google also recently released an AR developer kit, ARCore, which aims to give similar AR abilities to “100 million [Android] devices at the end of preview.” Google is working with Samsung, Huawei, LG, ASUS and unnamed others to accomplish this, making AR the next battleground for the competing brands.
Reports indicate the new iPhones on tap to be revealed next week by Apple could function as a best-in-class augmented reality device.
Apple earlier this year revealed its ARKit platform for developing augmented reality apps on hundreds of millions of its existing handheld devices. Since the release we’ve seen a series of impressive applications previewed on the Internet. Videos capture the immersive potential of ARKit applications incredibly well, even though the apps may be less impressive viewed first-hand, since they are delivered on a phone screen rather than presented on glasses directly to the eyes.
The technical difficulty of developing fashionable AR glasses people want to wear all day is extraordinary, and we’d be extremely shocked to see Apple release something along these lines in 2017. Instead, we expect the next iPhone to debut with new outward-facing cameras that improve upon the ARKit functionality Apple already debuted earlier this year, according to a report by Bloomberg.
The device is expected to also include a front-facing 3D camera that reports indicate could be used for payments or security, but that might also have AR applications. Let me preface this by saying we’ve heard no solid reports that Apple plans to include the following functionality in the new devices, but I certainly hope it comes to pass. Technology Apple acquired in 2015 could make FaceTime video calls much more fun: that year, Apple acquired a company called Faceshift that allowed people to capture their facial expressions and transfer them in real time onto a myriad of cartoon characters. It is feasible this device could enable people to optionally hold an entire video-chat conversation as animated cartoon characters, with all their expressions captured.
Lastly, reports indicate Apple is paying Samsung a fortune for the OLED displays used in the new iPhone, contributing to the expected high cost of the device while handing the tech giant’s chief rival a major slice of the profit that has driven Apple’s extraordinary growth over the last decade. Apple is said to be sourcing the component exclusively from Samsung, which puts OLED screens in all of its own Gear VR-compatible phones and seems to have a lock on the market for these displays. Reports indicate Apple is considering an investment in LG to spur that company’s efforts so that it too can produce these components at the scale iPhone production requires.

While Microsoft’s upcoming headsets are expected to use LCD screens with “impulse backlighting”, the vast majority of existing VR headsets have relied on Samsung’s OLED display technology to function. That said, we saw an excellent preview of a VR headset based on LG’s screens earlier this year.

The takeaway here is that, with OLED screens finally powering the iPhone after years without, it might be possible to stick the new iPhone into a VR headset similar to the Gear VR and have a higher quality experience than has been possible with Apple devices in the past. Developers using ARKit have already tested inside-out tracking with earlier iPhones and found it does a surprisingly solid job. With a higher quality display, developers might be able to deliver more enjoyable VR apps on Apple’s App Store.
This new Samsung tool is using VR to bring sight to the near-blind.
VR has the opportunity to improve so much of daily life by altering and improving reality. For those with vision impairment, seeing and reading are the ultimate improvements to reality—and VR may be the solution.
Relúmĭno is an app that pairs with the Samsung Gear VR to serve as a smart aid for visually impaired people, allowing them to engage with the world like never before.
By processing the outside world through a smartphone camera and altering the image, the app gives near-blind and other visually impaired people the opportunity for better quality reading and television viewing experiences.
“Relúmĭno will be the life-changer for 240 million of the visually impaired people around the world and we promise a firm and continuing support,” said Jaiil Lee, Vice President and Head of Creativity & Innovation Center at Samsung Electronics, in an August statement.
Relúmĭno not only displays the image from the smartphone camera, it also allows the user to adjust the image—including color contrast and brightness, the size of the image and the screen color and brightness.
These adjustments also allow people with tunnel vision to see beyond their usual field of view: Relúmĭno remaps unseen parts of the image and places them in the visible parts of the eye within the VR experience. Relúmĭno won’t, however, be able to help those with total blindness.
This isn’t the only application of its kind: Android has a similar app called Near Sighted VR that works in a similar fashion, using the phone’s camera to create a stereoscopic view split between your two eyes.
Samsung users with newer models of Galaxy smartphones, including the Galaxy S7, S7 Edge, S8 and S8+ can download Relúmĭno for free in the Oculus Store in both English and Korean.
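To make the tunnel-vision remapping idea concrete, here is a tiny Python sketch of one way peripheral content can be squeezed into a small central window. This is an illustrative assumption about the general technique, not Samsung’s implementation; a real app would do this per-eye on the GPU with proper filtering, whereas here an image is just a 2D list of pixel values and nearest-neighbor sampling stands in for everything fancier.

```python
# Toy remapping: shrink the full camera frame into the small central
# window a user with tunnel vision can still see, so peripheral
# content lands inside their remaining field of view.

def remap_into_window(frame, window_size):
    """Downscale `frame` (a rows x cols grid of pixel values) into a
    window_size x window_size image via nearest-neighbor sampling."""
    rows, cols = len(frame), len(frame[0])
    out = []
    for r in range(window_size):
        src_r = r * rows // window_size  # nearest source row
        row = [frame[src_r][c * cols // window_size] for c in range(window_size)]
        out.append(row)
    return out

# A 4x4 "frame" with distinct pixel values, squeezed into a 2x2 window.
frame = [[1, 2, 3, 4],
         [5, 6, 7, 8],
         [9, 10, 11, 12],
         [13, 14, 15, 16]]
window = remap_into_window(frame, 2)
```

The whole frame survives in miniature: content that was at the edges of the original image now sits inside the small window.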
Song Exploder, the popular music podcast that lets musicians explain every part of their songs, has gotten some help from Google to create a new app that throws you in the middle of a song so you can experience it like never before. Called Inside Music, the WebVR app lets you turn individual pieces of a song on and off, giving you a little more insight into just how it’s made.
The project, which Song Exploder reiterates is “an experiment, not a Google product,” lets you select a song from the menu, presenting you with little orbs that represent parts of the song, or ‘stems’. Viewable in both VR and flatscreen mode on desktop and mobile, Inside Music lets you toggle the orbs on and off, something that helps you pinpoint exactly where any given sound is coming from within the song.
The design of the app is fairly simple, featuring pulsating orbs that give you a visual cue. The real star of the show is the app’s positional audio, though, which helps you dissect the song by letting you tell where each stem is physically located and whether it’s switched on. With all of the stems coming at you from different directions, it’s much easier to hear when one is missing.
The brilliant part: the creators have thrown everything on GitHub so you can integrate your own songs, or others’, into the app. The application supports between one and seven stems.
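As a rough illustration of the stem-toggling idea (the actual project is a WebVR/JavaScript app; this Python sketch, with made-up stem names and sample values, just shows the core mixing logic):

```python
# Conceptual sketch, not the Inside Music source: a song is a set of
# named "stems", each a list of audio samples, and the output mix is
# the sum of whichever stems are currently toggled on.

def mix(stems, enabled):
    """Sum the samples of all enabled stems into one output track."""
    length = max(len(s) for s in stems.values())
    out = [0.0] * length
    for name, samples in stems.items():
        if enabled.get(name, False):
            for i, v in enumerate(samples):
                out[i] += v
    return out

# Hypothetical stems with tiny sample buffers.
stems = {
    "vocals": [0.2, 0.4, 0.1],
    "drums":  [0.5, 0.5, 0.5],
    "bass":   [0.1, 0.1, 0.1],
}

# Everything on, then "solo" the drums by toggling the others off.
full = mix(stems, {"vocals": True, "drums": True, "bass": True})
solo = mix(stems, {"drums": True})
```

Toggling an orb off in the app corresponds to dropping that stem out of the sum; the spatial placement of each stem is a separate (positional audio) layer on top of this.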
With ARKit, the augmented reality-based remote assistance application Remote AR enables any enterprise to implement the most advanced AR functionality within its workforce today.
SAN FRANCISCO, Sept. 8, 2017 /PRNewswire/ — Today Scope AR, the creator of augmented reality (AR) smart instructions and live support video calling solutions, announced support for ARKit, Apple’s much anticipated AR development platform. The company’s live support video calling application, Remote AR, will support ARKit as soon as iOS 11 launches, delivering the most advanced AR functionality yet to devices without AR-specific hardware.
“This is a game changer for any enterprise looking to implement the latest advancements in AR now,” said Scott Montgomerie, CEO of Scope AR. “With our technology, any company can use an existing iPhone or iPad to implement AR within their workforces today, allowing workers to complete tasks faster and more accurately, while also producing significant cost and time-savings. While there are many apps coming to ARKit that will inevitably bring AR to the masses, we’re the first solution leveraging ARKit that is truly impacting the bottom line for enterprise.”
With support for ARKit, Remote AR users can now take advantage of the platform’s sophisticated real-world mapping to collaboratively add annotation and 3D content to a much larger area than has previously been possible on standard devices. This results in a simple, seamless work session for both sides of the call. The capability is available on newer iOS devices, including the iPhone 6s and above, as well as iPads equipped with A9 or A10 processors. You can see a video with more details on how Remote AR can be leveraged with Apple ARKit here.
Remote AR delivers the ability to save time and money, as well as improve knowledge transfer and retention by combining AR with live video streaming, voice, 3D animation, screen sharing, whiteboarding and world-locked annotations. Doing so simulates the effectiveness of having an expert on-site guiding a worker step by step on what to do. Whether a technician needs live support for troubleshooting a problem or conducting maintenance or assembly procedures, Remote AR empowers them to get the knowledge they need, when they need it.
Remote AR is fully platform agnostic for use on Android, iOS, Windows and Tango devices simultaneously, as well as select smartglasses, allowing organizations to easily experience the benefits of AR by using their device of choice. All current Remote AR users will have access to the new ARKit support once iOS 11 is available.
About Scope AR
Augmented reality (AR) company Scope AR is the creator of the first-ever true AR smart instructions and live support video calling solutions – WorkLink and Remote AR, as well as the new heavy industry focused CAT® LIVESHARE. The company provides the industry’s most comprehensive set of tools to allow users to access or become their own experts, learning to assemble, repair or troubleshoot problems wherever they are. Whether training, performing complex fieldwork or remote tasks, or any number of assisted activities across vertical industries including industrial equipment manufacturing, aerospace, construction, utilities, oil and gas, automotive, consumer applications and more, Scope AR provides robust solutions for in-field support and performance tracking. The company’s partners and users include Caterpillar, AstraZeneca, Lockheed Martin and Clicksoftware among others. The company was founded in 2011 and is based in San Francisco with offices in Edmonton, Canada.
Apple’s iOS 11 includes ARKit, developer tools that make it surprisingly easy to build great quality augmented reality experiences. This could have an impact in the automotive world, since it makes it a lot easier to bring virtual showroom experiences to potential car buyers and automakers via ordinary consumer devices.
Here’s an example of what that could look like: Developer Augmently created this AR experience using the developer preview ARKit tools in iOS 11. It features a detailed, rendered 3D model of a Mercedes sedan, complete with an interactive interior and the ability to swap out different paint color options. The model stays firmly rooted in place, allowing a user to virtually walk around it and check out all angles.
Virtual showrooms and AR demonstrations are now standard fare for automakers at big auto shows, and even in dealership settings sometimes with specialized hardware. But this use of ARKit brings it down to Earth – it’s instantly available to anyone with an iPhone.
Car sales models are already being shaken up by startups – Tesla sells its vehicles exclusively online, using showrooms only for in-person looks at its cars. Online car-buying startups are increasingly attempting to complement and supplant the existing brick-and-mortar sales model, too, and even Amazon has experimented with online sales, particularly in Europe.
ARKit replaces a lot of kludgy, hard-to-use and less-than-perfect solutions for doing similar things, but it’s the system-level tools and quality of experience that could help it up the game for automobile sales, demonstrations and showrooming. AR isn’t the only thing ARKit could make mainstream – online vehicle sales could benefit big time, too.
Over the past few weeks I’ve been steeping myself in the developer and investor community that is quickly sprouting up around ARKit.
There are a variety of reasons that people are excited about the possibilities but it’s safe to say that the number one positive that’s shared by everyone is the sheer scale of possible customers that will be able to experience augmented reality on day one of iOS 11. Hundreds of millions of potential users before the year is out is a potent pitch.
I’ve seen some very cool things from one- and two-person teams, and I’ve seen big corporate developers flex their muscle and get pumped about how capable AR can be.
At a round robin demo event yesterday with a bunch of developers of AR apps and features, I got a nice cross-section of looks at possible AR applications. Though all of them were essentially consumer focused, there was an encouraging breadth to their approaches and some interesting overall learnings that will be useful for developers and entrepreneurs looking to leverage ARKit on iOS.
Let me blast through some impressions first and then I’ll note a few things.
IKEA Place

What it does: Allows you to place actual-size replicas of IKEA sofas and armchairs into your house. 2,000 items will be available at launch.
How it works: You tap on a catalog that lets you search and select items. You tap once to have an item hover over your floor, rotate it with a finger and tap again to place it. The colors and textures are accurately represented, and these are fully re-worked 3D models from the 3D scans IKEA uses for its catalogs. It looks and works great, just as you’d expect. IKEA Leader of Digital Transformation Michael Valdsgaard says that it took them about seven weeks, beginning slightly before Apple’s announcement of ARKit, to implement the mode, and just a couple of weeks to get a first version up and running. It will be exclusive to iOS for now because that’s the largest single target of AR-capable devices. This has been a holy grail for furniture and home goods manufacturers and sales apps for what seems like forever, and it’s here.
Food Network In The Kitchen
What it does: Lets you place and decorate virtual desserts like cupcakes. Allows you to access the recipe for the base dessert.
How it works: You drop a dessert onto a surface and are provided with a bunch of options that let you decorate a cupcake. A couple of things about this demo: First, it worked just fine and was very cute. A little animated whale and some googly eyes topping a cupcake, which you can then share, is fine. However, it also demonstrates how some apps will be treating AR as a “fun extra” (the button is literally labeled “Fun”), rather than integral to the experience. This is to be expected in any first wave of a new technology, but examples out there like KabaQ show that there are other opportunities in food.
GIPHY World

What it does: Allows you to place gifs in 3D space, share videos of them or even share the whole 3D scene in AR with friends who have the app. They can then add, remix and re-share new instances of the scene. As many people as you want can collaborate on the space.
How it works: You drop gifs into the world in the exact position you want them. A curated and trending mix of gifs with built-in transparency is the default, but you can also flip it over to place any old gif on the platform. Every scene gets a unique URL that can be remixed and added to by the people you share it with, effectively creating a shared gif space that can be ping-ponged around. The placement of gifs felt very logical and straightforward, but the ability to “paint” with the gifs and then share whole scenes in a collaborative fashion was a pleasant surprise. One example that was impressive was leaving a pathway to a “message” that a friend could follow when you shared the scene with them. Ralph Bishop, GIPHY’s head of design, says that the app will be free like their other apps but will have branded partners providing some content. GIPHY has something interesting going on here with a social AR experience. It’s early days but this seems promising.
Arise

What it does: It’s a game from Climax Studios that places a (scalable) 3D world full of crumbling ruins onto your tabletop that you help your character navigate, without any traditional controls.
How it works: You look through your device like a viewport and align the perspective of the various pathways to allow your character to progress. There are no on-screen controls at all, which is a very interesting trend. Climax CEO Simon Gardner says that what made translating the game into AR attractive to the studio (which has been around for 30 years) was the potentially huge install base of ARKit. They’re able to target hundreds of millions of potential customers by implementing a new technology, rather than the typical scenario where you start at effectively zero. The experience was also highly mobile, requiring that you move around the scenes to complete them. Some AR experiences may well be limited in their use or adoption because many people use phones in places where they are required to be stationary.
The Very Hungry Caterpillar AR
What it does: Translates the incredibly popular children’s book into AR.
How it works: The story unfolds by launching the app and simply pointing at objects in the scene. We saw just a small portion of the app that had apples being coaxed from a tree and the caterpillar scooching its way through them to grow larger. This was my favorite demo of the day, largely because it was cute, clever and just interactive enough for the age level it is targeting. It’s also another ‘zero controls’ example, which is wonderful for small children. Touch Press CEO Barry O’Neill says that they’ve seen some very interesting behavior from kids using the app including getting right down at eye level with the tiny caterpillar — which meant that they really had to up-res the textures and models to keep them looking great. Now that ARKit enables capturing any plane and remembering where objects are (even if you move 30-50 feet away and come back), storytelling in AR is finally moving beyond marker-based book enhancements. Any surface is a book and can tell a story.
The Walking Dead: Our World
What it does: It’s a location-aware shooter that has you turning in place to mow down zombies with various weaponry.
How it works: The scene I saw looked pretty solid, with high resolution zombies coming at you from all angles, forcing you to move and rotate to dodge and fend them off. You progress by “rescuing” survivors from the show, who provide you with unique additional capabilities. Environmental enhancements like virtual “sewers” that walkers can crawl up out of give each scene a unique feel. It looked fast and smooth on a demo iPad. AMC and Next Games collaborated on this title. There were some additional fun features like the ability to call up various poses on a survivor like Michonne and stand next to them to take a selfie, which felt super cool. The best kinds of IP-based games and apps will focus on unlocking these kinds of “bring that world into your world” experiences rather than cookie-cutter gameplay.
Some interesting notes:
Every app had its own unique take on the ‘scanning’ process that allows ARKit to perform plane detection before it can begin placing objects into the world: basically, a few seconds where you’re encouraged to move the phone around a bit to find flat surfaces and record enough points of data to place and track objects. It’s not onerous and never took more than a few seconds, but it is something that users will have to be educated on. IKEA’s conversational interface prompted people to “scan” the room; The Walking Dead suggested that you “search for a survivor”; and Food Network’s app went with a “move around!” badge. Everyone will have to think about how to prompt and encourage this behavior to make ARKit work properly.
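For a sense of what that scanning phase is accomplishing, here is a toy Python sketch of horizontal plane detection from tracked feature points. It is a deliberate simplification for illustration (ARKit's internal algorithm is not public and certainly more sophisticated): cluster feature points by height and accept a plane once enough points agree.

```python
# Toy horizontal plane detection: find the height (y value) that the
# largest number of tracked 3D feature points agree on, within a
# tolerance, and report it as the detected plane. Thresholds are
# arbitrary illustrative choices.

def detect_horizontal_plane(points, tolerance=0.02, min_points=10):
    """points: list of (x, y, z) feature positions in meters.
    Returns the estimated plane height, or None if no plane is found."""
    best = None
    for _, y, _ in points:
        # All points whose height is within tolerance of this candidate.
        support = [p for p in points if abs(p[1] - y) <= tolerance]
        if len(support) >= min_points and (best is None or len(support) > len(best)):
            best = support
    if best is None:
        return None
    # Average the supporting heights for a stable estimate.
    return sum(p[1] for p in best) / len(best)

# Simulated scan: 12 feature points on a tabletop at y = 0.75 m,
# plus a couple of stray points elsewhere in the room.
points = [(i * 0.1, 0.75, i * 0.05) for i in range(12)]
points += [(0.3, 1.4, 0.2), (0.1, 0.1, 0.9)]
height = detect_horizontal_plane(points)
```

Moving the phone around during the scan is what accumulates enough feature points for a cluster like this to emerge, which is why every app has to coax the user into that motion.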
Aside from the apps that are about placing objects directly into the scene, there is a focus on little-to-no on-screen controls. For Arise, your perspective is the control, allowing you to get an alignment that worked to progress the character. There are no buttons or dials on the screen by design.
The Very Hungry Caterpillar’s control methodology was based on focus. The act of pointing at an object and leaving your gaze on it caused the story to progress and actions to be taken (knocking fruit out of a tree for the caterpillar to munch on or encouraging it to go to sleep on a stump). Most of the other apps relied on something as simple as a single tap for most actions. I think this control-free or control-light paradigm will be widespread. It will require some rethinking for many apps being translated.
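The dwell mechanic behind that focus-based control is simple to sketch. The following Python illustration is a hedged guess at the general technique, not Touch Press's code: an action fires only once the user's gaze has rested on the same object for a threshold time, and the timer resets whenever the gaze moves.

```python
# Dwell-based gaze selection: trigger an action on an object only
# after the gaze has stayed on it for DWELL_TIME seconds. The
# threshold is an arbitrary illustrative value.

DWELL_TIME = 1.5  # seconds of sustained gaze before triggering

class GazeDwell:
    def __init__(self):
        self.target = None  # object currently under the gaze
        self.held = 0.0     # how long the gaze has stayed on it

    def update(self, dt, gazed_object):
        """Call once per frame. Returns the object to trigger, or None."""
        if gazed_object != self.target:
            # Gaze moved to something new: restart the dwell timer.
            self.target = gazed_object
            self.held = 0.0
            return None
        self.held += dt
        if self.target is not None and self.held >= DWELL_TIME:
            self.held = 0.0  # reset so the action doesn't fire every frame
            return self.target
        return None

gaze = GazeDwell()
gaze.update(0.5, "apple")            # gaze lands on the apple
gaze.update(0.8, "apple")            # still looking, not long enough
fired = gaze.update(0.8, "apple")    # dwell threshold crossed
```

This is also why control-free apps feel so natural for small children: the "input" is just looking at the thing you are interested in.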
Development time was incredibly short, all things considered. Some of the apps I saw were created or translated into ARKit nearly wholesale within 7-10 weeks. For asset-heavy apps like games this will obviously be a tougher ramp, but not if you already have the assets. GIPHY World, for instance, places a bunch of curated gifs that look great floating in the world at your fingertips, but you can easily drop regular gifs in there from their millions of options.
Models that Touch Press used for its previous Caterpillar app had to be upscaled in terms of complexity and detail quite a bit because they fully expect children to experience them in distances as close as inches. IKEA also had to update its models and textures. But given that the onramp is measured in days or weeks instead of months, I’d expect to see a large number of apps supporting ARKit experiences at the launch of iOS 11 in September. And for a bunch to follow quickly.
Today, Google is announcing ARCore, a software-based solution for making more Android devices AR-capable without the need for depth sensors and extra cameras. It will even work on the Google Pixel, Galaxy S8, and several other devices very soon and supports Java, Unity, and Unreal from day one. In short, it’s kind of like Google’s answer to Apple’s ARKit.
But that isn’t how Clay Bavor, VP of Augmented and Virtual Reality at Google, would describe it. Instead, when the topic came up, he reminded me that Google Tango had its first development kit all the way back in 2014 and that they’ve slowly been building towards this vision of a future where AR is democratized and available to millions around the world. Specifically, Google wants 100 million AR-capable Android phones within just the next few months.
“I like to call it immersive computing to sidestep some of the jargon and acronym debate — VR, AR, MR — just everything. Integrating computer-generated imagery seamlessly into experiences is what it’s all about,” Bavor explains at the beginning of our interview at Google’s San Francisco office last week. “Our goal here is to make AR mainstream on Android for developers and for consumers…We thought mobile smartphone-based AR was going to be a thing that was important years ago. The first Tango development kit was 2014 and by relaxing the constraints on the hardware, getting rid of the need for a depth sensor or additional tracking cameras we’ve honed in on our aim to prove out the technology and show the world that on consumer-grade sensors you can do really powerful AR experiences.”
From the demos I got the chance to try he’s not exaggerating. On standard Google Pixel and Samsung Galaxy S8 phones I watched robots walk across a tabletop and wave at me, trees shrink and grow from a few inches to several feet, and even a giant lion flex his muscles and look down at me as if I was really there. Similar to the first time someone tries VR, a powerful AR experience can feel like magic.
“There’s a lot of things that need to happen to make it successful though,” Bavor admits. “We’ve always known that it’s got to work at scale, so we’ve been investing in software-only solutions like ARCore, building on all of the Tango technology, just without the additional sensors. We feel that the technology is ready and we’ve figured out some of the core use cases for AR that we’re really excited about, so that’s why we’re so excited to get ARCore out there and start lighting up across the ecosystem.”
One great use case that I got to see first-hand is the dynamic good AR can bring to shopping. Using a plugin on the Wayfair website I watched as a room was measured in real-time (similar to the GIF above) and a chair was placed from the website into the physical space. Imagine applying this same concept to other types of shopping and interior design as well.
Another future example Bavor gave was through the use of VPS (Visual Positioning Service). “We’ve been investing in a constellation of tools and services and applications around it to make it even more powerful for developers,” Bavor says. “One example is VPS. You’re gonna want, as a developer, to extend beyond just the tabletop or room to something that’s world-scale, or to anchor things in the world that persist so you can go back to them. ARCore and VPS we see as very natural partners and in fact we were building VPS in anticipation of scaling AR on Android with ARCore.”
Imagine being able to return to a specific building and see a sign in AR that has aged and rusted with years passing by. Or know where your friends recommend eating downtown just by looking around — Google Lens could be a big part of that too. It’s the type of stuff that’s been promised and imagined for a while, but we’re getting closer. We’re not there yet, but it’s an attainable goal in our lifetime.
“Another example, which is especially relevant for developers that build traditional smartphone apps in Java, is that we want to make it easier than ever for people to get into 3D modeling that haven’t done it before,” Bavor says. “We know there are a lot of people that want to get into 3D development and AR but aren’t experts in Maya, or Unity, or anything. So Blocks is an app we built with the intention of enabling people that have never done a 3D model in their life to feel comfortable building 3D assets. We even made it easy to export right from Blocks and pull into ARCore apps you’re developing.”
One of the demos I tried, the same one with little robots and trees on top of a table, had all of its assets created directly inside of Blocks and exported to ARCore.
More details will likely emerge about ARCore in the coming months and we can’t wait to see what intrepid AR developers are able to cook up with this new suite of accessible tools. Let us know what you think of Google’s ARCore, ARKit, WebAR functionality, and everything else in the world of AR down in the comments below!
Microsoft debuted the lengthily-named “Windows Mixed Reality motion controllers” back in May, but until now we haven’t had a chance to actually try them out. During a recent meeting with Microsoft in San Francisco, I got to try the VR controllers for the first time paired with the Acer Windows VR headset.
Microsoft’s VR controllers are designed to let you reach into VR and interact naturally with the virtual world. With both a trackpad and a thumbstick, they look like a crossbreed of the Oculus Touch controllers and the Vive controllers.
In addition to the trackpad and thumbstick, there’s also a menu button and a Start button, as well as a grip button along the handle. The big circular parts on the front contain an array of LEDs which provide bright markers for the headset’s on-board cameras to detect and track. Microsoft tells us that the shipping version of the controllers will indeed use visible-light, just like we’ve seen in renderings and promo videos. (Microsoft didn’t allow any pictures of the controllers during my hands-on time).
Buttons and Inputs
Grabbing the controllers for the first time, they didn’t feel quite as elegant as either Touch or the Vive controllers. The odd side-by-side trackpad & thumbstick arrangement is usable, but seems to effectively put neither of the two in an ideal position for your thumb. The grip button is indeed a binary button (rather than being pressure sensitive), and doesn’t feel so much like a “grab” as it does a clicky button press with your palm.
Though they resemble Touch with their ring-shaped tracking appendages, the Windows motion controllers are actually noticeably larger and clunkier. Because the tracking rings don’t encompass your hand like Touch’s do, the controllers are easier to bump together, especially when their physical outline is hidden in VR.
The shape of the rings is necessary though, as they need to present a substantial surface area from which the headset’s on-board cameras can track their movement. Though I was using the Acer dev kit headset, our understanding is that these controllers will work with any of the soon to be released Windows VR headsets (all of which feature on-board cameras).
Pros and Cons of Inside-out Controller Tracking
This method of controller tracking differs from both the Rift and the Vive in that it’s the cameras on the headset which are watching the controllers to track their movement (which is called ‘Inside Out’), whereas the Rift and Vive both use external sensors to track their controllers (called ‘Outside In’).
The upside to this approach is that you don’t need to set up any external trackers, but the downside is that the controllers must always be in view of the headset’s front-facing cameras to be properly tracked. Thankfully, the size of the tracking volume felt reasonable; for basic use (like reaching out in front of me to grab virtual objects), I didn’t feel like my reach was artificially limited by the camera’s field of view.
Outside of the Box Tracking
And for times when your hands will go out of the camera’s field of view, Microsoft is doing its best to compensate. When that happens, the system relies purely on the controller’s on-board IMU to estimate positional movement until it reappears in the camera’s view. This works well enough for quick jumps in and out of the camera’s view, but after a second or two, the IMU-only tracking estimation is too unreliable, and it appears that the system will eventually freeze the location of the controllers in the air and only feed them the rotation data from the IMU, though they snap quickly back into their proper place as soon as they’re brought back into view. It remains to be seen how much this limitation (the need to be seen by the front-facing cameras) will impact different VR games and apps, and how effectively it can be designed around.
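The fallback behavior described above can be sketched as a small state machine. This Python illustration is an assumption about the general technique (trust the camera pose while the controller is visible, dead-reckon briefly from the IMU once it leaves view, then freeze position while still updating rotation), not Microsoft's implementation; the timing constant and simplified one-angle rotation are invented for clarity.

```python
# Sketch of inside-out controller tracking with an IMU-only fallback.

FREEZE_AFTER = 1.0  # seconds of IMU-only tracking before freezing position

class ControllerTracker:
    def __init__(self):
        self.position = (0.0, 0.0, 0.0)
        self.velocity = (0.0, 0.0, 0.0)
        self.rotation = 0.0   # simplified to a single angle
        self.time_unseen = 0.0

    def update(self, dt, camera_pos, imu_accel, imu_rot):
        self.rotation = imu_rot          # rotation always comes from the IMU
        if camera_pos is not None:       # controller visible: trust the camera
            self.position = camera_pos
            self.velocity = (0.0, 0.0, 0.0)
            self.time_unseen = 0.0
        else:                            # out of view: dead-reckon briefly
            self.time_unseen += dt
            if self.time_unseen <= FREEZE_AFTER:
                # Double-integrate acceleration; this drifts fast, hence the cutoff.
                self.velocity = tuple(v + a * dt for v, a in zip(self.velocity, imu_accel))
                self.position = tuple(p + v * dt for p, v in zip(self.position, self.velocity))
            # else: position stays frozen; rotation still updates each frame

tracker = ControllerTracker()
tracker.update(0.1, (1.0, 2.0, 3.0), (0.0, 0.0, 0.0), 0.5)  # in view
tracker.update(0.5, None, (1.0, 0.0, 0.0), 0.6)              # briefly out of view
tracker.update(0.6, None, (5.0, 0.0, 0.0), 0.7)              # past the cutoff: frozen
```

The quick snap back into place when the controller re-enters view falls out naturally: the next frame with a camera position simply overwrites the stale estimate.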
As for the tracking accuracy when they are in sight of the camera, I did see some jumpiness here and there—especially if I was rotating my body while moving the controllers—but on the whole they seem entirely usable, and (in my short time with them thus far) to be more accurate than the PlayStation Move controllers.
As part of my testing, I played Arizona Sunshine (2016), and found that guns were steady when I held them out in front of me and aimed down their sights; I didn't have any trouble landing zombie headshots. Granted, for inside-out controller tracking, holding a gun up in front of me to aim is pretty much the best case scenario—I'm curious to see how other common input modalities hold up (like shooting a bow and arrow or swinging a sword).
The occasional jumps didn't present much issue in shooting scenarios, but for more precise uses, like VR drawing, painting, and animating, it remains to be seen if those jumps will cause any usability issues.
– – — – –
Microsoft says that the Mixed Reality motion controllers will be bundled with Windows VR headsets starting at $400 this holiday.
According to Aaron Stanton, director of the newly-founded Virtual Reality Institute of Health and Exercise, many of the popular VR games and applications are actually more effective at burning calories than a traditional treadmill exercise routine. The organisation recently announced a program to independently assess VR games in a controlled environment, with the aim of publishing the results through a rating system.
Back in 2006, Nintendo’s Wii console popularised motion controllers, enticing people off the couch for a spot of ‘arm flailing’ with its addictive title Wii Sports. Although there were successful fitness titles on the platform, most gamers quickly discovered that the basic accelerometer-based motion sensing of the Wii Remotes could be easily fooled with short flicks of the wrist rather than full swings, and in many cases the seated ‘waggle’ became the more effective way to play.
In VR however, fully-tracked motion controllers mean that it is impossible (assuming a properly designed game) to 'cheat' the system in this way, and players really do need to perform appropriate motions to succeed in virtual sports and action games. Racket: Nx and Rec Room's Disc Golf are great examples, as similar racket and disc-throwing games on the Wii could be easily played while seated, with just a slight flick of the wrist. Try that in VR, and you'll struggle to play, let alone win anything. The games expect and reward physicality, and in any case, the level of immersion in VR is so much higher that players are naturally more compelled to move. Thus, the potential health benefits of VR are likely much higher than those of traditional gaming.
“We created the program after realizing that I had worked out in VR for more than a hundred hours without realising it,” writes Aaron Stanton, director of the Institute, in a message to Road to VR. “We later confirmed through metabolic testing that many VR titles are better calorie burners than a traditional treadmill or elliptical, making my VR system the best exercise equipment I’ve ever purchased. I believe VR has the potential to have a life-saving impact on society. That’s why we created the VR Health Institute.”
The Virtual Reality Institute of Health and Exercise recently began publishing their initial ratings for a number of popular titles on their site, sorted by Metabolic Equivalent of Task (MET) range. The intention is for the VR Exercise Ratings labels to become similar in function to ESRB content ratings, giving a way to see the potential health benefits of a particular title at a glance. The Institute suggests that VR games can be especially effective because they are compelling to play, meaning that users end up doing significant exercise without thinking about it.
Some of the results are surprising; the relaxed paint app Tilt Brush is apparently a similar calorie burner to the intense shooter Raw Data, and both are in the 'walking equivalent' range, burning around 2–4 calories per minute. Then there is boxing game Knockout League rated in the 8–10 calories per minute range, yet Thrill of the Fight is the highest calorie burner by far, at '15+' calories per minute, despite also being a boxing title, albeit a more realistic one. At this MET range, Thrill of the Fight is up in the 'sprinting' or 'running up stairs' zone; you might want to consider a sweat-resistant headset cover. You can see all of the games that the Institute has measured here.
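For context on how a MET rating becomes a calories-per-minute figure, the standard conversion used in exercise physiology is kcal/min = MET × 3.5 × body weight (kg) / 200. A quick sketch of the arithmetic (the Institute's exact reference body weight isn't stated, so the 70 kg used here is an assumption):

```python
def calories_per_minute(met, weight_kg):
    # Standard metabolic conversion: kcal/min = MET * 3.5 * weight(kg) / 200
    return met * 3.5 * weight_kg / 200.0

# For a 70 kg player (assumed reference weight):
# a 'walking equivalent' MET of ~3 works out to about 3.7 kcal/min,
# while a 15 MET effort works out to over 18 kcal/min.
```

This also shows why MET is the more useful rating to publish: the same game burns more calories per minute for a heavier player, so a weight-independent intensity figure travels better than a raw calorie count.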