360 video could become the next big medium

Nils Forsblom

We’re on the cusp of a revolution in virtual reality (VR). And I’m not talking about headsets like the Oculus Rift and Samsung Gear VR — I’m talking about simple, accessible 360-degree video that anyone can watch just by holding up their smartphone and moving it around. Call it “minimum viable VR” — it’s an immersive, engaging, and powerful storytelling medium that anyone with a smartphone can view, no special headsets required.

With Pokémon Go, we’ve seen how augmented reality (AR) can take the world by storm using nothing more complicated than a GPS- and camera-enabled smartphone. The same thing could happen with 360 video, which provides a “window” into another world via your smartphone.

But for 360 video to realize its potential, we need more cooperation. Currently, the companies enabling 360 video are too busy trying to own it and not spending enough time creating common standards. That could wind up killing this promising medium before it has a chance to take off.

I know about this because my company, Adtile Technologies, has spent the past year working on 360 video. We’ve run into roadblock after roadblock, technical challenge after technical challenge. Here’s what I’ve learned.

Format confusion reigns

Right now within the industry, a variety of formats are being used for VR—and within those formats, several different codecs are used for transmitting, storing, and playing the data. For example, 360-degree videos on YouTube are mostly delivered as MP4 and WebM files, while Facebook serves them as MP4.

Image quality is also a big part of VR. Eventually everyone is going to want to create their video in ultra-high definition (4K) to give the best quality on any given device. But no standard exists for how apps deal with different resolutions. Oculus Rift and Samsung VR both do 4K, but what file formats will ultimately support, say, 12K resolution?

360 video brings another twist to the standards battle: sensors and controls. Because moving your smartphone changes the viewing angle of a video (as opposed to your mouse, if you are viewing on a desktop), the format has to interact consistently with the different sensor implementations on different devices.

Distribution options are limited

As with any content, distribution is your key to reaching the masses. Standards open the door to wider distribution, because when everyone is using the same standards, your video plays on any app. But as it stands now, if you’re a consumer, and you want to upload a 360 video onto the Internet and share it with the masses, only one platform will do that for you: YouTube.

With over 1 billion users, YouTube is your greatest chance at reaching critical mass. But right now, 360 playback on YouTube is limited to certain browsers, and if you want to watch on mobile, you’ll need the native Android or iOS YouTube app.

If you are looking for cheap options for hosting your VR video, you’ll find several free sites that will do it. But bear in mind these services rely on advertising to stay afloat, so you’ll likely have to contend with someone else’s branding on your content.

Camera manufacturers offer their own shoot-and-share apps. But in most cases, you can only upload your 360 videos on a dedicated app owned by the manufacturer of your particular video camera. The notable exception is Samsung VR, which will let you upload any VR video.

Another option is to build your own player using WebGL to give users access to the content you create without being limited to a certain platform. But that gets expensive, and then where do you post your video when you’re done? Once again, YouTube is effectively your only option.
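
One common way to build such a player (though certainly not the only one) is to project the equirectangular video onto the inside of a sphere and rotate the camera from drag or device-orientation input. Here is a minimal sketch using the open-source three.js library; the file path video.mp4 and the sizing values are placeholders, and a production player would still need controls, adaptive streaming, and fallbacks, which is where the cost comes in.

```typescript
import * as THREE from "three";

// Minimal 360° player sketch: texture the inside of a sphere with an
// equirectangular video and rotate the camera to change the viewing angle.
const camera = new THREE.PerspectiveCamera(75, innerWidth / innerHeight, 0.1, 1100);
const scene = new THREE.Scene();
const renderer = new THREE.WebGLRenderer();
renderer.setSize(innerWidth, innerHeight);
document.body.appendChild(renderer.domElement);

// "video.mp4" is a placeholder path for an equirectangular MP4.
const video = document.createElement("video");
video.src = "video.mp4";
video.loop = true;
video.muted = true; // muted playback avoids most autoplay restrictions
video.play();

const geometry = new THREE.SphereGeometry(500, 60, 40);
geometry.scale(-1, 1, 1); // invert the sphere so the video faces inward
const material = new THREE.MeshBasicMaterial({ map: new THREE.VideoTexture(video) });
scene.add(new THREE.Mesh(geometry, material));

// lon/lat hold the current viewing angle; wire them to drag gestures on a
// desktop or to deviceorientation events on a phone.
let lon = 0;
let lat = 0;

function animate(): void {
  requestAnimationFrame(animate);
  const phi = THREE.MathUtils.degToRad(90 - lat);
  const theta = THREE.MathUtils.degToRad(lon);
  camera.lookAt(
    500 * Math.sin(phi) * Math.cos(theta),
    500 * Math.cos(phi),
    500 * Math.sin(phi) * Math.sin(theta),
  );
  renderer.render(scene, camera);
}
animate();
```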

Cooperation is better than a standards war

Hardware vendors simply aren’t motivated to settle on a standard, because, obviously, when the content only works on their app or device, that gives them competitive advantage. But this view is painfully short-sighted.

Industry players need to work together to create and support common standards. If they don’t, expect a standards battle to ensue in the not too distant future. If you’re not sure what that can entail, just look at what happened in the war over high-definition DVD standards, where Sony’s Blu-ray (backed by Warner Bros.) triumphed over Toshiba’s HD-DVD. The price of that battle was years of uncertainty, and a delay in widespread acceptance of high-definition video.

Instead of an adversarial battle like that, I hope that eventually content creators (the people who make games, educational content, entertainment videos, and the like) will get together with hardware leaders (like Samsung and Oculus) to hash out their differences in a friendly and mutually beneficial way.

If they do, 360 video has a chance to take off as explosively as Pokémon Go – as a creative medium for storytelling, branding, and education. And it could happen right now, with inexpensive, currently available hardware.

If they don’t, you’ll just have to wait several years to enjoy VR — and by then, it’ll probably be through an Oculus headset, not a smartphone.

Adtile announces US Patent Office Grant

Nils Forsblom

Adtile Technologies, a pioneer and developer of motion-sensing technology for smartphones and tablets, announced that it has been awarded U.S. Patent No. 9,401,100 for selective map marker aggregation, which it invented in 2011. This location-based map marker technology has since become a standard and is widely used by some of the most popular mobile applications available today, including photo apps, social networks, live-video streaming apps, and many others.

Before Adtile developed this technology, mapping applications would show hundreds of pinpoints of information on a map, rendering the content useless and making it impossible to discover any information, particularly relevant information. It is now possible to show content such as photos, videos, live-stream videos, GIFs, tweets, or emojis plotted on a map in a smartphone app under one larger marker that displays the number of data points, such as photos, in that area. As you pinch to zoom in, that marker splits into more specific locations and separate pieces of content; as you zoom out, the data points aggregate again, saving space and allowing for better accessibility and discovery of the location-based content.

For example, with many photo management apps, you can see the photos you took on your phone on a map by tapping on the location header. All of the pictures from that “moment” or section of photos will appear on a map. You can then zoom in or out to see the pictures in more specific or more general locations on your map.
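
The patented method itself isn’t spelled out here, but the general idea of aggregating nearby markers can be sketched as a simple grid-based clustering pass whose cell size shrinks as the user zooms in. The types and function below are purely illustrative, not Adtile’s implementation.

```typescript
interface Marker { lat: number; lng: number; }
interface Cluster { lat: number; lng: number; count: number; items: Marker[]; }

// Illustrative grid-based aggregation: markers that fall into the same grid
// cell at the current zoom level collapse into one marker showing a count.
function aggregateMarkers(markers: Marker[], zoom: number): Cluster[] {
  const cellSize = 360 / Math.pow(2, zoom); // cells shrink as you zoom in
  const cells = new Map<string, Cluster>();

  for (const m of markers) {
    const key = `${Math.floor(m.lat / cellSize)}:${Math.floor(m.lng / cellSize)}`;
    const cluster = cells.get(key);
    if (cluster) {
      // Update the running centroid and the count for this cell.
      cluster.lat = (cluster.lat * cluster.count + m.lat) / (cluster.count + 1);
      cluster.lng = (cluster.lng * cluster.count + m.lng) / (cluster.count + 1);
      cluster.count += 1;
      cluster.items.push(m);
    } else {
      cells.set(key, { lat: m.lat, lng: m.lng, count: 1, items: [m] });
    }
  }
  return [...cells.values()];
}

// At a low zoom level a handful of numbered clusters appear; zoom in far
// enough and most cells hold a single item, so individual photos or videos
// show up at their exact locations.
```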

“In 2011, location-based mobile technology was still in its infancy compared to what we are doing today with the device sensors,” said Nils Forsblom, founder and CEO of Adtile Technologies. “I’m honored that Adtile’s Selective Map Marker Aggregation Technology has gained such huge popularity in the developer community in an effort to build apps that deliver delightful and useful experiences to users.”

This is one of many US patents Adtile has been granted this year, including the recently issued US 9,256,358, US D751,574, and US D752,062, along with several approvals from the European Union Intellectual Property Office.

360° video is a huge opportunity for brands

Nils Forsblom

360-degree video is a huge opportunity for brands—if they’re willing to give up control

Social media has put consumers in the driver’s seat, eroding brands’ ability to control their own messages directly. Now, a new medium will take that trend even further: immersive, 360-degree video. With 360 video, brands will need to learn how to create engaging, open experiences where they’re not fully in control. For those that can pull it off, the rewards will be immense.

At its heart, 360-degree video is a form of virtual reality. It places the viewer inside a sphere of video to get the “big picture” of what’s happening in all directions — up, down, and all around. Its biggest potential will be on mobile, where instead of clicking on arrows or manipulating a mouse to pan a scene like you do on a desktop, you simply move your smartphone or tablet out in front of you to change the viewing angle, resulting in a more seamless, intuitive experience. It effectively turns your phone into a small, movable window into a virtual world.

Don’t underestimate the power of 360 video — consumers are going to love it. If you’re watching a political event (like this one), you don’t have to keep your eye on the candidate. You can take a visual tour of the scene, swirling a full 360 degrees to gaze out at the audience or zero in on a reporter—to examine their responses or pick up on something odd, quirky, or highly relevant, happening “out there” in a realm that was previously off camera.

Getting this new medium right takes a lot more than plunking down a spherical camera rig into any old scene. To work, 360 has to be inspiring, like a piece of art. It has to elicit emotion, tug at the viewer’s innate sense of curiosity, and give him or her a reason to explore — and that’s not so easy to pull off. But when you capture the right adventure, the panning itself tells a deeper, more intriguing story. Take a look at this Jungle Book trailer shot through Mowgli’s eyes, and tell me you don’t want to look around to find out where that scary voice is coming from.

While 2D videos often cut from scene to scene, sweeping the viewer along a controlled, preset narrative, 360 might present a single but rich experience that you can watch over and over again, taking in different nuances each time. Movies are already taking advantage of 360. Take, for instance, this Warcraft movie trailer or this Star Wars one. But a lot more can be done with the medium. National Geographic released a 360 video on swimming with bears. The BBC is doing an entire TV series in 360 called BBC Click. Tourism industries, especially, will find 360 useful in selling hotel rooms or letting a consumer know what it’s like to stand on a pristine, isolated beach, because 360 is all about making you feel like you’re there.

Ultimately for brands, 360 video means giving up the “director’s vision.” You can no longer direct the viewer’s attention. You simply need to capture that one compelling event and let the viewer take over from there, keeping in mind that everyone may experience the same 360 video differently.

As for headsets, they are unlikely to have much impact in the long run. As Google Glass demonstrated, people don’t like to wear funky things on their heads when they’re around other people. And as even the Apple Watch is showing us, when you already have a screen on your phone, what is the point of having another? Aside from educational purposes, headsets are impractical for day-to-day living.

Right now, some of the 360 videos out there still look a little low res, but once the technology catches up (in terms of computer hardware, camera technology, video codecs, media players, and bandwidth availability) we’ll begin to get really clear, high def pictures of what is happening all around the video camera — as well as seamless transitions between viewing angles in near real time.

There is no doubt this captivating new medium is here to stay. Brands that start getting used to 360 now will reap the rewards of unparalleled user engagement and gain early dominance of the medium. Now is the time to experiment, learn, engage, and show leadership.

Introducing Adtile 360

Nils Forsblom

360-degree video, which puts the viewer smack dab in the center of the action, opens up a realm of possibilities for brands wanting new ways to reach consumers on mobile. The trick is, how do you design those experiences with minimal hassle so they run in a broad range of environments?

Sure, Facebook and YouTube let you show 360 videos, but only inside their mobile apps and on their terms. They don’t allow you to reach consumers outside of their own platforms.

Traditionally, if you wanted to run your 360 campaign on tens of thousands of different apps and websites from different publishers, you needed to create different versions of the ad to fit the platforms and code used to build the apps they’ll run in. You can imagine how much that costs, and the quality can be hit or miss.

Adtile has come up with a better solution. With our new Adtile 360, we let you create 360 videos that run both in-app and on the mobile web, in any standard mobile format. Adtile 360 is available through our Motion Store, where you create ads based on Adtile-designed templates, eliminating the iteration and experimentation typically involved in building rich media experiences of any kind.

Motion Store templates are built using open web technologies and work like a self-contained website, allowing your campaign to run seamlessly across multiple apps and platforms without your having to worry about the native code of whatever app it runs in.

Adtile 360 works like this: You create your own virtual reality video using a 360-degree camera (such as the Ricoh Theta, Kodak SP360, or Giroptic 360cam) to record footage from every direction at the same time. You then go to the Adtile Motion Store, select a pre-designed template, and upload your 360 video. Next, you add your own logo, graphics, and interactive objects—emojis, images, and animations that offer additional information or link to other websites or videos—so what you end up with is a one-of-a-kind, application-like experience. That’s it. That’s all you have to do.

You simply request your standard mobile Web and/or MRAID tags, and the Motion Store platform takes care of the rest. You don’t have to worry about whether or not the ad will render on different devices or screen sizes—it will.

And analytics are baked right in. You can go to a campaign dashboard to get detailed information on how well your campaign is performing in real time. And you can make changes to your ad at any point in the campaign.

What’s more, Adtile 360 videos are fully interactive. You control the viewing position by moving your mobile device around you. It’s like being in the center of the action, but instead of moving your head, you are moving the device itself. It’s virtual reality without the headset.

360 video has never been so smooth. Adtile’s Motion Technology makes sure interactions, such as rotating, pitching, or tilting the device, are as smooth and seamless as possible, with near-zero latency. Accurate orientation and motion processing are accomplished by combining outputs from at least three sensors: the accelerometer, the gyroscope, and the magnetometer, which are embedded directly into Adtile 360 to deliver a level of motion accuracy, quality, and scale not seen before in 360° videos on mobile browsers.
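
As an illustration of the kind of fusion involved (this is a generic, textbook-style approach, not Adtile’s own implementation), a basic complementary filter blends the gyroscope’s fast but drifting rotation rate with the slower but drift-free tilt derived from the accelerometer:

```typescript
// Illustrative complementary filter for one axis. The gyroscope integrates
// quickly but drifts over time; the accelerometer-derived tilt is noisy but
// does not drift. Blending the two gives a stable, low-latency angle.
const ALPHA = 0.98; // weight given to the gyroscope path

function fusePitch(
  prevPitch: number, // previous fused pitch, in radians
  gyroRateX: number, // rotation rate about the x axis, in rad/s
  accelY: number,    // accelerometer readings, in m/s^2
  accelZ: number,
  dt: number,        // time since the last sample, in seconds
): number {
  const gyroPitch = prevPitch + gyroRateX * dt;  // integrate the gyro
  const accelPitch = Math.atan2(accelY, accelZ); // tilt estimated from gravity
  return ALPHA * gyroPitch + (1 - ALPHA) * accelPitch;
}
// A magnetometer heading can be blended in the same way to correct yaw drift.
```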

We’ve already tested our 360° video ads on hundreds of devices, debugging our software and ensuring it will work for you. In fact, Adtile 360 currently supports over 600 software and hardware configurations.

With more people moving to mobile, brands need to find inventive ways to enrich the user experience. Adtile 360 sensor-enabled ads will breathe new life into your mobile experiences, so you can reach your customers regardless of what apps they have on their phones.

We are looking into interactive formats beyond 360, including VR/AR and live-streaming video. Adtile 360 is our video technology foundation. It’s just the beginning.

If you want to keep up with the latest developments, follow us on Twitter or send us an email at info@adtile.me.

Mobile 360° videos may be the holy grail

Nils Forsblom

Mobile 360° video is giving viewers a new perspective on what future videos can be like. But the technology is still in its infancy. If a few technical challenges can be overcome, like building software that detects gestures and motion accurately enough for seamless viewer interaction, 360° video has the potential to open the door to new levels of engagement for app developers, media outlets, advertisers, and, most importantly, consumers.

Before we get into that, let’s cover some background. The appeal of 360° videos (like this one of a dinosaur or this one of the Golden Gate Bridge) is they put you smack in the center of the action. You’re in charge. You can pan around to look up, down, behind you, and discover new details every time you watch a video. The viewer is no longer a passive observer, but actively engaging with the medium.

If you’ve got a headset like the Google Cardboard or Oculus, you can engage by moving your head to pan a scene. If you don’t have a headset and you’re watching from a desktop, you can use your mouse to pan the virtual world. Or, if you are on a smartphone, you can simply rotate, tilt, or otherwise move your device to interact with the 3D world before you — all of that is powered by sensors in the mobile device.
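
On the mobile web, that sensor input is usually exposed through the standard deviceorientation event. A minimal sketch of wiring it to a virtual camera might look like this (the camera object and its yaw and pitch fields are placeholders for whatever renders your 360° scene):

```typescript
// Map the browser's deviceorientation event to a virtual camera's yaw and
// pitch. `camera` stands in for whatever object renders the 360° scene.
const camera = { yaw: 0, pitch: 0 };

window.addEventListener("deviceorientation", (event: DeviceOrientationEvent) => {
  if (event.alpha === null || event.beta === null) return;
  camera.yaw = event.alpha;       // rotation around the vertical axis, in degrees
  camera.pitch = event.beta - 90; // forward/back tilt, re-centered on the horizon
});
```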

Since this article about “lightweight VR” appeared in VentureBeat last year, 360° videos have gone mainstream. Now, social media titans YouTube and Facebook both offer capabilities to upload and share virtual reality videos with the world at large.

Anyone with the right equipment can create their own videos. You just need a spherical camera rig (outfitted with six or more cameras) to capture what is happening from all directions. Next, you need stitching software, like Stitch, that can turn the individual videos into one seamless, panoramic view. And don’t forget to add your own metadata. (Some cameras do this for you automatically, but not all.)

For advertisers, 360° video is a watershed opportunity. Over the years, consumers have been bombarded with so many banner ads, 2D videos, and the like that now it’s almost impossible to get through to them. Overcoming ad desensitization is a growing problem for marketers, and 360° video may be the solution. The new medium has huge potential to breathe life back into advertising through engagement. Because when a consumer engages with an ad, you know you’ve got their full attention.

Now the question for the advertiser becomes: How do you keep 360° videos interesting? How do you create new interactions that are quirky and stimulating enough to grab and hold a viewer’s attention? The answer: make those interactions application-like. That requires overcoming a few major technical challenges.

Hurdle #1: Improving the framework for broad distribution

Right now, distribution of 360° video is not great, because the current formats are not designed to work across devices and browsers. As mentioned, the two leading ways to browse, discover, and view 360° video content are through YouTube and Facebook. Both have limitations. On a desktop computer, for example, only certain browsers support 360° videos. Chrome and Opera are compatible, but Safari and Firefox are not. The perspective controls in the browser (which involve clicking and dragging on the video while it’s playing) are also less intuitive and immersive than a headset or the motion controls on a mobile device.

Additionally, on a mobile device you can only view Facebook and YouTube 360° videos inside their apps. This makes it difficult to reach a broad range of viewers. It is very difficult to create an app that works perfectly on a vast range of devices and browsers — in fact, inconsistencies in hardware and software quality make it effectively impossible. Apps are always limited to working well in some environments and screen sizes, and not so well in others.

Hurdle #2: Getting better data from motion sensors

When it comes to viewing on a mobile device, another challenge with interactive 360° video is making sure interactions (such as rotating, pitching, or tilting the device, or walking with it) are as smooth and seamless as possible, with minimal latency. If you rotate your smartphone to view your virtual world from a different angle, you want the video to move as you move.

When you begin to interact with a 360° video on the level of a game, the quality of motion processing technology becomes paramount. The only way to obtain accurate orientation and motion measurements is by combining outputs from at least three sensors: the accelerometer, the gyroscope, and the compass. All three are built into most modern mobile devices. The problem is that these sensor readings are often not accurate enough when used separately, and the result is poor user interaction.
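
To get a feel for the raw data involved, the mobile browser exposes these readings through the standard devicemotion event. On their own, the accelerometer trace is jittery and the integrated gyroscope rate drifts, which is exactly why they need to be fused before driving a 360° viewer. A quick sketch:

```typescript
// Raw sensor access on the mobile web via the standard devicemotion event.
// Used separately, these readings are noisy (accelerometer) or drift over
// time (gyroscope), which is why they are normally fused together.
window.addEventListener("devicemotion", (event: DeviceMotionEvent) => {
  const accel = event.accelerationIncludingGravity; // m/s^2, gravity included
  const rate = event.rotationRate;                  // deg/s around each axis
  if (!accel || !rate) return;
  console.log(`accel  x=${accel.x} y=${accel.y} z=${accel.z}`);
  console.log(`gyro   alpha=${rate.alpha} beta=${rate.beta} gamma=${rate.gamma}`);
});
```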

Hurdle #3: Inventing cool new interactions to engage the viewer

One of the ways brands can make 360° videos more interesting to viewers is to incorporate new and creative types of engagement. For instance, a user might interact with objects in a 360° video that open up a multitude of distinct scenes and different worlds, creating countless unique experiences. Or brands can incorporate new types of motion into their experience so that, for example, taking steps in the physical world corresponds to the user stepping through the virtual world. Haptic or multi-dimensional feedback can also be used to guide the user to interact with a 360° video using a particular gesture or in another specified way. Or the user might be able to interact with the video space using the Air Pencil. 360° video might even give us a whole new social networking medium. The possibilities are endless, and you can come up with lots of ideas just by paying attention to what’s happening in the mobile hardware and VR hardware space.

But the key to developing any new type of interaction lies in tapping into the innate sensors in a mobile device to detect a viewer’s motion with a high degree of accuracy—and then making sure the virtual world responds seamlessly to those actions. All of this requires a high-quality motion sensor processing framework, machine learning, sophisticated algorithms, and design, all carefully tied into one beautiful user experience.

We are not there yet, but the future of 360° videos and immersive content is very promising. And it may bring a lot more inspiration to mobile applications and advertising by engaging with consumers in a way that pulls them in, instead of putting them off.

If you want to keep up with the latest developments, follow us on Twitter or send us an email at info@adtile.me.

PS: We are working on something called “360° Motion Video” and “Flightpath” — you’ll hear more from us on that soon!

Galileo meets MEMS

Nils Forsblom

Adtile Air Pencil

Galileo, father of modern physics, meets MEMS

What does the father of modern physics have to do with the tiny MEMS motion sensors in your smartphone? Plenty. A mathematician known for his pioneering observations of nature, Galileo (1564-1642) was the forefather of much of what we know about motion today.

The ultimate disrupter of his time, Galileo challenged Aristotelian theories people held true for centuries. For example, Aristotle claimed a rock fell to the ground because the two were made of the same element: earth. In contrast, Galileo studied quantifiable entities like time, distance and acceleration to explain what makes objects fall, break, and bend.

Galileo showed that force causes acceleration. Through his studies of falling bodies and parabolic trajectories, Galileo found that bodies fall with constant acceleration, and that gravity acts as a constant force. In other words, a constant force does not lead to constant speed but to constant acceleration. He also developed the concept of inertia: an object in motion keeps moving unless a force such as friction acts to stop it. Roughly half a century later, Isaac Newton (1642-1726) built on these ideas to develop his first law of motion.
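
In modern notation, Galileo’s free-fall result for a body dropped from rest under a constant acceleration g reads

v = g t,   d = (1/2) g t^2

so the speed keeps growing with time even though the acceleration itself never changes.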

Later researchers expanded on the discoveries of Galileo and Newton to develop mechanical devices like the gyroscope and the accelerometer, instruments that play a critical role in navigation systems. The problem was that these early mechanical devices were bulky and expensive, so scientists kept working to make them smaller and smaller.

Today gyroscopes and accelerometers have evolved into tiny devices called MEMS (micro-electro-mechanical systems). Built on silicon wafers alongside the circuits that control them, these are the smallest machines ever made.

It is pretty amazing to think that some of the MEMS motion sensors of today are rooted in principles of physics discovered centuries ago. Let’s take a look at four motion devices on the smartphone, how they work, and their history.

Gyroscope

If you’ve ever played with a top, you know that a spinning top somehow has the power to stand upright. Without a torque to change its direction, a spinning wheel will always remain pointed the same way. A gyroscope is essentially a spinning top mounted on a gimbal, so the top’s axis is free to orient itself any way it wants.

The first gyroscope was invented in 1852 by French physicist Leon Foucault to demonstrate the Earth’s rotation. Because the planet rotates beneath it, the gyroscope’s axis appears to sweep out a full rotation once every 24 hours.

Electric motors in the 1800s made it possible for gyroscopes to spin indefinitely. And today, gyroscopes play an essential role in inertial guidance systems on ships and aircraft.

Many types of MEMS gyroscopes exist. Each type has some form of oscillating component for detecting directional change. All MEMS gyroscopes take advantage of the [Coriolis effect](https://en.wikipedia.org/wiki/Coriolis_effect). Typically, the one in your smartphone is a three-axis gyroscope for measuring roll, pitch, and yaw.

Accelerometer

Have you ever wondered how your smartphone knows how you are holding it? It uses a device called an accelerometer, which measures g-force. The first accelerometer was invented by George Atwood in 1783.

If you want to know how an accelerometer works, picture a box with a metal ball inside suspended by a spring. If you move the box up, the ball lags behind, stretching the spring. By measuring the force on the spring, you can measure acceleration.
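
In this idealized picture, Hooke’s law and Newton’s second law combine to give

F = k x = m a,   so   a = (k / m) x

meaning that for a known spring constant k and proof mass m, reading off the spring’s displacement x gives you the acceleration directly.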

While the MEMS accelerometer in your phone is more complex than the simple ball-and-spring model, it uses the same fundamental principles. Inside the chip, engineers have created a tiny three-axis accelerometer out of silicon that measures acceleration in the x, y, and z dimensions, so your phone always knows which way is down.

Magnetometer compass

Used for navigation, the traditional compass has a magnetic needle that points north. The earliest compasses were most likely invented in China during the Han dynasty, around the second century BC, and were used for fortune telling. Later, compasses were used for navigation, so sailing vessels could set their direction without having to rely on the stars.

The magnetometer sensor in your smartphone doesn’t use a magnetic needle. Instead, it uses a miniature Hall-effect transducer that detects the Earth’s magnetic field along the x, y, and z axes.

The Hall-effect sensor produces a voltage proportional to the strength and polarity of the magnetic field along the axis each sensor is aligned with. That voltage is then converted to a digital signal representing the strength of the field.

The magnetometer is enclosed in a small electronic chip that often incorporates a three-axis accelerometer to determine which way is down.

GPS

The GPS receiver on your smartphone is a more modern invention—and technically it is not a sensor—but it works with the motion sensors in your device to more accurately determine movement and location.

Originally created by the US for military purposes, GPS (global positioning system) is a space-based navigation system made up of 24 satellites that circle the earth twice a day in a precise orbit. The satellites are spaced so that at least four are visible from any point on Earth at a given time.

Your GPS receiver requires data from at least four satellites to fix a position. It uses three satellites for trilateration to reduce your possible location to two points. The timing code from a fourth satellite is used to narrow down your location to one of those two points. (GPS satellites, by the way, tell very accurate time.)
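
Put more formally, the receiver solves four range equations in four unknowns: its position (x, y, z) and its own clock offset b,

rho_i = sqrt((x - x_i)^2 + (y - y_i)^2 + (z - z_i)^2) + c * b,   i = 1, …, 4

where (x_i, y_i, z_i) is the broadcast position of satellite i, rho_i is the range measured from the signal’s travel time, and c is the speed of light. The fourth equation is what lets the receiver solve for its own clock offset along with its position.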

Working together, these three MEMS sensors (gyroscope, accelerometer, compass) and the GPS receiver in your smartphone provide accurate data for navigation and for detecting a wide range of user motions. The next time you use your mobile device, you can thank Galileo, who opened humankind up to a new age of discovery and ultimately contributed to much of the smartphone’s innate intelligence.

Introducing Project Galileo

The Adtile Technologies team is planning to release a cutting-edge motion processing dev kit (Project Galileo) that will allow developers to create new interactions and applications. Please sign up for more updates.