How Sensor-Enabled Ads Will Change Mobile Advertising

Nils Forsblom


When it comes to mobile advertising, not a whole lot new has happened in the last five to 10 years. Banner ads and standard full-page interstitials still reign, despite the fact they do little more than disrupt the user experience and leave people feeling annoyed.

The problem is, most of these ads were originally designed for the desktop. They don’t translate well to mobile. Banner ads become smaller and too easy to inadvertently click on. And people resent having their mobile browsers hijacked and taken over by ads.

Next year, mobile ad spend will top $100 billion, according to predictions by eMarketer. If companies want to grab people’s attention in a positive way, they will need mobile ads that offer a uniquely mobile experience. One way to create this type of ad is to tap into the innate intelligence already in most smartphones today.

What Your Phone Knows

Your phone already knows how you are holding it and where you are headed, in what direction and how fast. Your phone even knows when you are holding it up to your face to take a call. This awareness, and more, comes from sensors inside the phone.

Most modern phones are packed with dozens of tiny sensors. The sensors we hear about most include the accelerometer for detecting movement and orientation, the gyroscope for measuring rotation around three axes, a compass to detect magnetic north and a GPS to plot your position on a map. Together, these sensors open the door to a new, refreshing type of mobile advertising.
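To make that concrete, the tilt part of this data is just trigonometry over the accelerometer's gravity reading. The sketch below is illustrative only (the helper name and sample values are ours, not any vendor's API); in a mobile browser, readings of this form arrive via the standard `devicemotion` event.

```javascript
// Derive pitch and roll (in degrees) from a gravity vector in m/s^2.
// In a browser, window.addEventListener("devicemotion", ...) supplies
// event.accelerationIncludingGravity with x/y/z fields like these.
function tiltFromGravity(x, y, z) {
  const pitch = Math.atan2(-x, Math.hypot(y, z)) * 180 / Math.PI;
  const roll  = Math.atan2(y, z) * 180 / Math.PI;
  return { pitch, roll };
}

// Phone flat on a table: gravity sits entirely on the z axis.
const flat = tiltFromGravity(0, 0, 9.81);    // pitch ≈ 0, roll ≈ 0

// Phone standing on its long edge: gravity moves to the y axis.
const upright = tiltFromGravity(0, 9.81, 0); // roll ≈ 90
```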

Ads That Take Two to Tango

If you look at what people do on their phones, aside from email and texting, they spend a huge amount of time on apps. People love their mobile apps. They value them for their entertainment and utility. Apps are also seen as nonintrusive. So why not create ads that behave more like apps?

That is the idea, at least, behind sensor-enabled motion ads. By tapping into the data from smartphone sensors and pairing it with well-defined algorithms, motion ads can engage users in a whole different way. You will be asked to twist, tilt, bend, push, pull, shake, rotate or otherwise play with an ad.

Take, for example, a motion ad with an image of a milkshake. The ad is minimalistic and visually beautiful. It asks you to shake your phone to blend your own milkshake. When you do, you get a coupon for the milkshake, which, by the way, you can redeem at the restaurant directly around the corner from where you are standing.
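Under the hood, a shake gesture like this can be detected by watching the magnitude of the acceleration vector. The following is a simplified sketch, not Adtile's actual implementation; the threshold and peak count are invented for illustration, and a browser ad would feed this from `devicemotion` events.

```javascript
// Count upward crossings of an acceleration-magnitude threshold; a few
// crossings inside the sample window reads as a deliberate shake.
function isShake(samples, threshold = 12, minPeaks = 3) {
  let peaks = 0;
  let above = false;
  for (const { x, y, z } of samples) {
    const mag = Math.hypot(x, y, z);
    if (mag > threshold && !above) { peaks++; above = true; }
    else if (mag <= threshold) { above = false; }
  }
  return peaks >= minPeaks;
}

// A resting phone reads gravity only; a shaken one swings well past it.
const still = Array(20).fill({ x: 0, y: 0, z: 9.8 });
const shaken = [];
for (let i = 0; i < 20; i++) shaken.push({ x: i % 2 ? 15 : 1, y: 0, z: 9.8 });
console.log(isShake(still), isShake(shaken)); // false true
```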

An airline is offering specials on flights from a nearby airport. By rotating your phone left or right, you can view the offers: one to Las Vegas, one to Hawaii and so on. If you see an offer you like, you can tap on it and download a coupon onto Apple Passbook or Google Wallet for purchase immediately or later when you get home.

The key to getting motion ads right is simplicity. A good design speaks for itself. Motion ads also offer a clear value exchange. Instead of having you click through to another website where you have to find the product and put it in a shopping cart, motion ads take care of everything for you in as few steps as possible.

Overcoming the Hurdles

As you might expect, creating motion ads comes with its own set of challenges. Ensuring a completely natural and intuitive experience requires some technical heavy-lifting. For instance, the ad needs to respond to user feedback in real-time. This requires sophisticated machine-learning algorithms that can recognize different types of motion and respond appropriately. Another challenge is creating these complex ads on the fly.
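To give a rough idea of what motion recognition involves, even a crude version has to turn a stream of sensor samples into a label. The rule-based sketch below uses invented thresholds in place of the trained models a production system would rely on; it only shows the shape of the problem.

```javascript
// Classify a window of acceleration magnitudes (m/s^2) with simple
// statistics. Thresholds are assumptions chosen for illustration.
function classifyMotion(magnitudes) {
  const mean = magnitudes.reduce((a, b) => a + b, 0) / magnitudes.length;
  const variance =
    magnitudes.reduce((a, b) => a + (b - mean) ** 2, 0) / magnitudes.length;
  if (variance > 20) return "shake";                        // rapid, large swings
  if (Math.abs(mean - 9.8) > 1.5) return "sustained-motion"; // steady push or pull
  return "still";                                            // close to gravity
}

console.log(classifyMotion([9.8, 9.7, 9.9, 9.8]));   // "still"
console.log(classifyMotion([2, 18, 3, 17, 2, 19]));  // "shake"
```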

Additionally, so that motion ads work on iOS, Android and whatever else is out there, they have to be platform-agnostic. They also have to display on any type of device or screen size. To accomplish all of this, ads have to be coded in open web technologies (CSS, HTML and JavaScript) and presented in WebViews.

As the world transitions to mobile only, marketers need a new type of premium programmatic mobile ad, one that contributes to positive brand experiences. These capabilities are all available today, as we’ve figured out here at Adtile. It’s now up to marketers to start taking advantage of the new opportunities available in mobile to create ads that establish real human connections because they were designed specifically with mobile in mind.

This article was originally published on AdAge →

How smartphone sensors will change mobile advertising

Nils Forsblom

How smartphone sensors will change mobile advertising and open the door to virtual reality

In just five years, smartphones have transformed our lives. It’s hard to imagine how much they will change in the next five years, but one thing is for sure: Publishers and advertisers have just begun to unlock all of the potential of what smartphones can do.

Our phones will become increasingly aware of their surroundings. And they will become platforms for rich media engagements, where we interact with 3D environments through gestures and movement. Believe it or not, much of the technology for making this happen is already sitting inside your phone.

Your phone knows a lot about you

Many people don’t realize how smart their smartphones really are. Modern phones are packed with dozens of tiny sensors, some as thin as paper. Every year those sensors become smaller and more sophisticated.

To begin with, your phone has a three-axis accelerometer for sensing gravity and tilt and a gyroscope to determine orientation. It has a compass for sensing direction, and several environmental sensors for measuring ambient air temperature and pressure, illumination, and humidity. There is also a proximity sensor for recognizing when you move your phone up to your face during a call and an ambient light sensor for boosting brightness levels in dark environments. The list goes on.

New sensors—and there are many—invite new possibilities. Apple’s iPhone 6 uses its barometer to track vertical movement. Chemists at MIT have developed a smartphone sensor that detects when food has gone bad. UV light sensors now being tested by ROHM may one day tell you when to wear sunscreen. What about detecting carbon monoxide levels or air quality? All of it is possible with the right sensor.
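The barometer trick works because air pressure falls predictably with height. As a hedged sketch, the standard international barometric formula converts a pressure reading to an approximate altitude (the function name is ours):

```javascript
// Approximate altitude above sea level from a barometric pressure
// reading, using the international barometric formula. Assumes
// standard sea-level pressure unless a local reference is supplied.
function pressureToAltitudeMeters(hPa, seaLevelhPa = 1013.25) {
  return 44330 * (1 - Math.pow(hPa / seaLevelhPa, 1 / 5.255));
}

console.log(pressureToAltitudeMeters(1013.25)); // 0 (at the reference pressure)
console.log(pressureToAltitudeMeters(1000));    // roughly 110 m
```

Climbing a single floor changes the reading by only a fraction of a hectopascal, which is why the sensor has to be sensitive, and why it pairs well with the accelerometer for counting flights of stairs.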

VR brings new levels of creative engagement

A major field of innovation in smartphones will be VR. The phones of the future might look something like Oculus Rift meets iPhone, only without the clunky headset. Instead, you'll have lightweight VR, practical for more casual everyday use.

Lightweight VR won’t offer the full immersive experience of a headset, but you won’t need that either. What it offers instead is convenience. To use it, you simply extend your phone out in front of you like you are taking a selfie. When you look into the screen, you see another world. And when you move, the 3D image on the screen moves with you.

Lightweight VR has numerous practical applications. You can use it to navigate any type of complex space. Imagine the advantages of using interactive 3D to help you find your way through a confusing airport. A lightweight VR experience might guide you to your departure gate, the baggage claim, or a nearby restaurant if you have a long layover.

Additionally, lightweight VR will play an increasingly important role in how companies market their products. Instead of bombarding consumers with static ads, marketers can use VR to invite customers to engage in an experience. You can explore a vehicle, restaurant or hotel, or visit a faraway resort. VR can tell you volumes more about a place or even an object than a video or a high-res photo can.

Mobile VR comes with difficult challenges, however. To offer a natural, convincing experience, the user interface needs to be completely intuitive. Sensor data and machine learning will play a large part in making that possible. Sophisticated algorithms will enable your smartphone to calculate your precise movements so that the 3D image on your smartphone screen moves with you with a minimal amount of latency.
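One widely used technique for this kind of low-latency tracking is a complementary filter, which trusts the fast-but-drifting gyroscope over short intervals and the noisy-but-absolute accelerometer over long ones. This is a generic sketch of the idea, not Adtile's algorithm; the blend factor is a typical but assumed value.

```javascript
// Fuse gyroscope rate and accelerometer-derived angle into one smooth,
// low-latency orientation estimate. alpha near 1 favors the gyro.
function makeComplementaryFilter(alpha = 0.98) {
  let angle = 0; // e.g. pitch, in degrees
  return function update(gyroRateDegPerSec, accelAngleDeg, dtSec) {
    angle = alpha * (angle + gyroRateDegPerSec * dtSec)
          + (1 - alpha) * accelAngleDeg;
    return angle;
  };
}

// Held still at a 10° tilt, the estimate converges toward 10 even
// though it starts at 0 and the gyro reports no rotation.
const update = makeComplementaryFilter();
let est = 0;
for (let i = 0; i < 500; i++) est = update(0, 10, 0.01);
console.log(est.toFixed(1)); // "10.0"
```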

Over the next few years, mobile VR will likely be developed to a greater extent. Your smartphone will get to know you and your habits like a close friend. Interactions between you and your phone will be smooth, natural and intuitive, and VR will be right there, ushering in a world of new experiences.

This article was originally published on MediaPost →

Sensor-based mobile engagement: Intuitive and delightful

Nils Forsblom

Adtile Mobile VR

Annoying and disruptive. Those are the two words that best describe the mobile advertising experience today. Tiny banner ads you can barely decipher and end up clicking on. Those clicks take you to far-away web pages that take too long to load. And let’s not even talk about those unwelcome videos and sounds that trigger automatically.

The problem is most mobile ads are nothing more than mini-desktop ads. They provide no entertainment value or utility, and they fail to take advantage of the things that make smartphones unique. On the other hand, people love mobile apps. What if mobile ads were more like apps?

Tapping the ‘smart’ in smartphones

One of the things that make smartphones unique is their built-in intelligence. Unlike desktops, smartphones are packed with dozens of tiny sensors — an accelerometer, gyroscope, digital compass, to name a few. These sensors collect all kinds of data on us and the world around us.

By putting that native intelligence to work, it’s possible to create a genuine interactive experience for the user. That’s the concept behind the work at Adtile Technologies. We create native-mobile ads and experiences that engage people in new and delightful ways. Adtile products include Motion Ads and Adtile VR.

Creative that invites users to play

You can think of an Adtile Motion Ad as a tiny app or a game that provides a clear value exchange to the user. You interact with an ad by tilting, shaking, turning, or otherwise playing with it, and you are aptly rewarded with a coupon for a drink, a discount, or some other reward.

For example, a coffee shop asks you to tilt your phone to fill up a virtual cup with coffee. Another ad encourages you to shake your phone to create your own milkshake. And yet another asks you to press on a virtual button and hold it until 150 users join you in doing the same thing simultaneously. In return, you receive a coupon for a product. The ad also tells you exactly where to go to redeem the coupon: the coffee shop 200 feet away, for instance.

Event sequencing and dynamic visual feedback are the key to these experiences. As you draw a heart in the air with the phone, the heart appears on the screen. As you tilt the phone to fill up a cup with coffee, you see coffee pouring into the cup with zero lag between the movement and the action. The result is a completely natural, intuitive interaction between the ad and the user.
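That zero-lag feel comes from mapping the sensor reading straight onto the animation on every frame, rather than waiting on a round trip. A toy version of the coffee-pour loop, with invented thresholds, might look like this:

```javascript
// Map the current tilt directly to a pour rate each frame, so the
// visual fill level tracks the gesture. POUR_START and the 60° span
// are illustrative values, not Adtile's.
function pourStep(fillLevel, tiltDeg, dtSec) {
  const POUR_START = 30;                                  // pouring begins past 30°
  const rate = Math.max(0, (tiltDeg - POUR_START) / 60);  // 0..1 cup per second
  return Math.min(1, fillLevel + rate * dtSec);
}

// At a full 90° tilt, 60 frames at 60 fps fill the cup in one second.
let fill = 0;
for (let i = 0; i < 60; i++) fill = pourStep(fill, 90, 1 / 60);
// fill is now ≈ 1 (a full cup)
```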

The ads are also non-intrusive, appearing as a natural extension to whatever content a user is viewing. You never feel like an ad is hijacking your phone.

Adtile’s motion-sensing software is built using standard web technologies — HTML, CSS and JavaScript — and delivered as WebViews via Adtile native SDKs, standard mobile tags or MRAID-compatible SDKs. The Adtile Motion Framework has a very light footprint, only around 15 KB, and comes with a full set of design and developer guidelines, similar to what you would get from Google, Apple or Microsoft when developing apps.

Constructing your own ad is easy and straightforward. Adtile’s Motion Store provides a vast number of pre-designed and pre-coded consumer experiences to choose from. You simply add your own creative media and storytelling. The responsive ads work on iOS or Android and adjust to any size screen.

Virtual Reality: One step beyond

Mobile VR takes the notion of user engagement a leap further. Some refer to Adtile’s VR framework as lightweight VR because it doesn’t require the awkward goggles of traditional immersive VR. You simply hold the phone out in front of you, look into the screen, and you see a window into another world.

And you are connected to that world. Adtile VR uses sensor technology to calculate your precise movements in space so the 3D rendering on the smartphone moves with you. Walking, turning or pitching the phone up or down changes the view on the phone with a minuscule amount of latency.

By incorporating the Adtile VR framework into their native or web-based apps, developers can create a world where users can explore places and objects in a new manner. You can literally walk around a car in a showroom or an art piece at a museum, examining the object from every conceivable angle.

Beyond advertising, lightweight VR is useful in helping people find their way around a complex area, such as an airport, shopping mall, sports stadium or even a museum. You can use VR to explore a place you plan on visiting, such as a hotel or resort. Combining mobile VR with iBeacon technology unlocks even greater mapping and navigation potential.

Adtile VR overcomes tough challenges to work on a smartphone. The technology is able to detect movements with a high degree of accuracy and translate those into smooth motion on the screen. This is done with sophisticated algorithms that precisely capture everything from gestures and arm movements to the number of steps a user is taking in real time.
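Step counting, one of the signals mentioned, is often sketched as peak detection on the vertical acceleration trace. Real pedometers add filtering and adaptive thresholds; the minimal version below is ours, with an invented threshold, for illustration only.

```javascript
// Count steps as upward crossings of a vertical-acceleration threshold.
// The 11 m/s^2 threshold (just above gravity) is an assumption.
function countSteps(verticalAccel, threshold = 11) {
  let steps = 0, above = false;
  for (const a of verticalAccel) {
    if (a > threshold && !above) { steps++; above = true; }
    else if (a <= threshold) { above = false; }
  }
  return steps;
}

// Synthetic walk: gravity baseline with one bounce per step.
const trace = [];
for (let i = 0; i < 200; i++) trace.push(9.8 + 3 * Math.sin(i * 0.25));
console.log(countSteps(trace)); // 8 peaks in this synthetic trace
```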

With today’s busy lifestyles, most people spend only a few minutes at a time on mobile apps. Lightweight VR is intended for that sort of casual use, so you don’t have to worry about it draining the battery or generating excessive heat the way immersive VR does. Also, lightweight VR uses minimal bandwidth. The Adtile VR JavaScript framework is around 450 KB.

The future is mobile-only, not mobile-first. By engaging mobile users through interactive Motion Ads and mobile VR, brands will stand a much better chance of winning over customers’ hearts, instead of annoying and alienating them.

Virtual reality without the geeky headsets

Amy Castor


When most people hear the words “virtual reality,” they think of a headset like the Oculus Rift or Samsung Gear VR. But while headsets or goggles offer a great immersive experience, they do not scale well beyond gaming or watching movies. Let’s face it, headsets attract too much attention outside of the home. And, as Google Glass taught us, people aren’t that keen to wear computers on their heads. So where does that leave virtual reality?

Lightweight VR for everyday use

Virtual reality can exist on smartphones without the headset. You might call it lightweight VR. And it offers practical real-world advantages, says Nils Forsblom, the CEO of San Diego-based Adtile Technologies. His company is working on a VR framework that lets you enter a virtual reality world simply by looking at the screen on your phone and interact with that world through motion.

Lightweight VR works something like this. You hold the phone up and away from your face and you see a window into another world. You are also a part of that world. Adtile uses a phone’s innate sensors to calculate your precise movements so that the 3D image on your smartphone screen moves with you. Walking, turning left or right, or pointing the phone in a different direction changes the view on the phone with a minuscule amount of latency.

Real-world applications

You won’t get the full immersive experience with lightweight VR that you would with a headset, but you won’t need it either. What you get instead is convenience. You can use the technology to navigate any type of complex space. Imagine the advantages of using interactive 3D to help you find your way through a confusing airport. A lightweight VR experience might guide you to your departure gate, the baggage claim, or the nearest restaurant if you happen to have a long layover.

VR can also help you explore places where you are not. If you are planning an upcoming trip to Hawaii, for example, you could use lightweight VR to explore the hotel lobby, check out your room or venture out into the pool area. VR can tell you a lot more about a place or even an object than a simple video or a high-res photo can.

Lightweight VR has other, potentially even more interesting applications when you consider combining it with Apple’s iBeacon, a technology that detects how close your smartphone is to a certain location. iBeacon can send out messages to you based on your location, inviting you to explore something in VR — or even putting you on the right path, if you are trying to get somewhere important.
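For the curious, iBeacon proximity is typically estimated by comparing received signal strength (RSSI) against the beacon's calibrated power at one meter, using a log-distance path-loss model. The function name and the environment factor `n` below are assumptions for illustration:

```javascript
// Estimate distance to a beacon from its RSSI reading (dBm).
// txPowerAt1m is the beacon's calibrated RSSI at 1 m; n models the
// environment (2 is free space; indoor clutter pushes it higher).
function estimateDistanceMeters(rssi, txPowerAt1m = -59, n = 2) {
  return Math.pow(10, (txPowerAt1m - rssi) / (10 * n));
}

console.log(estimateDistanceMeters(-59)); // 1 (at the calibration distance)
console.log(estimateDistanceMeters(-79)); // 10 (20 dB weaker, free space)
```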

Not without challenges

Of course, developing even a lightweight VR for the smartphone has its challenges. You have to be able to detect movements with a high degree of accuracy and translate those into smooth motion on the screen. This requires sophisticated algorithms able to capture the scale of short-, medium- and long-range motions in real time, everything from gestures (tilting, shaking the phone) to arm-length movements to how many steps you are taking and how fast.

Immersive VR has a tendency to generate heat and use up battery power in a phone. But as opposed to gaming, where you are using the phone for hours at a time, lightweight VR is intended more for casual use. You are using it for a few minutes at a time to learn about an area or observe an object. Also, lightweight VR doesn’t use up nearly as much bandwidth on your phone. According to Forsblom, the HTML5-based technology he is working on is only 450 KB in file size. Eventually he plans to release the Adtile SDK to app developers who want to deliver their own lightweight VR experiences.

VR is still in its infancy. We are hearing a lot about different headsets and goggles, even cardboard ones like the one from Google, where you slip your phone into the headset. But it seems there is potential for a type of VR we can all use every day, without the clunky headsets—a simpler VR that helps us get around.

This article was originally published on Venturebeat →

Adtile introduces Mobile VR

Leslie Van Every

Press Release, March 17th, 2015

Adtile Technologies today announced its new mobile software platform, Adtile VR, which allows users to experience virtual reality on smartphones without the need for special goggles or any other external hardware. This new software creates an easy-to-use, smooth and natural experience for real-time motion simulations. By incorporating Adtile VR into native or web-based apps, users can walk through a digital space, turning left and right as well as looking up and down, and interact using touch controls.

“With mobile being the largest computing medium of all times, VR capabilities need to be mainstream and additional hardware makes scaling hard. Since Adtile Mobile VR does not rely on any ancillary equipment, the application possibilities are endless.”

“Adtile VR is a composition of physics, math, computer science and art that goes far beyond advertising. It is a new creative medium for smartphones, allowing developers to go truly beyond the edges of the phone for the first time.”

– Nils Forsblom, CEO and founder, Adtile Technologies

There are three distinct ranges of motion for mobile devices—short, medium, and long range. Adtile Technologies developed sophisticated algorithms using smartphone sensor technology to accurately capture the correct scale and sensitivity of the motion. This new breed of technology is called Space Sensing Mobile VR Technology.

Adtile VR’s blend of short-range, arm’s-length gestures with medium- to long-range motions allows for accurate physical movement in virtual space. As a result, users can have a real connection with the surroundings and content on their smartphones. Space Sensing Mobile VR Technology can be used in many different industries, including publishing, fashion, real estate, retail, entertainment, automotive, science and more.

Adtile VR

About Adtile Technologies

Located in San Diego, California, Adtile is a pioneer and developer of motion engagement and virtual reality technology for smartphones and tablets. We are working with leading technology companies and Fortune 500 brands. Adtile is on a quest to transform mobile advertising. We believe the best way to revolutionize mobile ads is to create an entirely new design that embraces the needs of mobile users by making ads part of the user experience. We’ve created a mobile-first advertising solution from the ground up—challenging assumptions about how ads work and redesigning them for a world on the move. More information is available at:

The future of smartphones

Amy Castor

The MIT researchers' wireless chemical sensor
Above: New sensor can transmit information on hazardous chemicals or food spoilage to a smartphone. Image credit: MIT/Melanie Gonick

A combination of sensors, machine learning and virtual reality

Imagine you walk by an Italian restaurant and your phone knows exactly where you are. It knows you love gnocchi and you even traveled to Milan recently. It offers you not simply a coupon, but an immersive experience where you get to explore the restaurant virtually to see what people are eating and visit the kitchen to see how food is prepared. Tempting?

Over the last decade, smartphones have evolved from simply phones to portable entertainment centers. We use them to text, watch movies, and keep ourselves occupied. Now smartphones are about to evolve further. Sensor data combined with machine learning and virtual reality will usher in a new wave of engagement, convenience, and utility. Interestingly enough, much of that technology is sitting inside our phones right now.

Your smartphone is smarter than you think

Most people don’t realize how smart their phones actually are or how much they already know about us. Unlike laptops, modern smartphones are packed with dozens of tiny sensors that enable them to collect all kinds of data on who we are, what we are doing, and the world around us.

Accelerometers and gyroscopes are the sensors we hear about the most. These have the potential to collect data on us even when we are not actively using the phone. But most smartphones also have an image sensor, touch sensor, proximity sensor, and up to 30 other sensors, including GPS for location.

New sensors are being developed all the time. Each one opens the door to new possibilities. Chemists at MIT recently developed a smartphone sensor that detects when food has gone bad. Imagine using your phone to check whether the rotisserie chicken you brought home three days ago is still safe to eat.

Sensors make our phones more aware. But sensors themselves only collect the raw data. Putting that data to use requires machine learning. By searching for patterns in the data, intelligent apps can figure out whether you are tall or short, big or little, and even guess at gender. It may sound spooky at first, but not so when you consider how useful apps will become.

Apps of the future think on their own

The most intelligent apps will use sensor-based data to provide contextual information. We have seen examples of this already in first-generation fitness apps that track how fast and how far you are walking or running. And many apps, such as OpenTable, Uber, and Yelp, use GPS as their main component to serve information based on our location.

You may already be familiar with Apple’s iBeacon technology: tiny wireless transmitters used widely by retailers, airports and even the NBA and NFL to deliver finely tuned content to your smartphone based on your location.

Some apps today are even crowdsourcing sensor data for traffic and weather forecasts. Consider how Google gathers smartphone GPS data and sends it back to users as accurate route-time estimates. Another company, PressureNet, is working to pull barometer readings from smartphones to improve weather and climate predictions.

But tomorrow’s mobile apps will employ sensor information to a far greater extent. These apps will pick up on patterns and routines and learn a user’s preferences over time. “Anyone can collect data. Finding an automated way to create the meaning of that data is paramount,” says Nils Forsblom, the founder of Adtile, a company working on new ways to use machine learning and virtual reality for marketing.

Future apps will usher in a new level of convenience. Instead of asking for input, they will anticipate your needs. Your phone might send calls to voice mail if you are driving or switch into Airplane Mode when it senses a plane moving on the tarmac. An app might hear people talking in a conference room and ask, ‘Do you want to record the meeting?’

Virtual reality adds a new level of creative engagement

But what happens when you mix sensor data and machine learning with virtual reality? Mobile devices may one day deliver immersive experiences, bringing inanimate objects to life and letting you do things like walk around a sculpture or explore the latest exhibit at a museum.

“The phones of the future might look something like Oculus VR meets iPhone—without the headset,” says Forsblom. Oculus is a headset that delivers virtual reality to the smartphone, but Forsblom predicts smartphones will deliver immersive experiences without the headsets.

Advertising may no longer interrupt whatever you are doing or reading, but take the form of an active engagement. You might use your phone as an extension of yourself to walk through a car showroom. If you see something you like, you can use gestures and motions to explore a car in more detail, get more information, or sign up for a test drive.

“In the future, smartphone hardware and software will work in seamless harmony. Future mobile devices will be a mix of invisible apps for utility, entertainment, virtual reality, and gaming. Mobile virtual reality will be the ultimate input-output ‘device’ and creative medium,” says Forsblom.

The next few years will likely see dozens of new apps that use sensors in all kinds of mind-boggling ways. Our smartphones will become more like personal assistants that understand our preferences, habits, likes, and dislikes. And virtual reality has the potential to take that one step further, allowing us to explore places and objects without having to get off the sofa—now that’s convenience.

This article was originally published on Venturebeat →