Light Painting Enters The 21st Century

Nils Forsblom

Image: Gjon Mili/The Life Picture Collection/Getty Images

Is it possible to reimagine Picasso and Mili’s work with a smartphone? Light painting enters the 21st Century.

The year is 1949. In a darkened room in Vallauris, South of France, Pablo Picasso is working with a small electric light. The 68-year-old artist’s moves are swift and athletic as he draws lines, curves and shapes on a canvas of thin air. Those images of pure light were captured by the photographer Gjon Mili for Life Magazine.

Mili, an engineer by trade, made a name for himself with light paintings and photoflash photography. Previously, he had attached tiny flashlights to the boots of an ice-skater and recorded trailing lines of light as she waltzed through the air. He showed that work to Picasso who agreed to an experiment of combining art with Mili’s technical expertise.

So fascinated was Picasso by the results of their initial session that he agreed to five further sessions with Mili, which together produced some 30 images. Featured were familiar Picasso motifs: bulls, centaurs and Greek profiles.

Capturing light and movement through art was nothing new. It had been done in various forms since 1889, but Picasso’s pictures, which appeared in Life, were iconic and transformative. (You can read more about the history of light painting photography here.)

At the time, Mili was using cutting-edge equipment and techniques to construct his photos. Capturing motion on film demanded a combination of high technology, skill and finely honed talent.

Image: Gjon Mili/The Life Picture Collection/Getty Images

Today, you do not have to be a Mili to create these types of images. Adtile Technologies, a company known for its sensor-enabled Motion Ads, has made it possible to capture motion with a smartphone, so that anyone, at any skill level, can create these highly technical images.

Introducing the Air Pencil

Today, we’re introducing something very cool that has never been done before: the Air Pencil. Air Pencil (now in beta) lets anyone capture freeform movement in space using their mobile device. How does it work? Think of your phone as Picasso’s electric light. Move, swing, or glide the smartphone through the air and your motions will be captured, not on camera, but on screen.

Additionally, while Picasso and Mili’s light paintings were in 2D, Air Pencil lets you capture motion in 3D, as beautiful lines, curves and shapes you can literally move through and explore in ways never imagined possible in the time of Picasso.

Air Pencil is intuitively easy to use. It is a lightweight app that runs in a mobile web browser. Since it is a web app, the majority of the code powering it runs on a remote server. All you need to get started is a URL.

Air Pencil — a whole new creative freedom.

When you launch the app, you’ll see a screen with a three-axis helper and a small red position indicator at the origin. On the bottom left and right of the screen are joysticks for panning and zooming. These items all sit within the web browser’s viewport, which you can think of as your virtual camera in the 3D space.
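
Adtile has not published how the Air Pencil viewport is implemented, but as a rough sketch of how such a scene could be assembled, assuming a WebGL library like three.js (the camera placement and marker size are illustrative guesses, not Adtile’s code):

```typescript
// Sketch of an Air-Pencil-style viewport: axes helper, red origin marker,
// and a perspective camera acting as the "virtual camera" into 3D space.
// Assumes three.js; values are illustrative, not Adtile's implementation.
import * as THREE from "three";

const scene = new THREE.Scene();

// The browser viewport acts as a virtual camera into the drawing space.
const camera = new THREE.PerspectiveCamera(
  60,                                     // field of view, degrees
  window.innerWidth / window.innerHeight, // aspect ratio
  0.1,
  100
);
camera.position.set(2, 2, 4);
camera.lookAt(0, 0, 0);

// Three-axis helper and a small red position indicator at the origin.
scene.add(new THREE.AxesHelper(1));
const marker = new THREE.Mesh(
  new THREE.SphereGeometry(0.03),
  new THREE.MeshBasicMaterial({ color: 0xff0000 })
);
scene.add(marker);

const renderer = new THREE.WebGLRenderer({ antialias: true });
renderer.setSize(window.innerWidth, window.innerHeight);
document.body.appendChild(renderer.domElement);

function animate(): void {
  requestAnimationFrame(animate);
  renderer.render(scene, camera);
}
animate();
```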

Sketch abstractions. Out of thin air.

To begin recording your movements, simply press down on any point on the screen and move your smartphone through the air. Release the screen to stop or pause the recording. Want to share your work? You can easily send a recorded 3D space file to anyone as a link in a text message or post it on social media.
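
The post doesn’t describe the implementation, but the press-to-record interaction maps naturally onto standard browser events: touchstart and touchend simply gate whether device-motion samples get appended to the current stroke. A minimal sketch, with a made-up sample format:

```typescript
// Press-to-record sketch: sample device motion only while a finger is on
// the screen. The Sample shape and stroke buffer are hypothetical.
type Sample = { t: number; x: number; y: number; z: number };

let recording = false;
const stroke: Sample[] = [];

window.addEventListener("devicemotion", (e: DeviceMotionEvent) => {
  if (!recording || !e.acceleration) return;
  stroke.push({
    t: e.timeStamp,
    x: e.acceleration.x ?? 0,
    y: e.acceleration.y ?? 0,
    z: e.acceleration.z ?? 0,
  });
});

// Press anywhere to record, release to stop or pause.
document.addEventListener("touchstart", () => { recording = true; });
document.addEventListener("touchend", () => { recording = false; });
```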

Behind-the-scenes technology

As you can imagine, creating an app like Air Pencil requires serious technical engineering. You have to be able to capture three-dimensional motion with a high degree of accuracy, whether the user is swinging the phone in a large elaborate swirl or drawing a tiny circle.

To do this, Air Pencil taps into a smartphone’s native micro-electro-mechanical systems (MEMS): the three-axis magnetometer, three-axis accelerometer and three-axis gyroscope. It then calls on sophisticated machine-learning algorithms to reliably infer the user’s precise movements from the sensor data.
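
To see why that inference step matters, consider the naive alternative: integrating raw acceleration twice to recover position. It works on paper but drifts badly within seconds, which is why filtering, sensor fusion and learned models are needed. A deliberately naive sketch of that double integration, using the browser’s devicemotion event (not Adtile’s actual pipeline):

```typescript
// Naive dead reckoning: integrate acceleration twice to estimate position.
// Shown only to illustrate the problem; raw integration accumulates error
// quickly, so real systems rely on filtering and sensor fusion instead.
type Vec3 = { x: number; y: number; z: number };

let velocity: Vec3 = { x: 0, y: 0, z: 0 };
let position: Vec3 = { x: 0, y: 0, z: 0 };
let lastT: number | null = null;

window.addEventListener("devicemotion", (e: DeviceMotionEvent) => {
  const a = e.acceleration;          // gravity already removed by the OS
  if (!a || a.x == null || a.y == null || a.z == null) return;

  const t = e.timeStamp / 1000;      // seconds
  if (lastT === null) { lastT = t; return; }
  const dt = t - lastT;
  lastT = t;

  // First-order Euler integration: v += a*dt, p += v*dt
  velocity = {
    x: velocity.x + a.x * dt,
    y: velocity.y + a.y * dt,
    z: velocity.z + a.z * dt,
  };
  position = {
    x: position.x + velocity.x * dt,
    y: position.y + velocity.y * dt,
    z: position.z + velocity.z * dt,
  };
});
```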

Adtile Air Pencil

Where would you use such an application? The answer: education, art, and collaboration. For example, you might use Air Pencil in a physics class to teach students about flight dynamics, or how objects move through space. As a collaboration tool, several users can draw independently with different colors and then combine those images on screen. Finally, the application has huge potential in the art world as the images you create can be shown on any size screen.

An art-inspired technology

Picasso and Mili were the inspiration for Air Pencil. Picasso, the father of modern art, liked to experiment with a plethora of media. And Mili brought a high-tech element into that instinctive world of art.

The truth is, I wanted to go back in time and recreate the flashlight and camera technique with a phone and see what people would do with it. What kind of art will they create?

At a basic, intrinsic level, art inspires technical innovation. The two are inextricably linked. In fact, I got the inspiration for Adtile in 2013 when I was visiting the Alexander Calder exhibit at LACMA. At the exhibit, quotes from the sculptor lined the walls: “Just as one can compose colors, or forms, so one can compose motions.”

That was my aha moment. That quote influenced me to start working on sensor-enabled motion for mobile devices and to turn Adtile into a motion technology company.

I rely on artists daily to benchmark all of our products at Adtile. Our work is driven by Picasso, Calder, James Turrell, Mark Rothko, Anish Kapoor, Andy Warhol, Julian Schnabel and more. The work of these artists is an endless pool of inspiration for innovation.

When you get down to it, art is about taking something technologically or emotionally complex and turning it into something simple, functional and beautiful. It’s also about making people smile. Or as Calder said, “Above all be happy.” And that is what I hope to do with Air Pencil: make people smile.

Were You Aware of All These Sensors In Your Smartphone?

Nils Forsblom

Image: © 2015 Adtile Technologies

Smartphones have gone through an incredible evolution in the last decade. We are moving to an era where our smartphones are becoming more like personal assistants, monitoring our behavior, tracking our movements and anticipating our needs. A large part of this evolution is enabled by sensor technology.

Sensors bring intelligence and awareness to our smartphones. Today’s mobile devices are packed with more than a dozen sensors that produce raw data on motion, location and the environment around us. This is made possible by the use of micro-electromechanical systems (MEMS): miniature mechanical systems built into semiconductor chips.

Let’s take a look at some of the major sensors in the typical smartphone.

Magnetometer and GPS

Your smartphone comes equipped with a magnetometer, otherwise known as a compass. With its ability to sense magnetic fields, this MEMS device detects compass heading relative to the Earth’s magnetic north pole. In conjunction with GPS, it determines your phone’s location. GPS is another type of sensor in your mobile device. It relies on satellites to determine location. Originally developed for the military, GPS was made available for everyone in the 1980s.
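
On the mobile web, both readings surface through standard APIs: the fused compass heading arrives with the deviceorientation event (iOS Safari exposes a non-standard webkitCompassHeading field), and position comes from the Geolocation API. A short sketch; the heading conversion is a common approximation, not a guaranteed formula:

```typescript
// Compass heading from deviceorientation; position from the Geolocation API.
window.addEventListener("deviceorientation", (e: DeviceOrientationEvent) => {
  // iOS Safari: webkitCompassHeading. Elsewhere: 360 - alpha is a common
  // approximation when alpha is referenced to north.
  const heading =
    (e as any).webkitCompassHeading ?? (e.alpha !== null ? 360 - e.alpha : null);
  if (heading !== null) {
    console.log(`Compass heading: ${heading.toFixed(1)} deg`);
  }
});

navigator.geolocation.getCurrentPosition(
  (pos) => console.log(`Lat ${pos.coords.latitude}, lon ${pos.coords.longitude}`),
  (err) => console.error("Location unavailable:", err.message)
);
```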

Gyroscope

A three-axis gyroscope detects whether your device is being twisted in any direction, measuring angular velocity around three axes. The absolute orientation of your phone, represented as the angles yaw, pitch and roll, is derived from a combination of the accelerometer, compass and gyroscope.
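
In browser terms, the raw angular velocity is reported as rotationRate on the devicemotion event, while the fused yaw, pitch and roll arrive as alpha, beta and gamma on deviceorientation. A brief sketch of reading both:

```typescript
// Raw gyroscope rates (deg/s) vs. the fused orientation angles (degrees).
window.addEventListener("devicemotion", (e: DeviceMotionEvent) => {
  const r = e.rotationRate;
  if (r) {
    console.log(`angular velocity: alpha=${r.alpha} beta=${r.beta} gamma=${r.gamma} deg/s`);
  }
});

window.addEventListener("deviceorientation", (e: DeviceOrientationEvent) => {
  // alpha ~ yaw, beta ~ pitch, gamma ~ roll
  console.log(`yaw=${e.alpha} pitch=${e.beta} roll=${e.gamma}`);
});
```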

Accelerometer

A three-axis accelerometer in your smartphone reports how quickly your phone is accelerating along each linear axis. It detects gravity as a static acceleration as well as the dynamic acceleration applied to the phone. There are various types of MEMS accelerometer hardware, such as microscopic piezoelectric crystals whose voltage changes under the stress of vibration, or structures whose differential capacitance changes as a silicon mass moves. The magnetometer, GPS, gyroscope and accelerometer in your phone all work together to form a complete navigation system.
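
That static/dynamic distinction is visible directly in the browser’s devicemotion event, which reports acceleration (gravity removed) alongside accelerationIncludingGravity, both in m/s². A short sketch:

```typescript
// Linear (dynamic) acceleration vs. the reading that includes gravity.
window.addEventListener("devicemotion", (e: DeviceMotionEvent) => {
  const linear = e.acceleration;                      // gravity removed
  const withGravity = e.accelerationIncludingGravity; // includes ~9.81 m/s^2
  if (linear && withGravity) {
    console.log(
      `linear z=${linear.z?.toFixed(2)} m/s^2, ` +
      `with gravity z=${withGravity.z?.toFixed(2)} m/s^2`
    );
  }
});
```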

Proximity sensor

Consisting of an infrared LED and an IR light detector, a proximity sensor detects how close the phone is to an outside object, such as your ear. This sensing is used to reduce display power consumption while you’re on a call by turning off the LCD backlight; it also disables the touch screen to avoid inadvertent touches by your cheek.

Barometer

More advanced smartphones have a chip that can detect atmospheric pressure. But to turn raw pressure readings into useful altitude estimates, the phone needs to pull down local weather data for a baseline barometric pressure. What’s more, conditions inside a building, such as heating or air-conditioning flows, can affect the sensor’s accuracy. Barometers are best used in combination with other tools, including GPS, Wi-Fi and beacons.

Other sensors

Your smartphone also has an ambient light sensor that adjusts screen brightness to the surrounding light. A fingerprint sensor can enable secure device and website authentication as well as mobile payments. Add to that list the microphone and camera. Samsung’s Galaxy S6 even has an integrated heart rate monitor.

Sensors raise the consciousness of our smartphones. With mobile sensors becoming smaller and more sophisticated, and new types of sensors coming onto the market, what we’re seeing today is only the beginning of the era of self-aware devices. More is waiting around the corner.

Machine Learning And The Future of Mobile Devices

Nils Forsblom

Machine Learning

Machine learning will play a potent role in the future of mobile devices

As we discussed in a previous blog post, your smartphone is replete with more than a dozen sensors that collect all kinds of information on three-dimensional device movement, positioning and the outside environment. But most of the data those sensors collect comes in raw form; it has no practical meaning on its own. And that is where machine learning steps in.

What exactly is machine learning? The field is immense with lots of different categories and subdivisions. But let’s start with a common, layman’s definition: Machine learning is a discipline of artificial intelligence that focuses on the development of algorithms that learn from and make decisions based on data.

Or, as machine learning pioneer Arthur Samuel defined it, machine learning is a “field of study that gives computers the ability to learn without being explicitly programmed.”
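
To make that concrete, here is a toy example, purely illustrative and unrelated to Adtile’s models, of a program that learns to tell a “shake” from a “tilt” by computing class centroids from labeled sensor features instead of being explicitly programmed with thresholds:

```typescript
// Toy nearest-centroid classifier: "learning" the difference between two
// gestures from labeled examples. All numbers here are made up.
type Example = { features: number[]; label: string };

function mean(vectors: number[][]): number[] {
  const m = new Array(vectors[0].length).fill(0);
  for (const v of vectors) v.forEach((x, i) => (m[i] += x / vectors.length));
  return m;
}

// Training: average the feature vectors of each label into a centroid.
function train(data: Example[]): Map<string, number[]> {
  const byLabel = new Map<string, number[][]>();
  for (const ex of data) {
    if (!byLabel.has(ex.label)) byLabel.set(ex.label, []);
    byLabel.get(ex.label)!.push(ex.features);
  }
  return new Map(
    [...byLabel].map(([label, vs]) => [label, mean(vs)] as [string, number[]])
  );
}

// Prediction: pick the label whose centroid is closest to the new sample.
function predict(centroids: Map<string, number[]>, features: number[]): string {
  let best = "";
  let bestDist = Infinity;
  for (const [label, c] of centroids) {
    const d = Math.hypot(...c.map((x, i) => x - features[i]));
    if (d < bestDist) { bestDist = d; best = label; }
  }
  return best;
}

// Features: [mean |acceleration| in m/s^2, peak |rotation rate| in deg/s].
const training: Example[] = [
  { features: [0.4, 20], label: "tilt" },
  { features: [0.5, 25], label: "tilt" },
  { features: [3.2, 300], label: "shake" },
  { features: [2.8, 280], label: "shake" },
];

const model = train(training);
console.log(predict(model, [3.0, 310])); // -> "shake"
```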

You may not be aware of it, but almost everything that happens online is driven by a type of machine learning algorithm. When you do a search, machine learning chooses the results you get. Amazon uses machine learning to recommend products. Netflix uses it to recommend movies. And Facebook and Twitter use it to choose which posts to show you.

Additionally, both Google Now and Siri rely on machine learning to recognize speech input and respond quickly to user commands. And Affectiva’s emotion-recognition software, which reads facial expressions, also uses machine learning.

Machine learning plays a big part in how intelligent our mobile devices are and how they interact with us. We believe machine learning will play an even bigger role in the future of mobile devices — and that is some of what we are working on at Adtile.

As machine learning algorithms become increasingly sophisticated, they will change how our mobile devices interact with us: our devices will recognize gestures, motions and movements with ever higher accuracy and respond to our needs in ways we never imagined possible.

Machine learning algorithms have the potential to give personality to our mobile apps and devices. In future blog posts, we’ll talk more about how machine learning and sensors are working together to improve people’s lives, their health and their mobile experiences.

How Adtile Motion Store differs from HTML5 ad builders

Nils Forsblom

Adtile Motion Store

I get a lot of questions about how Adtile Motion Store differs from HTML5 ad builders, so I thought I’d address those in a blog post.

If you’re not already familiar, an HTML5 ad builder is a tool for creating display or rich media cross-screen ads: the static, animated and video ads you see in mobile apps and while browsing the Web. You assemble the ad by dragging and dropping simple shapes, imported media, and HTML widgets and gizmos onto a blank canvas.

Now let’s address the Motion Store, why it’s a completely different animal, and why sensory Motion Ads are the future of mobile advertising.

Tapping into sensors

HTML5 ad builders are great for building standard rich media ads that run across screens, on desktop and mobile. But if you want to create a sensor-enabled ad specifically built for the mobile environment, look to the Motion Store.

The Motion Store is a do-it-yourself platform that makes it possible for anyone, regardless of their programming ability or design knowledge, to build an elegant, sophisticated Motion Ad.

A Motion Ad behaves more like a game or mini app than an ad. It does so by leveraging the various motion and location sensors within a mobile device to accurately detect a user’s motions.

When the user moves, whether pushing, pulling, rotating, twisting, shaking or tilting the smartphone or tablet, the ad responds in real time, creating a coordinated, seamless interaction that is completely intuitive to the user.
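
As a hypothetical sketch of what one such interaction could look like under the hood (not an actual Motion Store template), a tilt-driven element can be wired to the browser’s deviceorientation event in a few lines:

```typescript
// Tilt-driven sketch: move an ad element as the user tilts the phone
// forward and back. The element id and mapping are illustrative only.
const layer = document.getElementById("ad-layer") as HTMLElement;

window.addEventListener("deviceorientation", (e: DeviceOrientationEvent) => {
  if (e.beta === null) return;
  // beta is the front-to-back tilt in degrees; clamp it to a usable range.
  const tilt = Math.max(-45, Math.min(45, e.beta));
  // Map -45..45 degrees to a -100..100 px vertical offset, in real time.
  layer.style.transform = `translateY(${(tilt / 45) * 100}px)`;
});
```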

Adtile templates ensure high quality UX

With an HTML5 ad builder, you drag and drop objects. The Motion Store, on the other hand, brings an ‘app-store’ style approach to ad building. Only instead of selecting an app, you’re selecting a sensory Motion Ad template.

So far we have over 200 templates in our Motion Store, and we are adding new ones all the time. Each template is a fully coded, fully designed experience, one built on the Adtile Motion Framework that follows our proprietary design language.

You can customize a template with your own creative and storytelling to create a brand experience unlike anything else that’s out there on mobile.

Adtile templates offer several advantages. They save time by eliminating the countless hours of iteration and experimentation typically associated with complex ad building. We provide only proven templates that we have tested live multiple times, and each one enforces strict design constraints to ensure the highest-quality user experience.

The numbers speak for themselves. On average, Adtile Motion Ads generate engagement rates of over 30 percent, participation times of more than 24 seconds, and click-through rates of more than 6 percent.

Motion Ad with tilt up and down detection

Built for a fast-paced programmatic world

HTML5 ad builders are useful for creating simple rich media ads, but if you want to add device-specific experiences, such as animation, you quickly get into custom scripts. And because HTML5 ad builders aren’t supported by a sensory framework or design guides, adding even primitive motion-detection capabilities requires extensive manual coding, with poor results.

Because of this, building a complex experience with an HTML5 ad builder can take weeks of back-and-forth and testing, driving up costs.

In contrast, the Motion Store lets anyone create a sophisticated Motion Ad within minutes. Once you’ve finalized your design, the platform generates a responsive HTML5 build for iOS and Android. You can request standard mobile Web or MRAID tags for your ad server, and analytics are baked right in.
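
For context, an MRAID-wrapped tag typically waits for the container to report ready and the ad to become viewable before starting its experience. The sketch below shows that general pattern; startMotionExperience is a hypothetical placeholder, and this is not the tag the Motion Store actually generates:

```typescript
// General MRAID pattern: defer the motion experience until the ad
// container is ready and viewable. startMotionExperience is hypothetical.
declare const mraid: {
  getState(): string;
  isViewable(): boolean;
  addEventListener(event: string, listener: (...args: any[]) => void): void;
};

function startMotionExperience(): void {
  window.addEventListener("deviceorientation", (e: DeviceOrientationEvent) => {
    // Drive the creative from e.alpha / e.beta / e.gamma here.
  });
}

function onReady(): void {
  if (mraid.isViewable()) {
    startMotionExperience();
  } else {
    mraid.addEventListener("viewableChange", (viewable: boolean) => {
      if (viewable) startMotionExperience();
    });
  }
}

if (typeof mraid !== "undefined" && mraid.getState() !== "loading") {
  onReady();
} else if (typeof mraid !== "undefined") {
  mraid.addEventListener("ready", onReady);
} else {
  // Plain mobile web tag: no MRAID container present.
  startMotionExperience();
}
```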

True native experience

While HTML5 ad builders let you create rich media ads, Motion Ads are truly native.

Native ads blend in with the organic experience of the app or Web page they appear on, but Motion Ads go a step further: they are also built on a native mobile framework.

The Adtile Motion Framework takes advantage of the innate sensor technology found in all smartphones and tablets. It incorporates platform-agnostic native code and application-agnostic design, and it delivers a responsive storytelling experience for both iOS and Android.

One or two of those elements alone would help produce a successful cross-screen ad. But the Adtile framework brings together all four to create a delightful and highly personalized experience for the user.

Solving the entire problem

If you think about it, developing a high-quality Motion Ad from the ground up (without our MotionStack) would require a high level of technical expertise. You’d need in-depth knowledge of advanced physics, math, mobile hardware, sensory pretext, sensor fusion, sensor filtering techniques, machine learning, interaction design, JavaScript, HTML, and CSS.

You’d also have to grasp iOS and the vast Android device and browser landscape. You’d have to hire a unique mix of software engineers, product people, technologists, designers and mathematicians for the undertaking. And even that wouldn’t guarantee a workable solution.

Fortunately, the Motion Store solves the entire problem for you. For the first time ever, building sensor-enabled mobile creative is seamless, simple to execute, and completely self-service.

Keeping it simple

The reason Motion Store works and Motion Ads are so successful is their simplicity. We strip everything down to a minimalistic, straightforward approach that makes sense to both the brand creating the ad and the consumer engaging with it.

We realize you can’t remove everything from the brand experience, and some Motion Ad designs are inherently more complex than others, but we always strive to follow the advice attributed to Albert Einstein: “Make everything as simple as possible, but no simpler.”

If you haven’t already, we encourage you to check out our Motion Store and find out why simple makes so much sense.

Introducing the Adtile Motion Store

Nils Forsblom

Introducing the Adtile Motion Store

Movements, gestures, motions and sensory hardware. That’s the future of programmatic mobile creative, and we want to democratize it. That’s why we created the Adtile Motion Store, a way for virtually anyone to build sophisticated Motion Ads designed specifically for a world on the move.

Captivating Experience to Complement the Hardware

Elegant, sensor-enabled Motion Ads are changing the way people think about mobile advertising. No longer is the consumer a trapped, passive observer. Instead, by twisting, tilting, shaking or otherwise playing with the ad, which behaves more like a tiny app, the consumer becomes part of a creative, device-agnostic experience.

Quantity meets Quality

Motion Store makes it possible for anyone to build these delightful yet sophisticated ads on the fly. Modeled after Apple’s App Store, where you go to buy apps, the Motion Store is where you go to choose from hundreds of pre-coded Motion Ad experiences (we are adding more all the time). We’ve designed these templates to include the perfect pairings of design and motion. In all cases, the result is a seamless, natural and engaging end-user experience.

Typically, building a complex rich media mobile ad from scratch can take weeks of back-and-forth and cost upwards of tens of thousands of dollars. But the Motion Store allows you to create an ad in literally minutes. No coding is required, and the templates eliminate the endless iterations and experimentation so often associated with rich media mobile ad development.

Creating an ad in the Motion Store is a straightforward process. Select the user experience template you want. Now add your own creative and storytelling. Some customization is possible. You can edit images, text, fonts, transitions, spacing, links and even add video and sound files. When you’re done, you can test out the ad on your phone or tablet.

Introducing the Adtile Motion Store

Once you are satisfied with your ad, Motion Store will automatically generate a responsive HTML5 build for iOS and Android that works on any mobile browser or screen size. You can also request standard mobile web or MRAID tags for your ad server.

Analytics are baked right in. Through your account dashboard, you can view details of your campaign along with a variety of real-time performance metrics, such as unique views, actions, engagement time, engagement rates, and much more. At any time during your campaign, you can tweak your ad, making adjustments along the way.

How does the Motion Store compare to an HTML5 ad builder? We get that question all the time. The answer: Two completely different animals. An HTML5 ad builder is a simple editing tool that allows you to create ads via drag-and-drop rich media widgets and features. You have to come up with your own custom user experience and design, and adding any type of sensory element requires extensive customization.

On the other hand, the Motion Store is all about ready-made experiences that are based on Adtile’s design guidelines. What’s more, Motion Store ad templates are built on Adtile’s proprietary Motion Technology, which uses sophisticated algorithms to interpret a mobile device’s sensory data and respond with precise dynamic visual feedback.

Check out the Motion Store for yourself. If you are interested, contact us and we’ll send you instructions on how to get started.

Start developing for Motion Store

Mobile DSP? Mobile ad network? Publisher? Do you want to deeply integrate your service with the Motion Store? Get in touch. We’ll help guide you in the right direction and provide you with any additional resources you may need along the way.