Augmented reality (AR) — the term does not exactly roll off the tongue. But the concepts behind the technology are beginning to change how we think about ourselves and about the objects and people in the world around us.

I am no expert on AR, but over the past few months I have seen enough examples of how mobile devices change our reality to start wondering whether what I am looking at is really what I think it is. With Google Glass, people will see a data layer that is not visible to the human eye. Through an iOS or Android device, a person can now use apps to provide a different context for playing games, monitoring environments or tracking one's brain activity.

I asked people developing technology for the AR world what they see emerging. Here’s what they said:

Vikas Reddy, Co-Founder of Occipital, wrote in an email interview that AR has not quite lived up to its potential because the capability to track and map the real world has been lacking. But as computer vision algorithms and hardware improve, he argued, the camera will become the most important sensor and input mechanism not just for AR but for all computing:

Think about how much visual information each person processes on a daily basis while going about their lives. Almost none of this information is accessible for computation … yet. Today, your smartphone’s computational reach into its surroundings ends at its touchscreen surface. To your device, the real world isn’t a canvas of interactivity. Soon, however, computer vision will be used to make real-world environments computationally interactive and fun, thereby extending the computational reach of your device into the visual space around you.

At the Blur Conference, Sphero CEO Paul Berberian gave me a demo of a new game called “Sharky the Beaver,” which TechCrunch’s Romain Dillet wrote about earlier this month. Sharky is essentially a robotic ball that serves as a rolling marker. The user controls the ball through a Bluetooth-enabled device, and as the ball rolls across the floor, the user sees Sharky bounce around on screen, eating cupcakes. By combining two streams of data, the physical ball's movement and the virtual overlay, the experience moves between the real world and the virtual one fairly seamlessly.

Sharky is available to developers as an SDK. A likely outcome is a library of avatars that people control via the little flashing robotic balls. For instance, a furniture company may create a network of avatars that people can use to see how tables and chairs look by rolling the ball around the living room.
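To make the "two streams of data" idea concrete, here is a minimal sketch in Python. It is not the real Sphero SDK; the `BallTracker` and `Avatar` classes and all their methods are hypothetical stand-ins for the physical-position stream and the on-screen character that follows it.

```python
# Hypothetical sketch of the two-stream idea behind Sharky:
# one stream is the physical ball's reported floor position,
# the other is the virtual avatar's on-screen state.
# None of these classes belong to the real Sphero SDK.

class BallTracker:
    """Mock of the physical stream: the ball reports its floor position."""
    def __init__(self):
        self.x, self.y = 0.0, 0.0

    def roll(self, dx, dy):
        # Accumulate movement commands sent over (mock) Bluetooth.
        self.x += dx
        self.y += dy
        return (self.x, self.y)


class Avatar:
    """Mock of the virtual stream: a character anchored to the ball."""
    def __init__(self, name):
        self.name = name
        self.screen_pos = (0.0, 0.0)

    def follow(self, ball_pos, scale=10.0):
        # Project the ball's floor coordinates into screen coordinates,
        # keeping the virtual character registered to the physical marker.
        self.screen_pos = (ball_pos[0] * scale, ball_pos[1] * scale)
        return self.screen_pos


ball = BallTracker()
sharky = Avatar("Sharky")

# Each frame: read the physical stream, update the virtual one.
for step in [(0.5, 0.0), (0.5, 0.2)]:
    pos = ball.roll(*step)
    print(sharky.follow(pos))
```

The point of the sketch is only the fusion loop at the bottom: as long as the app keeps projecting the physical stream into the virtual one each frame, the avatar stays registered to the rolling marker.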

I also had the chance to talk at Blur with InteraXon Co-Founder Ariel Garten about the company’s brain-sensing headband, which lets your brainwaves serve as a way to monitor concentration levels or as a means of controlling window shades or the lights in a house. Its first in-house app helps with brain fitness for “better attention skills, improving your memory, reducing anxiety, building a more positive attitude and staying motivated.”

Chris Aimone, InteraXon’s CTO, told me in an email how this form of technology intersects with AR:

There are a number of excellent ways that brainwaves and AR fit together. People predominantly refer to two kinds of AR: glasses-style AR, where one wears a pair of glasses and the world is augmented or mediated on the screen; and iPhone-camera-style AR, where one holds up an iPhone and new layers are added to a scene.

Google Glass-style AR provides the opportunity for collecting brainwave data because you have a continuous-wear device that can continuously record brainwave signals. Adding brainwaves to this environment allows you to show real-time activity about yourself, presented all the time. For example, it could continuously register and stream your level of stress throughout the workday. It also allows the computer system to do a better job of presenting contextually aware overlays: it can provide content and augmentations that take into consideration not just information informed by place or visual input, but also the context of the user. Many of these systems are “context aware,” and adding the context and state of the user informs what kind of information is presented and in what way it will be presented. For example, are you sleepy and therefore want information about hotels in the area? Are you cognitively taxed, so only pertinent info should be presented?

Brainwaves in an AR system also allow for real-time neurofeedback. This would allow you to know your brain state and have the opportunity to optimize it — being able to choose and be guided into the desired state as you go about your day.

But what is the future of augmented reality? Amber Case, cyborg anthropologist and Co-Founder of Geoloqi, said augmented reality will become interesting when the barriers to creating custom objects, animations, apps and experiences are drastically lowered. As with Flash or the App Store, AR becomes interesting when these experiences become very personal or are shared between friends.

She added: