The future of interaction is touch-free with ultrasound tech


More and more devices use gesture tracking to allow touch-free interaction, such as the Leap Motion, the Kinect, and the recent and innovative onecue (see related article).

However, Elliptic Labs have introduced a new tool called “Multi Layer Interaction” (MLI). It is not a new device but rather a new technology that can be built into existing devices like your smartphone.

By tracking the position and distance of your hand, a device can ‘wake up’ when it senses movement and execute different commands depending on the distance threshold the hand crosses, improving the user’s interaction and experience with their smartphone.
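
To make that idea concrete, here is a minimal sketch of how distance thresholds could map to commands. Everything in it is made up for illustration: the `onHandDistance` callback, the threshold values and the actions are all hypothetical, not Elliptic Labs’ actual API.

```javascript
// Hypothetical actions a device might run at different hand distances.
function wakeScreen() { console.log('Waking up the screen'); }
function showNotifications() { console.log('Showing notifications'); }
function openApp() { console.log('Opening the app'); }

// Made-up thresholds, ordered from farthest to closest.
var THRESHOLDS = [
  { maxDistance: 50, action: wakeScreen },        // hand enters the ~50cm range
  { maxDistance: 25, action: showNotifications }, // hand gets closer
  { maxDistance: 10, action: openApp }            // hand is almost touching
];

// Imagined callback fired by the tracking layer with the hand's
// distance in centimetres; runs the closest threshold crossed.
function onHandDistance(distanceInCm) {
  var crossed = THRESHOLDS.filter(function (t) {
    return distanceInCm <= t.maxDistance;
  });
  if (crossed.length > 0) {
    crossed[crossed.length - 1].action();
  }
}

onHandDistance(40); // 'Waking up the screen'
onHandDistance(8);  // 'Opening the app'
```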

On the technical side, it uses a tiny transducer and a small ultrasound speaker to detect the movements around the device, and it can easily be adapted to computers, smartphones, tablets and wearables. It can also use the built-in microphone to echo-locate the user’s hand, so most manufacturers only need to add the ultrasound speaker and the software to their device.
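
The echo-location itself boils down to timing: the speaker emits an ultrasound pulse, the microphone picks up the echo bouncing off your hand, and the distance is half the round trip travelled at the speed of sound. A quick back-of-the-envelope sketch (the round-trip time is assumed to come from the audio pipeline):

```javascript
// Speed of sound in air at room temperature, in centimetres per second.
var SPEED_OF_SOUND_CM_PER_S = 34300;

// Given the time between emitting a pulse and hearing its echo (in seconds),
// the hand's distance is half the round trip.
function echoDistanceInCm(roundTripSeconds) {
  return (SPEED_OF_SOUND_CM_PER_S * roundTripSeconds) / 2;
}

// An echo arriving ~2.9ms after the pulse puts the hand at about 50cm.
console.log(echoDistanceInCm(0.0029)); // ≈ 49.7
```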

One of the most exciting things is that it can detect the location of a hand within a 180-degree interaction space and a maximum range of 50cm, all while using a small amount of power.

Watch the video below to learn more:

If you’re interested in playing around with it, you can get a dev kit on the Elliptic Labs website.


A tradition of failing live demos


Last night, at the Women Who Code meetup in Sydney, I presented one of the projects I have been working on in my personal time, involving a Sphero robotic ball, a Parrot AR Drone and the Myo armband.

As this wasn’t my first talk, I had learnt to double-check that everything was working before even going to the meetup. I was pretty confident this time because I knew all my code was still working. After talking quickly about how I managed to connect all the devices together and control the Sphero and the drone using the Myo armband, I moved on to the live demo, and… of course… it failed.

I wanted to start by showing the Sphero, but for some reason it did not want to connect to my computer. The connection sometimes takes a few seconds because it goes over Bluetooth, but this time it just didn’t seem to be working fast enough. As I only had five minutes for my presentation, I gave up and moved on to showing the drone. That one worked really well, and I was able to show how I mapped specific gestures to drone commands. I had to disable the actual directions because we were indoors and I did not want any accidents to happen, but the drone still took off when I executed the “fingers spread” gesture and landed when I made a “fist”.
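
For the curious, the mapping itself is just a matter of listening for Myo pose events and calling the matching drone command. Below is a rough sketch of that part using the `ar-drone` and `myo` npm packages; the app identifier is a placeholder and the pose event names are from memory, so double-check them against the library versions you install before taking this as gospel:

```javascript
// Rough sketch of the gesture-to-command mapping, not my exact demo code.
var arDrone = require('ar-drone');
var Myo = require('myo');

var drone = arDrone.createClient();

// Connect to the Myo armband (requires Myo Connect running on the machine).
// 'com.example.drone-demo' is a placeholder app identifier.
Myo.connect('com.example.drone-demo');

Myo.on('connected', function () {
  console.log('Myo connected, ready for gestures!');
});

// 'fingers spread' makes the drone take off...
Myo.on('fingers_spread', function () {
  drone.takeoff();
});

// ...and 'fist' makes it land.
Myo.on('fist', function () {
  drone.land();
});
```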

I still need to work on the code to make sure the directions work properly, and maybe make it do some fancier stuff, but for now I’m pretty happy it’s working.

I will demo this project again in January at the SydJS meetup and hopefully the demo gods will be fully with me this time.

Hello World

Hello <insert name here>!

I’ve been thinking about creating a blog for quite a while now, but I kinda always gave up because I thought I wouldn’t have enough to write about, or that I just wouldn’t have the time or the energy to update it often.

However, I’d like this one to be different.

Since I started a web development immersive course, I have learnt a lot, both in class and by myself, and I’d love to share it with whoever likes it (this should be you?). I used a lot of online resources to help me through this process, so giving back the knowledge I acquired would be the best thing I could do.

One last thing… Be aware this could be another failure.