A Brain Sensor That Could Detect Strokes Before They Happen…


A few years ago, a team of engineers at Samsung started working on a project to detect strokes by analysing brain waves.

The idea was considered pretty much impossible at the time, but they have now built a real prototype called EDSAP (Early Detection Sensor & Algorithm Package). This device allows anyone with a smartphone or tablet to monitor the electrical impulses in their brain, giving them the chance to detect an oncoming stroke and see their doctor before it happens.

With 15 million people suffering a stroke every year, and 66% of those strokes resulting in death or permanent physical disability, this device could be really helpful.

How does it work?

The headset is composed of sensors that wirelessly transmit data to a mobile app, where an algorithm calculates the risk of stroke in about 60 seconds. The collected data can also be analysed to inform the user about their stress levels, anxiety and sleep patterns.

How does it differ from other brain sensors?

  • Unlike other brain sensors on the market, the EDSAP is focused on health-related matters rather than on controlling other devices.
  • It is also a lot faster than the equipment typically used in hospitals (around 60 seconds instead of 15 minutes).
  • The highly conductive rubber-like material invented by the team allows the headset to scan a wider range of brain waves, more comprehensively.
  • It is easier and more comfortable to wear, as the saline solution usually required no longer needs to be rubbed into the hair.
  • Last but not least, the rubber-like material allows the creation of multiple smaller devices that could fit into people’s everyday lives. For example, for longer-term use, the sensors could be added to hairpins or eyeglasses.

Based on the analysis of brain waves from stroke patients combined with artificial intelligence, this project, once ready, could also be applied to other neurological health issues such as depression.


Control the Parrot ARDrone with the Leap Motion in Cylon.js


Following my tutorial on controlling the Sphero with the Leap Motion, I thought I would keep converting my Node.js projects to Cylon.js and work on controlling the drone with the Leap Motion.

If you’ve had a look at my last tutorial, you probably noticed that using Cylon.js makes it really easy to program for hardware and connect multiple devices together.

Below is the usual setup of any Cylon project:


var Cylon = require('cylon');

Cylon.robot({
  connections: {
    leapmotion: {adaptor: 'leapmotion'},
    ardrone: {adaptor: 'ardrone', port: '192.168.1.1'}
  },

  devices: {
    leapmotion: {driver: 'leapmotion', connection: 'leapmotion'},
    drone: {driver: 'ardrone', connection: 'ardrone'}
  },
 

As you can see, you simply need to specify which devices you are using; the more interesting bit comes in the rest of the code…

  work: function(my){
    my.leapmotion.on('hand', function(hand){
      my.drone.takeoff();
      after((5).seconds(), function(){
        my.drone.land();
      });
    });
  }
}).start();

This code simply makes the drone take off when the Leap Motion senses a hand over it, and then land after 5 seconds (just in case it decides to go crazy…).

Then, if you want to make it do more interesting things, you will have to play around with what the Leap Motion has to offer: different types of gestures, distance, hands, fingers, etc. (see the small sketch after the list below). The drone actions themselves are pretty straightforward:

  • my.drone.up();
  • my.drone.down();
  • my.drone.forward();
  • my.drone.back();
  • my.drone.left();
  • my.drone.right();
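For instance, here is a minimal sketch of how the hand’s height could drive the drone’s altitude once it is already in the air. It assumes the ‘hand’ event passes through the standard Leap hand object (where palmPosition is [x, y, z] in millimetres above the controller), and the thresholds and speeds are just placeholders to tune:

```
work: function(my){
  my.leapmotion.on('hand', function(hand){
    // Assumption: hand.palmPosition is the standard Leap [x, y, z] array in mm
    var height = hand.palmPosition[1];

    if(height > 300){            // hand held high -> climb gently
      my.drone.up(0.3);
    } else if(height < 150){     // hand held low -> descend gently
      my.drone.down(0.3);
    } else {                     // hand in the middle band -> hover in place
      my.drone.stop();
    }
  });
}
```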

You can also make the drone rotate clockwise or counterclockwise, but the most awesome thing I found is that the cylon-ardrone module makes the ‘flip’ movement really easy to execute. On a ‘keyTap’ for example, your drone could do a backflip!!

The code for that would look like this:

  work: function(my){
    my.leapmotion.on('gesture', function(gesture){
      if(gesture){
        my.drone.takeoff();
        if(gesture.type === 'keyTap'){
          my.drone.backFlip();
          after((6).seconds(), function(){
            my.drone.land();
          });
        }
      } else {
        my.drone.stop();
      }
    });
  }
}).start();

If you want to see the difference with Node.js, you can find my original Github repo here; otherwise, here is the repo with more commands!

If you have any questions, don’t hesitate to ask!

Wireless Brain-Computer Interface


I am fascinated by brain-computer interfaces and all the research around using computers to try to recreate some brain functionalities.

In the United States, a team at Brown University and BlackRock Microsystems has been working on a new interface that works wirelessly, and they have created a device that could fit in the palm of your hand…

This device could help paralyzed people take control of devices with their own thoughts, without needing long wires to connect their brain to signal processors.

You can learn more about this by reading the article by the MIT Technology Review here.

Controlling the Sphero using the Leap Motion in Cylon.js


In my personal time, I love to play around with hardware and robots.

I started in Node.js, but I recently discovered Cylon.js and, after a quick play with it, found it pretty awesome, so I decided to rewrite my projects using this framework.

As a starting point, I decided to rewrite the project to control the Sphero with the Leap Motion.

You can find the original repo here, but here are a few code snippets:

[Screenshots of the original Node.js code snippets]

The way it works is pretty straightforward. The Sphero connects via Bluetooth and the Leap Motion needs to be plugged into your computer. Once the Sphero is detected, the hand is tracked by the Leap Motion and its direction is applied to the Sphero.
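Since the screenshots above don’t reproduce well here, the snippet below gives a rough idea of what that looks like in plain Node.js. It is only a sketch of the approach, not the exact code from the repo, and it assumes the leapjs and sphero npm packages plus a placeholder Bluetooth port:

```
var Leap = require('leapjs');
var sphero = require('sphero');

// Placeholder port: replace it with the one your own Sphero shows up on
var orb = sphero('/dev/rfcomm0');

orb.connect(function(){
  // Track the hand with the Leap Motion and turn its position into a heading
  Leap.loop(function(frame){
    if(frame.hands.length === 0){ return; }

    var palm = frame.hands[0].palmPosition; // [x, y, z] in mm
    // Convert the horizontal palm position into a 0-359 degree heading
    var heading = Math.round((Math.atan2(palm[0], -palm[2]) * 180 / Math.PI + 360) % 360);

    orb.roll(100, heading); // speed 0-255, heading in degrees
  });
});
```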

Feel free to have a closer look at the code on Github.

Now, let’s move on to Cylon.js. The first thing I noticed about this framework is the short amount of code necessary to get to the same result. I managed to do pretty much the exact same thing in 68 lines of code!

I guess what makes it easier is that Cylon already has some modules you can install to program for certain devices, like the ones below:

[Image: some of the device modules available for Cylon.js]

To start using Cylon, you need to require it and specify which devices you are working with.

```
var Cylon = require('cylon');

Cylon.robot({
  connections: {
    leapmotion: {adaptor: 'leapmotion'},
    sphero: {adaptor: 'sphero', port: '/dev/rfcomm0'}
  },

  devices: {
    leapmotion: {driver: 'leapmotion', connection: 'leapmotion'},
    sphero: {driver: 'sphero', connection: 'sphero'}
  },

  work: function(f){
  }
}).start();
 ``` 

At the moment, this code is not really doing anything, but you can see how to specify which devices you are going to use.

You have to specify a port for the Sphero because it connects to your computer via Bluetooth. To find the port for your own Sphero, run ‘ls /dev/tty.Sphero*’ in your console (on a Mac; on Linux it will typically look like the /dev/rfcomm0 used above) and replace the port in this code with the result you get.

The rest of the code goes inside the ‘work’ function as below:

```
work: function(my){
  my.leapmotion.on('frame', function(frame){
    if(frame.valid && frame.gestures.length > 0){
      my.sphero.roll(70, 0, 1);
    }
  });
}
```

The code above makes the Sphero go forward if the Leap Motion detects any kind of gesture.
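If you want to get closer to the original behaviour, the same direction-from-hand idea ports quite naturally to the Cylon.js ‘work’ function. Again, this is only a sketch (not code from the repo), assuming frame.hands[0].palmPosition is the standard Leap [x, y, z] position in millimetres:

```
work: function(my){
  my.leapmotion.on('frame', function(frame){
    if(!frame.valid || frame.hands.length === 0){
      my.sphero.roll(0, 0, 1);   // no hand in view: stop rolling
      return;
    }

    var palm = frame.hands[0].palmPosition; // [x, y, z] in mm
    // Turn the horizontal palm position into a 0-359 degree heading
    var heading = Math.round((Math.atan2(palm[0], -palm[2]) * 180 / Math.PI + 360) % 360);

    my.sphero.roll(70, heading, 1); // same speed/heading/state arguments as before
  });
}
```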

For the full code, have a look at my github repo.

I’ll probably write another tutorial soon once I have a chance to rewrite another project, but in the meantime let me know if you have any questions!

Intel’s Tiny New Computer Could Mean Better Wearable Gadgets


As part of CES 2015, Intel introduced a new button-sized computer called Curie.

This tiny new chip shows that Intel sees wearables as one of the most exciting things in consumer electronics, and that hardware tinkerers are on the lookout for smaller components to build smarter wearable gadgets.

This new device will include Bluetooth Low Energy, motion sensors, and components capable of identifying different types of physical activity quickly and accurately.

The world’s largest chip maker has already partnered with some companies in fashion and accessories, including Luxottica Group, the largest eyeglass maker, behind famous brands such as Ray-Ban and Oakley. Brian Krzanich, Intel’s CEO, declared that Luxottica will use Curie to build real “consumer-friendly” smart glasses.

Available in the second half of the year, this tiny computer will certainly change wearables for the better.

More info here.

The future of interaction is touch-free with ultrasound tech


More and more devices, such as the Leap Motion, the Kinect, and the recent and innovative onecue (see related article), use gesture tracking to allow touch-free interaction.

Elliptic Labs, however, has introduced a new tool called “Multi Layer Interaction” (MLI). It is not a new device but rather a new technology that can be implemented in existing devices like your smartphone.

Tracking the position and distance of your hand, the device can ‘wake up’ when it senses movement, and execute other commands depending on the distance threshold the hand reaches. It can then improve the user’s interaction and experience with their smartphone.
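To make the ‘distance threshold’ idea a bit more concrete, here is a tiny, purely hypothetical sketch of that kind of logic. None of it is Elliptic Labs’ actual SDK; the phone object, handler and threshold values are all made up for illustration:

```
// Purely hypothetical sketch of threshold-based, touch-free interaction
var phone = {
  wakeScreen:        function(){ console.log('waking screen'); },
  showNotifications: function(){ console.log('showing notifications'); },
  openCamera:        function(){ console.log('opening camera'); }
};

// distanceCm would come from the ultrasound tracking layer
function onHandDistance(distanceCm){
  if(distanceCm > 40){
    phone.wakeScreen();          // hand detected near the edge of the ~50cm range
  } else if(distanceCm > 20){
    phone.showNotifications();   // hand getting closer: show more detail
  } else {
    phone.openCamera();          // hand very close: trigger an action
  }
}

onHandDistance(35); // -> showing notifications
```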

On the technical side, it uses a tiny transducer and a small ultrasound speaker to detect the movements around the device, and it can easily be adapted to computers, smartphones, tablets and wearables. It can also use the built-in microphone to echo-locate the user’s hand, so most manufacturers simply need to add the ultrasound speakers and the software to their device.

One of the most exciting things is that it can detect the location of a hand within a 180-degree interaction space, with a maximum range of 50cm, while using a small amount of power.

Watch the video below to learn more:

If you’re interested in playing around with it, you can get a dev kit from the Elliptic Labs website.

‘Snake Monster’: A Spiderlike Robot Responding To Stimuli In Its Environment


I find spiderlike robots fascinating. So fascinating that I am considering building my own. The way they mimic real movements is really impressive and complex to execute.

Each ‘leg’ has to move taking the position of the other legs into account, and the balance has to be calculated so the robot can actually walk.
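To make that a little more concrete, here is a toy sketch of the classic alternating tripod gait used on many six-legged robots. It is only an illustration of the coordination idea, not how Snake Monster actually works: the legs are split into two groups of three, and each group’s stepping cycle is offset by half a period so that one tripod is always on the ground.

```
// Toy illustration of an alternating-tripod gait for a six-legged robot.
// With this numbering, legs 0, 2, 4 form one tripod and legs 1, 3, 5 the other,
// offset by half a cycle so one tripod always supports the body.
function tripodGait(timeSeconds, cyclePeriod){
  var legs = [];
  for(var i = 0; i < 6; i++){
    // Phase offset: 0 for the first tripod, half a period for the second
    var phase = (timeSeconds / cyclePeriod + (i % 2) * 0.5) % 1;

    legs.push({
      leg: i,
      // First half of the cycle: the leg swings forward in the air;
      // second half: the leg is on the ground pushing the body forward.
      stance: phase >= 0.5,
      // Simple sinusoidal joint angle (degrees) for the swing motion
      swingAngle: Math.round(Math.sin(phase * 2 * Math.PI) * 20)
    });
  }
  return legs;
}

// Example: where is each leg 0.25s into a 1-second gait cycle?
console.log(tripodGait(0.25, 1.0));
```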

Beyond this, more and more robots are able to adapt to their environment and to any obstacle that comes in their way.

One example is the ‘Snake Monster’, a robot with six snakelike legs developed by the Biorobotics Lab at Carnegie Mellon University.

This robot, funded by DARPA, is able to respond to different disturbances, like being kicked or having to walk over a bunch of objects placed on the floor.

Watch the video below to see the robot in action:

At the moment, this kind of robot is still experimental or at the prototype stage, but I can’t wait to see them developed to help rescue earthquake victims, for instance.