We live in exciting times. You can, today, right now, actually buy in real life the kind of sci-fi gear you only expect to find in a video game. Controlling things — anything but your own body, really — with just your thoughts is a fantasy and sci-fi staple, whether it be Darth Vader force-choking some mouthy Imperial admiral or Professor Xavier commanding the X-Men from the comfort of his wheelchair, finger pressed to temple in grim concentration. But that fantasy is now reality, and you can control things (well, a thing) with just the power of your mind. Think it and your computer does it, no more pesky mice or keyboards or even voice commands.

Or so the people at Emotiv would have us believe. Being the kind of guy who absolutely cannot wait for the day when, as Jonathan Coulton sings, “the things that make me weak and strange get engineered away,” I pre-ordered Emotiv’s thought-controlled EPOC mind-computer interface device as soon as I thought I had $299 to spare. As it turned out, being the first on your block with the new toy comes with a real risk: you might end up buying a frustrating experimental prototype for the cost of five new video games.

There’s no denying the unique thrill of watching something move on my computer screen just because I thought about it. I giggled the first few times it happened. My friends who were over when I un-boxed and hooked up my EPOC on day one kept saying stuff like, “Wow, that’s so freaking weird. And cool.” It really does feel a little bit like magic, although less so because you’re wearing a web of 16 saline-soaked sensors clamped down onto your skull by a plastic headpiece that connects wirelessly to your PC via a USB dongle. So really you feel like a cyborg, which is just as good as magic in my book.

The EPOC is a delicate-feeling device. Before slipping it onto your noggin, you’ve got to individually dampen the 16 sensor pads in their case (but not too wet, the instructions warn). Then you have to slot them into the 16 holes in the headpiece and try to slide that into place without any of them falling out. I can now do this pretty well, but the first half-dozen or so attempts regularly sent five or six of the sensor pads cascading onto my shoulders and rolling away around the room. When you’re done with your mind controlling, you have to carefully remove the EPOC to avoid a similar mess and then slot the 16 sensors back into their storage case. All told, that’s five to 10 minutes of setup and cleanup every time you use it, including time spent adjusting the fit to make sure all the sensors are where they’re supposed to be and reading your mind five-by-five. It is definitely not a pick-up-and-play controller.

Then you have to train it. The EPOC has three ways of sensing your intent. It can monitor facial expressions (not with a camera, but with the brain-function sensors), allowing you to map winks and smiles to specific keyboard commands. It also has a gyroscope in the headpiece so you can move the mouse cursor by moving your head. But the star of the show is the brain sensing, which you train with the help of a 3D orange box floating in virtual space on your monitor. The EPOC software allows this cube to move up, down, forward, back, left, and right, as well as spin in different directions. You have to train the device on each of these motions, choosing a thought (Up! Up! Up!) and visualizing the cube moving over and over again while the EPOC studies your brain activity. The instructions suggest you might want to make a hand motion as well, but it didn’t seem to make much difference in my training regimen. You can also have the cube animate the action you’re trying to train, which helps you visualize what it is you’re thinking so hard about.
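
For a sense of what that keyboard mapping amounts to in practice, here is a minimal Python sketch that turns detected headset events into keystrokes. The event names and the read_next_event placeholder are invented stand-ins, not Emotiv’s actual API; the keystroke simulation uses the pynput library.

```python
# Hypothetical sketch: turning detected headset events into keystrokes.
# read_next_event() is a made-up placeholder for the stream of detections
# (e.g. a trained mental command or a wink), not Emotiv's real API.
from pynput.keyboard import Controller, Key

keyboard = Controller()

# Example mapping: mental commands and expressions -> keys a game understands.
EVENT_TO_KEY = {
    "mental_push": Key.up,    # trained "push the cube away" thought
    "mental_pull": Key.down,  # trained "pull the cube closer" thought
    "wink_left": "a",
    "smile": Key.space,
}

def dispatch(event_name: str) -> None:
    """Press and release the key mapped to a detected event, if any."""
    key = EVENT_TO_KEY.get(event_name)
    if key is not None:
        keyboard.press(key)
        keyboard.release(key)

# for event in read_next_event():  # placeholder event source
#     dispatch(event)
```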

When that cube slides to the right or recedes into the background with nothing more than a flashing thought, it’s amazing. When you stare at your computer and try and think the same thought over and over again and nothing happens, it’s as frustrating as can be. We develop skills in part by learning from our mistakes. The core problem with training and using the EPOC is that it’s nearly impossible to know what you’re doing wrong when all you’re doing is thinking. I mean, I think I’m thinking what I’m supposed to think, but nothing happens. I’ll spam out “Right” over and over again with my mind and the cube just sits there.


Then I glance to the right to select a different movement to train and the cube flies in that direction. OK, maybe I shouldn’t think so hard. I clear my mind and try to just think “Right” for a second, sort of in passing. That doesn’t work. Then it does. But then it doesn’t again. My friends who also tried it (the software smartly lets you set up multiple user profiles) all had similar results. Clearly we weren’t doing it right, but there was no way of knowing what we were doing wrong.

Emotiv offers software called The Cortex Arcade for free on its website. At launch it included three simple games meant to be played with your EPOC. First there’s Emotipong, which is just regular old Pong, but with your mind. Move the white bar up and down to deflect the white square back at the other white bar. There’s even a slow mode for us newbie telepaths. The Arcade has a display for what it thinks you’re thinking, and while I thought long and hard about that bar moving down to intercept that square, it thought I thought “Left,” which isn’t a direction that paddle moves. Playing a slow-motion game of Pong that’s harder and more frustrating than Ninja Gaiden on its toughest setting is not a happy experience. Especially when, for the briefest of flashes, you see that an errant thought can send the white line moving up and down at a good speed.

Cerebral Constructor is a fancy name for mind-controlled Tetris, and I had no better luck controlling those falling geometric chunks with any reliability. Finally there’s a Jedi Mind Trainer game, which I can’t imagine Lucasfilm approved in any way. Here Yoda wants you to levitate a cutout of an X-Wing as long as you can, which for me was never more than a few short stuttering hops.


The EPOC has now proven something I’ve long known in my heart but had no proof of — I’m no Jedi. Nor am I a Sith Lord or a Mutant Telepath. I’m a guy who paid $300 for a toy that doesn’t work very well.

You can map your EPOC thought-commands to keyboard inputs, so theoretically, coupled with the head-movement cursor control replacing your mouse, this could be a controller you’d play other PC games with. But I cannot imagine how that would work in even the simplest game, given that Pong is arguably the simplest game out there and I could barely manage that. Likewise, I don’t see anyone using it to send instant messages, compose e-mails or browse through their photo galleries (Emotiv does offer a gallery interface application). Since I couldn’t get it to work with the applications designed to use it, I did not try to control a game like Torchlight or Plants vs. Zombies with my mind.

The EPOC is not a mass market device for people looking for a turnkey telekinesis solution. It’s an expensive toy for people to experiment with, or a cheap device for scientists to do research with. It’s fun to show off to your friends but probably not something anyone will want to play with for very long. The signs were there from the beginning that the EPOC might not live up to its hype, but with full page ads in places like the pro-transhumanist magazine H+ and promises of sci-fi technology in the here and now, Emotiv clearly sees geeky gadget enthusiasts like me as its target audience. With all that promise, all those allusions to nerd icons like the Jedi, it’s hard not to feel a little taken advantage of. Seldom has the early adopter tax (one I’ve paid often) felt more onerous. But because I’ve already spent all that money, I’m not going to give up on Emotiv or my brain. I’ll stick with it some more, and keep training my brain, probably with the same tenacity I used my Wii Fit.


Rick Dakan is a novelist and former game designer whose books include the Geek Mafia trilogy from PM Press. For more of his prose and musings, go to rickdakan.com.

Could this be the first smart car for quadriplegics?


At a time when the driverless, automated car is becoming a reality, we have a great opportunity to make things that were previously improbable, if not impossible, suddenly possible.

My interest was piqued when I came across design plans for a car that could be controlled by a driver with quadriplegia. At first the idea seemed mere fantasy, but as I spoke to transport designer Rajshekhar Dass and learnt more about the control of technical devices through brain waves, facial gestures and infinitesimal movements, the idea seemed far more conceivable.

Rajshekhar Dass is a car designer from Mumbai, India, currently based in Turkey. He has an impressive design history, which includes leading the winning team in the 2016 Michelin Challenge Design for the Google Community Vehicle and winning the 2014-15 A’Design Award for Vehicle, Mobility and Transport Design for a micro taxi, among other awards. He’s interned for Volkswagen in Germany and worked in a range of car dealerships, so he has seen car technology from a range of angles.

He detailed his rationale for a car designed specifically for people with quadriplegia, a cohort who until now have featured in the automated cars of the future only as passengers. Such a design is the first of its kind:

“Audric Design basically was designed with a particular person in mind, Sam Schmidt, former Indy Racing League driver. He was made paraplegic due to a racing car accident in 2000. He wanted to get back on the track but came back as an owner rather than driver. Everyone loves driving, so my interest was how can we use today’s technology to solve these problems for an audience that is generally overlooked? How can today’s tech be used to enable the same driving experience that he enjoyed previously?”

It’s always interesting to learn how a designer approaches the design process. Dass revealed:

“The first point of research is the capabilities of the human body when functionality has been impaired. In paraplegia the brain signals do not reach the human organs below the neck, so the brain signals are basically lost. I thought, what if you could use today’s technology to enable the signals to be transferred to the computers on board a car instead, so you can give a rebirth to the whole driving experience?”

Dass explained that emerging technology, such as control through brain-wave signals, gesture and facial recognition, and sensors, along with augmented reality, made his design more than a well-intentioned concept.

There’s precedent here. For example, in 2010 Emotiv released the EPOC, a commercial wearable device designed to let users play computer games on a screen by functioning as a brain-computer interface (admittedly, the design was not without its challenges, as the review above attests).

Then in 2014, Ian Burkhart became the first paralyzed person to use neural bypass technology to pick up and hold a spoon using his own brainpower, with his abilities increasing over time.


How could it work?

Dass stresses that the car would be completely autonomous in the first instance. But over time, the driver’s gestures and motions would be recorded by the car; for example, as the car is taking a right turn, the driver might be tilting his head. In this way, the car is learning from the driver instead of the driver learning from the car.

Dass explains further:

“There would be a series of levels, detected by the AI in the car, that slowly give control to the driver, e.g. starting with audio and air-conditioning controls first. The next level could be controlling a little bit of the motion, and slowly you’d graduate levels as you would in a game, with the help of gesture recognition, eye movements and brain mapping. Once the car is confident in the skill of the driver, the complete controls would be available to the driver, but at the same time the car would still have control over all the systems, as a built-in safety feature.”
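
To make that ladder idea concrete, here is a minimal sketch in Python of a confidence-gated control hierarchy. The level names, thresholds and the driver-confidence score are all invented for illustration; they do not describe any actual Audric implementation.

```python
# Illustrative sketch of a graduated control ladder (all names and thresholds invented).
CONTROL_LEVELS = [
    # (minimum confidence the AI has in the driver, controls unlocked at that level)
    (0.0, {"audio", "air_conditioning"}),
    (0.5, {"audio", "air_conditioning", "lane_position"}),
    (0.8, {"audio", "air_conditioning", "lane_position", "steering", "speed"}),
]

def unlocked_controls(driver_confidence: float) -> set[str]:
    """Return the controls handed to the driver at a given confidence score.

    The car's AI is assumed to keep an override on every system regardless
    of level, as the built-in safety feature Dass mentions.
    """
    unlocked = set()
    for threshold, controls in CONTROL_LEVELS:
        if driver_confidence >= threshold:
            unlocked = controls
    return unlocked

print(unlocked_controls(0.6))  # e.g. {'audio', 'air_conditioning', 'lane_position'}
```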

I’m aware that many people with paralysis experience involuntary movement such as spasms and jerking. I wondered if a car could be smart enough to distinguish these movements from voluntary actions.

Dass agreed:

“It’s a good point. The car is completely autonomous and the AI is constantly monitoring the driver’s motions to learn his/her actions, and since the AI is specially developed for paralyzed drivers it can recognize such involuntary movements. Since the AI is also scanning the brain and can see that the motion has no connection with the brain signals, it can be tagged as an involuntary action and not require a reaction.”
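
One crude way to picture that filtering, assuming the system produces both a motion reading and some measure of matching intent from the brain signals, is to treat a motion with no corresponding intent as involuntary. The function, scales and threshold below are hypothetical.

```python
# Invented illustration: flag motions that have no matching brain-signal intent.
def classify_motion(motion_strength: float, intent_strength: float,
                    intent_threshold: float = 0.3) -> str:
    """Label a detected motion as voluntary or involuntary.

    motion_strength: how pronounced the head/body movement was (0..1).
    intent_strength: how strongly a matching command showed up in the
        brain-signal reading at the same moment (0..1). Both scales are
        assumptions for this sketch.
    """
    if motion_strength > 0 and intent_strength < intent_threshold:
        return "involuntary"  # e.g. a spasm: movement with no detected intent
    return "voluntary"

print(classify_motion(motion_strength=0.9, intent_strength=0.05))  # involuntary
print(classify_motion(motion_strength=0.4, intent_strength=0.7))   # voluntary
```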

Beyond its driving capabilities, the car would be designed around the needs of the driver, with a rear entry door suited to a wheelchair that would itself be specially designed to become the driving seat in the vehicle. Dass explained that the project has only recently been published online, and he is keen to explore his ideas further with people with disabilities and associated organizations to enable further development.

In an era when ideas as seemingly bizarre as Google’s patent for “sticky” technology to protect pedestrians struck by its self-driving cars are taken seriously, a mind-powered car doesn’t seem all that strange.

Dass is currently working on a range of diverse projects, and judging by the ingenuity inherent in his design portfolio, this is simply an example of things to come.

