Friday, March 3, 2017

Engineers Bring Tech of the Future to Life

One group of Colorado State University engineering students is creating a brain-controlled wheelchair.
The endeavor is part of a senior design project, the engineering department’s capstone.
“Our vision [for the chair] is for people who don’t have arm motion or are paralyzed from the neck down,” Joel Kraft, electrical engineering senior, said. “It would be an assistive technology to gain mobility, which isn’t very common for people with those disabilities right now.”
Three major components make the chair’s design possible: a brain-measuring headset, a small but powerful computer, and a motor controller.
While the team meets each week to coordinate their progress, the three each work on one element throughout the week.
Computer engineering student and veteran Jason Gardner works on the headset software. The Emotiv headset translates practical commands, like “move forward,” into a technologically usable signal.
“You have to train it first so it can recognize what those brain commands mean with your specific brain wiring,” Kraft said. Users practice pushing and pulling a digital picture. The picture zooms in and out to reflect how accurately the headset is able to measure the user’s command.
That signal is then sent to a single-board computer called a Raspberry Pi. Kraft is integrating this software with the headset and motor system.
The team originally planned to use the brainwear headset and computer in place of an electric wheelchair’s joystick. Since they weren’t able to reverse engineer how the joystick worked, they had to start from scratch.
Electrical engineering senior Mengjia (Megan) Yi ensures the team’s motor controller effectively receives that signal. A motor controller amplifies a small signal, like from the computer, so the motors can carry out a large task, like moving the chair.
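To make that signal chain concrete, here is a minimal, hedged sketch of how the Raspberry Pi side of such a system might look: a loop that reads the headset’s trained command and turns it into two PWM signals for the motor controller. The read_mental_command() stub, the pin numbers, and the speed table are illustrative assumptions, not the team’s actual code.

# Minimal sketch of the headset -> Raspberry Pi -> motor controller chain.
# Pin numbers, speed values and read_mental_command() are illustrative
# assumptions, not the CSU team's implementation.

import time
import RPi.GPIO as GPIO  # standard GPIO library on the Raspberry Pi

LEFT_PIN, RIGHT_PIN = 18, 19   # hypothetical wiring to the motor controller inputs

def read_mental_command():
    """Stand-in for the Emotiv SDK call that returns the user's trained
    command; here it just reads a keyword from the terminal for testing."""
    return input("command (forward/left/right/stop): ").strip() or "stop"

# Map each trained mental command to (left, right) PWM duty cycles.
SPEEDS = {"forward": (40, 40), "left": (10, 40), "right": (40, 10), "stop": (0, 0)}

GPIO.setmode(GPIO.BCM)
GPIO.setup([LEFT_PIN, RIGHT_PIN], GPIO.OUT)
left = GPIO.PWM(LEFT_PIN, 1000)    # 1 kHz PWM; the motor controller amplifies
right = GPIO.PWM(RIGHT_PIN, 1000)  # this small signal to drive the chair's motors
left.start(0)
right.start(0)

try:
    while True:
        l_duty, r_duty = SPEEDS.get(read_mental_command(), (0, 0))
        left.ChangeDutyCycle(l_duty)
        right.ChangeDutyCycle(r_duty)
        time.sleep(0.1)            # re-evaluate the command ten times per second
finally:
    left.stop(); right.stop()
    GPIO.cleanup()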
The concept for the brain-controlled wheelchair was conceived by the team’s advisor, Sudeep Pasricha.
All project ideas are originally created by faculty members. Engineering seniors then apply for projects and are matched based on skillset, team compatibility, and preference.
Each team provides advisors like Pasricha with weekly reports and early copies of all presentations and documents.
Pasricha wears many hats at CSU: he is a Monfort-endowed professor, Rockwell-Anderson-endowed chair of computer engineering, director of the embedded-systems and high-performance computing lab, and a professor in both the department of electrical and computer engineering and the department of computer science.
While his busy schedule contributes to a laissez-faire advising approach, he is always available should the team get stuck.
“The students have to figure out how to make it work,” Pasricha said. “I help them along the way, but I want [them] to gain from this whole process.”
Another skill the seniors practice is fundraising.
While the ECE department contributes $600 of funding to each project, many groups need to pursue additional funding.
“One of our main problems was deciding if we were going to actually use a wheelchair,” Kraft said. “It’s really expensive to get a working one, but we were able to find a cheaper one on Craigslist.”
Both Kraft’s scholarship sponsor, the Anschutz Family Foundation, and Keysight, a local measurement company, donated to the project.
The team was also able to utilize rollover funding from the previous year, as well as technology like the Emotiv headset. Many design projects continue year to year, although the main goal typically evolves.
Last year’s project was a brain-controlled smart home. The assistive technology was intended to help those with disabilities carry out everyday tasks such as unlocking doors, turning on lights, or changing the television.
Graduate students and advisors like Pasricha ensure the knowledge and work each team generates doesn’t get lost in the annual transition.
“This year I wanted the team to try to do something a little trickier, to control a wheelchair,” Pasricha said. “[It] has a lot of parts. To get all those parts working together safely is not that straightforward.”
The team agrees that designing with safety in mind is more difficult than they expected.
“There’s two sides to safety: there’s system safety and user safety,” Kraft said.
System safety means managing the electrical currents to prevent anything from exploding.
“We’ve fried many things so far, a lot of things really,” Kraft said.
User safety refers to interactive measures like an emergency stop switch, which would disengage the motor should the chair respond inappropriately.
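A hedged sketch of that user-safety interlock, under the same illustrative assumptions as the pipeline sketch above: a physical kill switch wired to a GPIO input that is checked before any drive command is applied.

import RPi.GPIO as GPIO

ESTOP_PIN = 21                                            # hypothetical input pin for the kill switch
GPIO.setmode(GPIO.BCM)
GPIO.setup(ESTOP_PIN, GPIO.IN, pull_up_down=GPIO.PUD_UP)  # pressing the switch pulls the pin low

def motors_allowed() -> bool:
    """Return False whenever the emergency stop is pressed, so the control
    loop can force both PWM duty cycles to zero regardless of the headset."""
    return GPIO.input(ESTOP_PIN) == GPIO.HIGH

In the control loop sketched earlier, the duty cycles would be forced to zero whenever motors_allowed() returns False.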
Kraft thinks time constraints will prevent the team from outfitting the chair with motion sensors.
The teams are preparing to present their projects at the Engineering Days symposium in April in the LSC ballroom. Many potential employers attend, so the projects offer tangible examples of the soon-to-be graduates’ work.
Students put in between 150 and 250 hours on fundraising, design, and implementation, and develop a host of documentation on project websites.
“The senior design project is meant to give experience to the students,” Pasricha said. “A lot of times in their coursework they are just working by themselves, but this whole program prepares them for what it will be like in the real world when they have to work on a team.”
Students often feel a similar sense of fulfillment as they near the end of their projects.
Kraft said, “I couldn’t have been blessed with a better project. My dream job would be working with embedded systems to develop assistive technologies and brain-interfacing that could potentially help people in the future.”

Thursday, March 2, 2017

Steering A Turtle With Your Thoughts



Researchers have developed a technology that can remotely control an animal’s movement with human thought. 

Asian Scientist Newsroom | March 2, 2017 | Technology

Researchers at the Korea Advanced Institute of Science and Technology (KAIST) have developed a brain-computer interface (BCI) that can control a turtle using human thought. Their findings have been published in the Journal of Bionic Engineering. 

Unlike previous research—most notably in insects—that has tried to control animal movement with invasive methods, Professors Lee Phill-Seung and Jo Sungho of KAIST propose a conceptual system that can guide an animal’s moving path by controlling its instinctive escape behavior.

They chose a turtle because of its cognitive abilities as well as its ability to distinguish different wavelengths of light. Specifically, turtles can recognize a white light source as an open space and so move toward it. They also show specific avoidance behavior to things that might obstruct their view. 

Turtles also move toward and away from obstacles in their environment in a predictable manner. The entire human-turtle setup is as follows: a head-mounted display (HMD) is combined with a BCI to immerse the human user in the turtle’s environment. The human operator wears the BCI-HMD system, while the turtle carries a ‘cyborg system’—consisting of a camera, Wi-Fi transceiver, computer control module, and battery—all mounted on its upper shell. Also included on the turtle’s shell is a black semi-cylinder with a slit, which forms the ‘stimulation device.’ This can be turned ±36 degrees via the BCI.

The human operator receives images from the camera mounted on the turtle. These real-time video images allow the operator to decide where the turtle should move. The human provides thought commands that are recognized by the wearable BCI system as electroencephalography signals. The BCI can distinguish between three mental states: left, right, and idle. The left and right commands activate the turtle’s stimulation device via Wi-Fi, turning it so that it obstructs the turtle’s view. This invokes the turtle’s natural instinct to move toward light and changes its direction.
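To make the closed loop concrete, here is a minimal, illustrative sketch (not KAIST’s actual code) of the cycle described above: classify the operator’s EEG into left, right, or idle, then rotate the shell-mounted stimulation slit over Wi-Fi so the turtle turns away from the obstructed side. The classify_eeg() stub, the module’s address, and its HTTP interface are assumptions made for illustration.

import time
import requests

TURTLE_MODULE = "http://192.168.0.42:8080/servo"   # hypothetical Wi-Fi address of the cyborg module

ANGLES = {"left": -36, "right": 36, "idle": 0}     # degrees of slit rotation

def classify_eeg() -> str:
    """Stand-in for the BCI classifier; returns 'left', 'right' or 'idle'
    from the latest EEG window."""
    return "idle"

while True:
    angle = ANGLES[classify_eeg()]
    # Turning the semi-cylinder blocks the turtle's view on one side, so it
    # turns toward the remaining open space (its light-seeking instinct).
    requests.post(TURTLE_MODULE, json={"angle": angle}, timeout=1)
    time.sleep(0.2)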

Finally, the human acquires updated visual feedback from the camera mounted on the shell and in this way continues to remotely navigate the turtle’s trajectory. The researchers demonstrated the animal-guiding BCI in a variety of environments, with turtles moving indoors and outdoors on many different surfaces, like gravel and grass, and tackling a range of obstacles, such as shallow water and trees. This technology could be developed to integrate positioning systems and improved augmented and virtual reality techniques, enabling various applications, including devices for military reconnaissance and surveillance. 

The article can be found at: Kim et al. (2016) Remote Navigation of Turtle by Controlling Instinct Behavior via Human Brain-computer Interface

Source: Korea Advanced Institute of Science and Technology. Disclaimer: This article does not necessarily reflect the views of AsianScientist or its staff. Read more from Asian Scientist Magazine at: https://www.asianscientist.com/2017/03/tech/turtle-human-brain-computer-interface/

Wednesday, March 1, 2017

Our Voices are Stronger Together -- ALS Advocacy Conference in DC!


ALS National Advocacy Conference Registration Now Open

There is a need to continue to educate Members of Congress about ALS and its true impact on people living with ALS and their loved ones. This is where you and your voice come in. Advocates – people living with ALS, their families, friends, doctors and researchers – successfully sharing their stories with members of Congress will result in more legislative victories. Your personal story, delivered first hand, is one of the most powerful tools we have.
That is why The ALS Association invites you to join the entire ALS community as we unite in Washington, D.C. for the 2017 National ALS Advocacy Conference. This is our opportunity to share your ALS story and let Members of Congress know the true nature of the disease and why more must be done now.
The public policy priorities that The Association and the ALS community will be focused on this year include asking Members of Congress to 1) cosponsor the ALS Disability Insurance Access Act (S.379/HR.1171), 2) cosponsor legislation, soon to be introduced, to protect access to complex rehabilitation technology, and 3) appropriate $10 million each for the National ALS Registry at the Centers for Disease Control and Prevention (CDC) and the ALS Research Program at the Department of Defense (DOD).
This year’s conference will be held Sunday, May 14th – Tuesday, May 16th at the J.W. Marriott, in Washington, D.C. After a day and a half of meetings and training sessions, ALS Advocates from across the country will take to Capitol Hill for meetings with their legislators on Tuesday.
To attend the 2017 National ALS Advocacy Conference, please register online at www.ALSA.org/advocacy/advocacy-day. The website also provides information such as hotel details (the J.W. Marriott), travel information, a conference outline, and other important information for participants.
Conference registration fees are waived for people with ALS and for one caregiver traveling with them to the conference.
For other participants, the 2017 conference has a $175 non-refundable registration fee for attendees who are affiliated with The ALS Association, an ALS Association Chapter or other affiliated organization. This fee covers a small portion of conference costs, including meals, transportation to Capitol Hill and briefing materials. Registration fees for children are $25. The fee for non-affiliate attendees is $350.
For the J.W. Marriott hotel, the single/double occupancy rate is $299 plus tax per night; triple occupancy is $319 plus tax; and quadruple occupancy is $339 plus tax, with a maximum of four guests per room. Once you register for the conference, you will be provided with a direct link to the J.W. Marriott’s reservations website.
In order to request an ADA accessible hotel room, you must contact Michael Coscia at adaroom@alsa-national.org. Your e-mail should include your hotel confirmation number. For all additional questions about hotel reservations or transportation, please contact Michael Coscia.
General questions about the 2017 ALS National Advocacy Conference can be sent to advocacy@alsa-national.org.
The voices on the Hill during the Fly In were heard well, but were just a start. Let us join forces to make our voices louder by participating in the 2017 National ALS Advocacy Conference. We look forward to seeing you there and working together to champion these important priorities for the ALS community!

I Played The First-ever Eye-controlled Instrument At The Sydney Opera House


As a result of a complicated birth I have severe cerebral palsy. My condition affects my fine motor skills and causes my body to endure involuntary muscle spasms. Despite my physical disability I have always been an outgoing and creative person. Throughout my life I have been known to overcome barriers and push myself to achieve the next step at every turn. This is a tough gig sometimes, however when you don't fit into the so-called box, you strive to become a game changer. My past is full of critics who grossly underestimated my capabilities, but I just convinced myself to keep pushing forward and change their perception and help spread awareness.
I can remember how difficult it was progressing through school not knowing what the future held due to having a severe physical disability. Today while illustrating what is possible to younger generations I hope to build their confidence and allow them to challenge their own barriers.
Through the use of technology, I am able to be fully independent in my work environment, from using a Macbook for my graphic design and website development, to securing my SLR digital camera to my wheelchair. For me, being independent in my business life means so much.
I have always been surrounded by music. I found that immersing myself in beats and rhythms decreased my physical pain and lessened the severity of my spasms. One of the reasons I took up music photography was to satisfy a strong urge I had to join my musician friends on stage. Until recently I was forcing my inner musician to be satisfied with just doing event photography, knowing I could never physically pick up an instrument and start playing.

A few years ago, I was lucky to cross paths with Dr Jordan Nguyen at a conference. He was delivering a presentation about his mind-controlled wheelchair. At the time, Jordan was predominantly working with quadriplegics. I wasn't aware that he had never conversed with a non-verbal person with Cerebral Palsy before. He later confessed after our meeting his perspective drastically changed when he realised there was a whole different level of people with disabilities he could work towards assisting.
As time went on Jordan and I became really good friends. Just like me, Jordan has what it takes to break new ground and create new technology that allows others to push forward in their lives. Having very similar goals to Jordan I started becoming involved in his social business Psykinetic where I have a strong sense of belonging. The team at Psykinetic have a common sense of purpose and want to create new technologies purely to empower people in the same physical situation as me.
We went to a few music gigs where Jordan witnessed firsthand how much I wanted to perform like my rock and roll musician friend, Steve Balbi, from Mi-Sex. Every time I watched Steve on the stage, I could hear my inner musician screaming "You need to play music and perform!" However, my logical side just dismissed that idea as a result of my physical limitations. Well... that was until Jordan decided to find some way to get me on stage and enable me to physically play an instrument.
One morning I received a phone call from Jordan asking for my participation on a new project... without hesitation I said "yes". The idea was for me to play classical music at the Sydney Opera House alongside the Australia Piano Quartet, with the first ever eye-controlled instrument from Psykinetic. I hadn't used eye control technology before and knew absolutely nothing about classical music, and I was sitting there thinking how are we going to make this happen, it is classical music at the Opera House?! That is a massive thing to accomplish, however I knew with Jordan's and my determination we just might make this dream happen.
I picked up the software really quickly. However, I still needed to be taught how to play classical music on the instrument, with a seemingly impossible deadline of four weeks.
Time to bring in the Australia Piano Quartet. James Wannan took the lead as my music teacher. Playing at the Opera House with the piano quartet was a very surreal feeling. It was very strange because this iconic building was where I had watched some of my favourite musicians play in the past. Then, suddenly, I was on the same stage. If a standing ovation is anything to go by, I think we pulled it off.
So what is next for me? Do we have time for me to list everything? I would love to continue my work with the Australia Piano Quartet and grow my music skills, and spread awareness of what is possible with this type of technology from Psykinetic. My desire is to assist in making their technology more available so more people like myself have the opportunity to explore what is actually possible.

Unlocking the potential of eye tracking technology

Via Tech Crunch https://techcrunch.com/2017/02/19/unlocking-the-potential-of-eye-tracking

The concept of measuring and responding to human eye motion, or eye tracking, isn’t new, but the past year saw a rising interest in the technology. There have been a slew of acquisitions of eye tracking startups by large firms and the rollout of several devices and software that support eye tracking.
“Eye tracking sensors provide two main benefits,” says Oscar Werner, vice president of the eye tracking company Tobii Tech. “First, it makes a device aware of what the user is interested in at any given point in time. And second, it provides an additional way to interact with content, without taking anything else away. That means it increases the communication bandwidth between the user and the device.”
There’s a chance that soon eye tracking will be a standard feature of a new generation of smartphones, laptops and desktop monitors, setting the stage for a huge reëvaluation of the way we communicate with devices—or how they communicate with us.
“In the past year eye tracking technology moved from being a promising technology to being adopted in commercial products in a wide array of consumer segments simultaneously,” Werner says.
Dominic Porco, chief executive officer at Impax Media, a digital advertising company, says less expensive and more potent hardware; new open source software platforms; and new easier and faster ways of obtaining data to train algorithm models have driven the progress in eye tracking technology.
“Companies like NVIDIA have launched products with more powerful GPUs at competitive prices, accelerating the image recognition speeds,” Porco says.
Porco adds that popular crowd-sourcing marketplaces such as Amazon Mechanical Turk have enabled the collection of larger and broader datasets to train recognition algorithms. “These developments have accelerated progress in eye tracking technology significantly, allowing researchers and developers to go faster through their cycles of experimentation and implementation.”
But any technology won’t grow unless it can fulfill specific demands and use cases. And in the case of eye tracking, there seems to be no shortage.

Virtual Reality

In a push to create a more immersive experience, VR headset companies are making large investments in eye tracking technology. In fact, in many ways, eye tracking is seen as the technology complementing VR.
“VR is about immersion,” Tobii’s Werner says. “But a VR headset without eye tracking will assume that I am speaking to the person in front of my forehead. It is approximating my area of interest to the direction of my forehead. We all know this is not true. Our real interest is where I am looking, and there is often a difference between where I look and the direction of my head. VR headsets need to take your gaze into account to become truly immersive.”
Eye tracking technology is key to foveated rendering, a technique where only the portion of the image that lands on the fovea (the part of the retina that can see significant detail) is rendered in full quality.
With foveated rendering, Werner says, there will be a 30 to 70 percent decrease in the number of pixels drawn, a processing-power saving that can translate to higher frame rates and the ability to achieve high-quality output with 4K headsets, as opposed to the 24K level needed to meet natural human vision.
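As a rough, hedged illustration of where those savings come from, the sketch below shades pixels at full quality only near the gaze point and at progressively lower quality further out; the radii and per-region shading rates are invented for illustration, not Tobii’s figures.

import math

def shading_rate(px, py, gaze_x, gaze_y, fovea_radius=300, mid_radius=800):
    """Fraction of full shading work spent on pixel (px, py), given the gaze point."""
    dist = math.hypot(px - gaze_x, py - gaze_y)
    if dist <= fovea_radius:
        return 1.0        # foveal region: full quality
    if dist <= mid_radius:
        return 0.5        # near periphery: half resolution
    return 0.25           # far periphery: quarter resolution

# Rough saving for a 2160 x 2160 per-eye panel with the gaze at the centre
# (sampled on an 8-pixel grid to keep the loop quick):
w = h = 2160
work = sum(shading_rate(x, y, w / 2, h / 2) for x in range(0, w, 8) for y in range(0, h, 8))
full = (w // 8) * (h // 8)
print(f"shading work vs. full render: {work / full:.0%}")   # roughly 39% with these made-up settings

With these invented settings roughly 60 percent of the shading work disappears, which falls inside the 30 to 70 percent range Werner quotes.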
Also, eye tracking technology will make it possible to reduce graphics distortion caused by not taking eye position into account when rendering VR graphics, Werner adds.
Fove, a Kickstarter-funded project, is the first VR headset to have embedded eye tracking. Others are not far behind. In the past months, Google and Facebook acquired eye tracking startups Eyefluence and Eye Tribe respectively, and are expected to embed the technology in their future products.
And SMI, a leader in eye tracking technology, has initiated several partnerships and projects to bring eye tracking to both standalone VR head-mounted displays and smartphone slot-ins.
Eye tracking will also be part of the upcoming Khronos VR API, an open standard under development which has garnered the support of Oculus, Google, NVIDIA and others.
“The view that eye tracking will be a key part of second generation headsets is shared by a large number of VR HMD vendors,” Werner says. “This drives technology development and innovation.”

PC Gaming

For decades, we’ve used gamepads, joysticks, keyboards, mice and other peripherals to make PCs and video game consoles understand where we’re looking. With eye tracking, your computer already knows what you’re looking at and can react accordingly.
“When you want to interact with an object you just look at it and press a button,” says Werner. “The computer understands which object you want to interact with. You don’t need to drag the mouse or controller to the place you are already looking.”
Whether it’s about hacking at an object, aiming at a target, designating a location for the game character to run to, or simply changing the direction of the point-of-view camera, eye tracking might make it a whole lot easier for gamers to interact with the gaming environment. This can be a big deal for games that require a high level of mouse and controller handling.
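A minimal sketch of that “look at it and press a button” interaction, with invented object names and coordinates standing in for a real game engine and gaze API:

import math
from dataclasses import dataclass

@dataclass
class GameObject:
    name: str
    x: float
    y: float

objects = [GameObject("door", 300, 220), GameObject("chest", 900, 610), GameObject("torch", 1500, 300)]

def object_under_gaze(gaze_x, gaze_y, max_dist=120):
    """Return the object nearest the gaze point, or None if nothing is close enough."""
    best = min(objects, key=lambda o: math.hypot(o.x - gaze_x, o.y - gaze_y))
    return best if math.hypot(best.x - gaze_x, best.y - gaze_y) <= max_dist else None

def on_action_key(gaze_x, gaze_y):
    """Called when the player presses the interact button."""
    target = object_under_gaze(gaze_x, gaze_y)
    if target:
        print(f"interacting with {target.name}")   # no mouse drag to the target needed

on_action_key(880, 650)   # gaze near the chest -> "interacting with chest"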
While it might render previously challenging games too easy, it can also pave the way for games that are much more fast paced.
Eye tracking can also create a cleaner and less intrusive user interface.
“In games, graphical artists spend a lot of time creating beautiful environments,” Werner says, “and are in a constant battle with UI interaction designers that need to place UI elements on top, since it clutters the immersive feeling.”
With eye tracking, Werner explains, you can hide the UI or make it transparent and only make it visible when the gamer’s gaze is directed toward it. “This creates a more immersive feeling and solves a constant battle between graphical artists and UI designers,” he says.
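A sketch of that gaze-aware UI behaviour, assuming a single HUD element and made-up pixel thresholds: the element stays faint until the player’s gaze moves toward it.

import math

def hud_opacity(gaze_x, gaze_y, hud_x, hud_y, near=200, far=600):
    """Fade a HUD element in as the gaze approaches it (0.1 = faint, 1.0 = solid)."""
    dist = math.hypot(gaze_x - hud_x, gaze_y - hud_y)
    if dist <= near:
        return 1.0
    if dist >= far:
        return 0.1
    return 1.0 - 0.9 * (dist - near) / (far - near)   # linear fade between the two radii

print(hud_opacity(1600, 900, 1700, 950))   # gaze near the element -> 1.0 (fully visible)
print(hud_opacity(200, 200, 1700, 950))    # gaze far away -> 0.1 (almost hidden)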
For simulations and virtual worlds, eye tracking can enable gaze aware objects, where game objects or characters will react to the gaze of the player to make the simulation more realistic, Werner says. This means that you’ll have to be careful not to stare too long at a mercenary’s purse when entering a tavern in your favorite RPG.
Tobii itself offers a line of tack-on eye-tracking devices and eye-tracking-embedded laptops, and has worked with gaming companies on eye-tracking-enhanced versions of popular games such as Rise of the Tomb Raider, Deus Ex and Watch Dogs 2.
It is unlikely that eye tracking will replace controllers any time soon, but according to Werner, thanks to the technology, “PC games will take into account one of the most powerful interaction methods we as humans have, our eyes. PC gamers will be able to utilize an additional control mechanism complementing the mouse and the controller. This will drive more natural interaction without taking anything away.”

Medicine and accessibility

Beyond the consumer level, the benefits of eye tracking expand to other realms where measurements of human gaze are key to obtaining results and insights.
“There is an increasing interest in using eye tracking to help diagnose, and potentially treat, neurological disorders,” says Bryn Farnsworth, science editor at biometric research company iMotions. “For example, infants usually like to look at images with people’s faces—scenes that have a social element.”
Farnsworth explains that infants who go on to develop autism are much more likely to prefer images that feature geometric shapes, while for children with Williams syndrome the situation is reversed: they show a marked preference for social scenes in comparison to neurotypical children.
This all suggests, Farnsworth says, “that the analysis of eye movements may help guide early diagnosis.”
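A toy sketch of the kind of gaze analysis that underpins this: given gaze samples labelled by which half of a paired stimulus they landed on, compute the share of looking time spent on the geometric image. The sample data are invented.

def geometric_preference(samples):
    """samples: one label per gaze sample, 'social', 'geometric' or 'off'.
    Returns the share of on-stimulus time spent on the geometric image."""
    on_stimulus = [s for s in samples if s in ("social", "geometric")]
    if not on_stimulus:
        return None
    return on_stimulus.count("geometric") / len(on_stimulus)

trial = ["geometric"] * 70 + ["social"] * 20 + ["off"] * 10   # invented 100-sample trial
print(f"geometric preference: {geometric_preference(trial):.0%}")   # 78%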
A research paper by students at UCSD states that eye tracking technology holds promise as an objective methodology for characterizing the early features of autism, because it can be implemented with virtually any age or functioning level.
Labs such as iMotions are helping researchers obtain those metrics through eye tracking devices in order to better understand and assess the conditions of patients.
Eye tracking company RightEye uses the technology to help physicians in administering tests and finding symptoms for illnesses ranging from simple concussions to Alzheimer’s disease and dyslexia, and to help treat children with autism.
Eye tracking can also be a breakthrough for patients with physical disabilities, especially as affordable, consumer-level devices become available. “This is a large area for eye tracking and it is evolving,” says Tobii’s Werner. Gaze keyboards and control panels powered by eye tracking, Werner explains, give people with diseases such as cerebral palsy and spinal cord injuries a means to communicate, control their environment and develop skills through therapy.

Advertising

In the current state, the best metrics advertisers get from ads are impressions and click numbers. But those numbers do not precisely reflect the effectiveness of ad campaigns, because a lot of what gets counted as impressions goes to waste on non-human sources. That is something that changes with eye tracking technology.
“The advertising industry is currently in the midst of some major upheaval when it comes to universal standards for measuring ad impact,” says Porco, the Impax chief executive. “The whole concept of ‘viewability’ is now being redefined to make more sense in the age of ad blockers and bot traffic.”
With eye tracking technology, online advertisers will be able to measure exactly how many human eyes actually view their ads when they appear on the page. While gaining precise metrics will be nearly impossible until every computer and mobile device is embedded with eye tracking technology, using eye tracking does give insights into how users interact with ads.
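A hedged sketch of what gaze-based viewability could look like in practice: an impression only counts as seen if the gaze dwells inside the ad’s bounding box long enough. The 500 ms threshold and the data format are illustrative assumptions, not an industry standard.

def ad_was_viewed(gaze_samples, ad_box, min_dwell_ms=500):
    """gaze_samples: list of (timestamp_ms, x, y); ad_box: (left, top, right, bottom)."""
    left, top, right, bottom = ad_box
    dwell = 0
    prev_t = None
    for t, x, y in gaze_samples:
        if prev_t is not None and left <= x <= right and top <= y <= bottom:
            dwell += t - prev_t          # credit the time since the previous sample
        prev_t = t
    return dwell >= min_dwell_ms

samples = [(i * 50, 400 + i, 300) for i in range(20)]        # gaze drifting across the page
print(ad_was_viewed(samples, ad_box=(380, 250, 700, 400)))   # True: about 950 ms inside the box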
In the physical world, however, eye tracking is already showing promise.
“Market research firms are experimenting with directly measured biometric data to precisely determine the composition of people in out-of-home media environments such as retail stores, for audience measurement purposes,” Porco says.
Porco’s company, Impax Media, is investing heavily in eye tracking technology along with other computer vision techniques to collect attention metrics from its proprietary in-store advertising screens. “We’re big believers that the future of the ad industry is going to be grounded in attention metrics, as opposed to impressions, and eye tracking is, hands down, the best way to track attention,” Porco says.
The data, Porco says, helps advertisers and location partners to assess audience interest in various messaging angles, and to correlate this information with parameters like location, timing and demographics. “It’s great for media buyers seeking to get the most for their budgets, and for store managers dealing with questions about everything from inventory to staff shift schedules.”
While retailers always benefit from collecting information about customers, the area is something of a gray zone, often subject to controversy and to privacy regulations. However, Porco underlines that there is no need to collect identity parameters in order to glean useful insights; anonymized data about gaze point, age and gender, along with duration of view, suffice.

Market research

It’s important for market researchers to “evaluate people’s interactions and expectations across the whole omnichannel customer journey and its key touchpoints,” says Simone Benedetto, UX researcher at TSW, an Italy-based market research lab.
Recent advances in eye tracking, Benedetto explains, open up new possibilities in both lab and real world neuromarketing tests.
“It is crucial for us to involve users in product and service design and evaluation,” Benedetto says. “This doesn’t mean just to ask them their opinion, but to collect objective data coming from their eyes and brain while interacting with the product or service.”
TSW uses mobile eye tracking units along with other wearables in order to get precise user and customer metrics on a wide variety of products and services, both digital (such as online ads, mobile apps, websites, software and device control panels) and physical (such as print material, product packages, cars, home furniture and retail stores).
Being able to gauge the user’s natural interaction with products and services enables researchers to identify real usability problems and frustration points, collect actionable information that gives insight into customer satisfaction and engagement, and drive design decisions.
“One of the great advances last year has been the introduction of object tracking in relation to the analysis of data from mobile eye trackers,” says iMotions’ Farnsworth, referring to the process by which specific visual features can be delineated from a scene, and information about how that particular feature is attended to can then be recorded.
“This means that an individual can, for example, wear portable eye tracking glasses, interact with their environment normally, and how they attend to certain features can be automatically analyzed: how long they look at a street map when they’re out walking, or whether they notice an advert that they passed,” Farnsworth says. “Being able to automatically understand how specific features are attended to clearly has great ramifications for understanding humans, and using that knowledge further.”
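A small sketch of that pipeline’s final step, with invented frame data: once the glasses’ scene video has been run through object detection, per-frame gaze points are totalled against each detected object’s bounding box.

from collections import defaultdict

FRAME_MS = 33   # ~30 fps scene camera, assumed

def dwell_per_object(frames):
    """frames: list of (gaze_xy, {label: (left, top, right, bottom)}) per video frame."""
    totals = defaultdict(int)
    for (gx, gy), detections in frames:
        for label, (l, t, r, b) in detections.items():
            if l <= gx <= r and t <= gy <= b:
                totals[label] += FRAME_MS
    return dict(totals)

frames = [((410, 260), {"street map": (350, 200, 600, 450), "advert": (700, 100, 900, 300)})] * 45
print(dwell_per_object(frames))   # {'street map': 1485} -> about 1.5 s on the map; the advert was never fixated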
“From my perspective there’s a huge market behind the exploitation of eye tracking into UX-neuromarketing investigations,” Benedetto says. “Eye tracking allows the implicit measurement of user behavior, and turns that measurement into quantitative objective data. We have only relied on subjective data for years, and it’s definitely time for a change.”

The future of eye tracking

Tobii’s Werner told me he believes a new paradigm of PC usage will emerge, where eye tracking is a fifth modality that, in combination with touch screens, mouse/touchpad, voice and keyboard, will make computers much more productive and intuitive. “Gaze always precedes any kind of action that you do with mouse, keyboard and voice, so much smarter user interactions will be designed using these technologies,” he says.
As vision is the most used sense among human beings, being able to track and measure it digitally will have a great impact on how we make our intentions known to computers, wittingly or unwittingly.