My friend Pablo had a request to learn how I type things with my eyes, so I thought I’d share a bit, and also share the rabbit hole I’ve gone down!
First of all, an eye-gaze device is one of the most helpful things ever invented for a person with ALS, or pALS. The ability to communicate ideas and concepts, help a seventh grader work on his science fair paper, ask for a beer, or even talk to people over the phone if needed: this is a great equalizer for me, since a hallmark of ALS is eventually getting locked in. Since eyes don't stop moving, people don't have to stop communicating.
How it works is fairly cool and almost simple. In the easiest form, a camera monitors what your eyes do: moving left, right, up, down, and so on. The software then determines how long your eyes stay still while looking at the screen, and makes a selection based on that. Simple! Your eyes are the mouse cursor!
I kid, it is incredibly difficult, sort of. I’ll break it down though.
First, regular cameras don't have a high enough framerate to track rapid eye movement, so it takes a high-framerate camera focused solely on one eye to do it. That's why some devices have giant microscope-looking things attached to glasses that look ridiculous and aren't practical outside of a lab. Another method is putting trackers or something on or around the eyes. They also talk about dark eye and light eye, and I'm not sure what that means!
To get around needing high-framerate optical cameras, they came up with a better solution: a near infrared, or NIR, tracking system. It shoots out an NIR beam and reads the reflection off the eye, then translates the position of the reflection into cursor movement. Apparently, I think, it's easier to build NIR cameras because they don't require optical lenses, so they take up less space. The device can be a lot smaller, which means you can attach it to many different machines. Even glasses!
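As I understand it (and this is my own sketch, not Tobii's actual algorithm), the tracker figures out where you're looking from the offset between your pupil and that NIR reflection, mapped to screen coordinates by a calibration step. Here's a toy version in Python; all the calibration numbers are made up for illustration:

```python
import numpy as np

# Hypothetical calibration data: during calibration you stare at known
# screen targets while the tracker records the pupil-to-reflection offset.
gaze_vectors = np.array([   # (dx, dy) between pupil center and NIR glint
    [0.10, 0.05],
    [0.90, 0.05],
    [0.10, 0.95],
    [0.90, 0.95],
    [0.50, 0.50],
])
screen_points = np.array([  # where the user was actually looking, in pixels
    [100,   50],
    [1820,  50],
    [100, 1030],
    [1820, 1030],
    [960,  540],
])

# Fit an affine map  screen = [dx, dy, 1] @ coeffs  with least squares.
design = np.hstack([gaze_vectors, np.ones((len(gaze_vectors), 1))])
coeffs, *_ = np.linalg.lstsq(design, screen_points, rcond=None)

def gaze_to_screen(dx, dy):
    """Translate one pupil-to-reflection offset into a cursor position."""
    x, y = np.array([dx, dy, 1.0]) @ coeffs
    return x, y
```

So looking dead center (offset 0.5, 0.5 in this made-up data) lands the cursor in the middle of a 1920x1080 screen. The real device is doing something far fancier, but the calibrate-then-map idea is the same, which is why it needs recalibrating now and then.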
The device I have is called a Tobii Dynavox, and it shoots out thousands of NIR beams every minute, so it is highly accurate.
There is a small lag, and you have to recalibrate it about once a week, but it’s fairly epic. Tobii has integrated a Windows tablet with the IR tracker and their software to allow me to type.
The way I have it set up is an on-screen QWERTY keyboard, timed so that if I hold my eyes over a letter or command for 600 milliseconds, it types it. It took a lot of practice and I still make a lot of errors, but that is what backspace is for, and I have a special button that deletes the whole word, so it is faster!
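The dwell logic itself is simple enough to sketch. This is my rough idea of it in Python, not Tobii's code; the key names and the whole-word-delete behavior are my assumptions:

```python
DWELL_MS = 600  # my dwell threshold: rest on a key for 600 ms and it types

class DwellTyper:
    """Toy dwell-selection loop: feed it (timestamp_ms, key_under_gaze)
    samples and it fires a keypress once the gaze has rested on the
    same key for DWELL_MS."""

    def __init__(self, dwell_ms=DWELL_MS):
        self.dwell_ms = dwell_ms
        self.current_key = None
        self.dwell_start = None
        self.typed = []

    def sample(self, t_ms, key):
        if key != self.current_key:
            # Gaze moved to a different key: restart the dwell timer.
            self.current_key = key
            self.dwell_start = t_ms
        elif key is not None and t_ms - self.dwell_start >= self.dwell_ms:
            if key == "DEL_WORD":
                # My shortcut button: wipe the whole last word
                # instead of backspacing letter by letter.
                while self.typed and self.typed[-1] != " ":
                    self.typed.pop()
            else:
                self.typed.append(key)
            # Reset the timer, so holding the same key repeats it
            # only after another full dwell.
            self.dwell_start = t_ms
```

Feed it gaze samples on "h" for 650 ms and it types "h"; glance away too soon and nothing fires, which is exactly why the threshold takes practice to get used to.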
I also use the same technology to verbally talk, since I can't really talk anymore. This is a new frustration, because it takes me longer to type and then hit speak, and everyone has usually moved on by then. But, I can communicate, albeit slowly, so please be patient with me as I type. The best part is beer doesn't affect the typing!
But dangit, Pablo, you suggested a heads-up display, or HUD, and I have been searching and scouring for it. I am yet again disappointed in modern science and technology: a capability that has been used in airplanes and attack helicopters for decades, and no, hell no, no way could we make it available for civilians and the disabled! To top it off, the big companies have bought up the small companies and killed off the tech! I kid you not, I got madder the more I read about it. I want to have an eye-gaze tracker in my glasses or sunglasses, and be able to link it to my phone! Dangit! Oculus, Intel, and Apple all bought out companies, and now that tech is gone. Or very, very expensive. I keep looking for glasses that would let me drive my wheelchair with my eyes, but again, that technology apparently isn't profitable, because it is years away. The Toyota Mobility Foundation is considering it for their 2020 award. Bah.
Anyway, I hope this was an interesting take on how I do things, and how I think!