Throughout the 10 months that I’ve been writing Blind Confidential, I have made frequent reference to my repetitive motion injuries. When I do too much typing, my hands, wrists, forearms and shoulders start to feel a lot of pain. Recently, I’ve done a lot of programming and have written a substantial amount of text for a variety of different projects.
In the past couple of weeks, the pain has gotten bad enough to cause me to return to my physical therapist and, yesterday, to purchase, install, train and start using Dragon dictation software. I bought a copy of Naturally Speaking 9.0 after reading on the Nuance website that it requires little or no training. In reality, 9.0 requires the exact same training as previous versions, so I wonder if the marketing people actually ran the program before writing the literature. Compared to the pain in my hands, though, this is a minor annoyance; the training didn’t take much time, and the accuracy demonstrated in this latest version of Dragon NaturallySpeaking is really terrific.
I feel, however, that there exists some kind of connection between the writing center in my brain and whatever part of my mind is responsible for the motor functions involved in typing. I find it is much more difficult to think in clear sentences while speaking than while typing. Maybe my friend Will can provide some sort of cognitive explanation for this.
So, on this Saturday morning, I sit with headset on, talking to my laptop. I’ve used dictation software before, especially when the hands and shoulders chose to punish me for typing too much. In the past I have used J-Say and, before that, JAWBone as a bridge between Dragon and JAWS. I think I’ll need to send my buddy Brian a note asking about his most recent software and to see if it works with Dragon NaturallySpeaking 9.0 Standard.
I hope to be doing a lot with dictation in the coming months. I’m not getting any younger and my repetitive motion injuries will not heal themselves. Thus, good old Blind Christian needs to start taking care of himself.
As this is the longest piece that I’ve written since installing Dragon NaturallySpeaking 9.0, I must say that I’m tremendously impressed with the accuracy in this new version of the product. Without J-Say installed, getting proper feedback from JAWS 7.1 is a little tricky, so my progress throughout this document has been a bit slow. I find, though, that with echo set to all, JAWS talks too much but does provide a reasonably good level of usability. I still recommend that anyone who wants to use dictation software with JAWS contact Brian and get his excellent bridge utility, which combines JAWS and Dragon in a more seamless manner.
A few years ago, while visiting Georgia Tech, I had an interesting conversation with a person from the RERC on workplace accommodations about a problem that I call “Screen Reader Syndrome.” I did an informal study, one with no statistical significance, of approximately 40 software engineers working for the same company. The age distribution between the 20 or so blind programmers and the roughly 20 sighted ones was nearly identical, as was their years of experience in software engineering. Computer programmers suffer from repetitive stress injuries more frequently than the population at large. My little study of individuals working in a very similar environment showed that approximately 17 of the 20 blind programmers had some level of repetitive motion injury, while only three or four of the 20 sighted programmers had any such injury.
While my sample was statistically insignificant and my study not scientific, I believe it demonstrates a reason to do further research into the different levels of injury experienced by computer users who access their machine using a screen reader versus those who use vision, a mouse and more mainstream techniques of accessing information. I’ve also done a handful of other highly nonscientific tests where I compared the number of keystrokes it took me to find a piece of information in an unfamiliar Excel spreadsheet versus the number of keystrokes and mouse clicks required of a sighted user. Depending upon the specific spreadsheet, it would sometimes require me to make more than 100 times the number of actions using my hands as that of a sighted user. Thus, I coined the phrase “Screen Reader Syndrome” to describe injuries resulting from the higher level of stress placed upon a blind person’s hands than those of our sighted counterparts.
Some other screen reader users have described a variety of different techniques they use to decrease the amount of time and the number of keystrokes required to find a piece of information. These techniques typically involve copying information out of something like a spreadsheet, where context is delivered through positional information, and pasting it into a text editor, where blank cells used for formatting purposes disappear and searching becomes simpler. While these nonstandard techniques may work well for some, they are never included in screen reader training and, as they require leaving the main application, they further separate a blind computer user from his sighted colleagues. If the intent of a screen reader is to provide access to mainstream applications, these workaround techniques might provide value for some people in the short term but do nothing to improve the long-term goal of providing reasonably equal access to the applications used in a workplace.
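To make the workaround concrete, here is a minimal sketch of the idea in Python. It assumes the spreadsheet has been exported to CSV (the function name and row/column tagging scheme are my own illustration, not any screen reader’s feature): non-empty cells are flattened into one searchable line each, so the blank cells used for visual formatting simply disappear.

```python
import csv
import io

def flatten_sheet(csv_text):
    """Flatten a CSV export of a spreadsheet into one line per
    non-empty cell, tagged with its row/column position so the
    positional context survives the move into a plain text editor."""
    lines = []
    for r, row in enumerate(csv.reader(io.StringIO(csv_text)), start=1):
        for c, cell in enumerate(row, start=1):
            if cell.strip():  # blank formatting cells are dropped
                lines.append(f"R{r}C{c}: {cell.strip()}")
    return "\n".join(lines)

# A sheet with an empty spacer column and an empty spacer row:
sheet = "Name,,Total\n,,\nWidgets,,42\n"
print(flatten_sheet(sheet))
```

Searching the flattened text for “42” then takes one find command instead of dozens of cell-by-cell navigation keystrokes, which is exactly the saving the technique is after.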
Of course, in the cubicle farms of corporate America, dictation software might not be the answer to reducing the physical stress caused by using keyboard-intensive access technologies. On this matter, though, I have no experience and, therefore, no idea as to whether two blind people could share a cubicle and both dictate without interfering with each other’s work.
The answer to this problem will come from advances in user interface design for nonvisual computing. Much more research needs to be done in this area and much more investment needs to go into trying out new concepts in the screen readers deployed to consumers.
Afterword
The other day Mike Calvo sent a very long comment about my post on Web accessibility and my general ennui regarding the progress of technology transfer from the mainstream to systems that we blinks can use. Mike is entirely correct in his assertion that I see the glass “as half empty.” He neglects to notice, though, that I also believe my drink was poured into a dribble glass. Thus, I not only take the pessimistic approach but also have the paranoid feeling that the general ignorance of universal design demonstrated by mainstream technology companies is part of some kind of intergalactic joke, as, in an enormous number of these cases, adding accessibility would cost pennies and provide solutions for people like us as well as those who do not self-identify as having a disability.
I agree completely that technology for people with vision impairment has improved tremendously in the decade or so since I started paying attention. My frustration and sadness come from the knowledge of how mainstream technologies work and how inexpensive and simple it would be to apply the principles of universal design to these products.
As the chairman pointed out the other day, a large number of Americans are reaching an age where, if not a total disability, physical and health-related issues will start causing minor vision impairment, hearing loss, decreased agility and other problems. Through the appropriate application of universal design principles to everything from home entertainment systems to refrigerators, dishwashers and other appliances, these problems can easily be overcome, and said individuals will be able to carry on happy, healthy and independent lives. Universal design in mainstream products will go a great distance toward creating an accessible world for those of us who use access technology or other suboptimal methods of dealing with various items around our homes.
— End
“I feel, however, that there exists some kind of connection between the writing center in my brain and whatever part of my mind is responsible for the motor functions involved in typing. I find it is much more difficult to think in clear sentences while speaking than while typing. Maybe my friend Will can provide some sort of cognitive explanation for this.”
I would guess that you are trying to think about what to write next at the same time as you are speaking. If so, this would lead to two activities trying to use the language areas of the brain, e.g. Broca’s and Wernicke’s areas, at the same time, and this isn’t possible. If you silently think about something using natural language, a process known as internal speech production, the brain regions involved with semantic association for language, Broca’s and Wernicke’s areas, have been shown to become active.
Howdy comrades!
As soon as I sober up, I’m gonna’ get one of those little old dragons to dictate to myself! Do they require a green card, or can I just slip ‘em a couple of coals under the table, if you get my drift? I met several nice folks here at The Last Resort who also suffer chronic pain, and there‘s staff that are chronic pains in and of themselves. Got to go, BC: I hear Nurse Ratchet lurking outside my suite. I hope you get to feeling better. Regards, Chairman Mal: Power to the Peeps!