Assistive Technology Diary: How a College Student Uses AT on a Typical Day

November is Assistive Technology Awareness Month, and to remind everyone of the important role assistive technology plays in the lives of people with disabilities, Northern Illinois University student Alicia Krage is sharing an assistive technology journal she kept on a recent Tuesday at school. Interested in sharing your own assistive technology journal? Let us know in the comments!

  • 8:00am: I ask Siri to open Spotify as I get ready for my morning shower. I then swipe on the screen until VoiceOver (the screen reader that comes with iPhones at no extra charge) announces “favorites,” the name of my playlist, and I double tap it. I swipe again until it says “shuffle,” and I turn on the music.
  • 8:25am: I’m back in my dorm and I say, “Alexa, good morning.” Alexa announces upcoming events on my calendar, the weather, and then plays music.
  • 8:35am: I use VoiceOver to read any unread text messages I might have. I then swipe through my messages until I come across Joe’s name, then type out a brief text message. I use “direct touch typing,” which essentially means all I have to do is tap on the screen where a letter is, just like a sighted person would text; VoiceOver calls out each letter as my finger touches it, so I know when I’ve found the one I’m looking for. In this typing mode, my phone speaks the whole word after I hit “space,” so I can hear errors and fix them before sending. My morning note is usually short and simple: “Good morning, how’d you sleep?” or something of that nature.
  • 8:45am: I swipe through my apps until VoiceOver says “Uber.” The app automatically knows my pickup location as “school” (I programmed that in) and keeps a list of frequently visited places, so I swipe through until I hear “Dunkin Donuts.” I hit “request Uber X.”
  • 8:47am: VoiceOver reads the text I have dictated to the driver before I send it: “Just so you know, I’m blind so I won’t see your vehicle pull up. Please come get me when you arrive; I’ll be waiting outside.”
  • 8:55am: VoiceOver announces messages from the app. It tells me to “meet driver,” so I go outside.
  • 9am: As I’m ordering my coffee, I swipe through my apps until I hear “Dunkin.” I open the app and hit “pay” so I can pay with the card saved on my phone.
  • 9:05am: I’m seated by the door and go back to the Uber app. It says, “How was your trip?” I swipe through and give the trip the appropriate number of stars.
  • 9:10am: My coffee is in front of me (the employees bring it to me). I plug a set of earbuds into my laptop and use JAWS (the screen reader I use with my PC) to connect to “Dunkin Donuts Guest” wifi.
  • 9:12am: I use the arrows to navigate through my documents until JAWS reads the correct title. It’s usually a document containing parts of a paper I need to finish.
  • 10am: My paper (or other assignment) is usually done by now, so I continue to use my screen reader to navigate to Twitter and Facebook and catch up on social media.
  • 11am: I use VoiceOver to navigate through the apps on my phone until I find Uber. It knows my current location, so I navigate through my “saved places” until I hear “school.” I double tap and then hit “request Uber X.”
  • 11:02am: I paste the message into the text field for my Uber driver. It’s the same one I used earlier.
  • 11:05am: VoiceOver announces “meet driver,” so I go outside, and the driver calls out to let me know where he is parked.
  • 11:15am: Once back in the building, I use VoiceOver to give the trip the appropriate number of stars.
  • 11:17am: I use VoiceOver to navigate through my messages until I hear my boyfriend’s name. I text Joe a brief message catching him up on my day and ask how his day is going so far. We text back and forth while I head down to the dorm cafeteria to eat.
  • 12:30pm: I’m done eating lunch and this is my time to decompress. This usually involves watching reruns of some of my TV shows. Sometimes I use JAWS to navigate websites like nbc.com to watch “This Is Us” reruns, or Netflix to find something to watch on my phone.
  • 2pm: It’s back to work for a little bit. Most of the time it’s studying, so I use my BrailleNote Apex to open a document containing my class notes. Sighted people need a screen to see what they’re typing and to use the internet, but I don’t. My braille notetaker is essentially a screenless laptop: as I type into my BrailleNote, the words appear on a refreshable braille display (a rectangular device with a row of cells whose pins are raised and lowered to spell out letters in the braille alphabet), and I can just trace my finger over the dots to read my notes.
  • 3pm: I use VoiceOver to swipe through my messages until I locate one of my friends that I feel like calling. This is the easiest way to find them, rather than scrolling through my contacts.
  • 4pm: Before I head down to the dining hall, I do one more check of my email. I have the Outlook app on my phone, so I swipe through until I get to it. If it doesn’t say I have “new items,” it means I don’t have any new emails; the same goes for the default Mail app.
  • 5:10pm: It’s time to leave for my night class. I don’t use technology during dinner. As I am waiting for the elevator, I swipe through messages again to find Joe’s name, then press the “call” button. We chat briefly before I leave for class – a quick catch-up and ending with “Good luck in class” on his end. We make plans to talk later that night.
  • 6:00pm: Class has begun, and I will spend the next 2 hours and 40 minutes taking notes on my BrailleNote Apex (I’d have to put on headphones to take notes using JAWS or VoiceOver, and that would make it hard to listen to the lecture!). Class is usually done at 8:40pm and I get picked up at 9pm. If we get out early, I use Siri to text the PACE bus driver on their mobile business phone to cancel my 9pm ride, and a classmate drives me back.
  • 8:40pm: Class is done and I spend 20 minutes catching up on texts, checking Twitter and Facebook notifications, and responding to any emails I got in the last few hours.
  • 9:10pm: I’m back at the dorms and I navigate to my Spotify playlist again. I listen to music while I get ready for bed.
  • 10pm: I either use Siri to “call my boyfriend,” or he calls me and VoiceOver announces his name while his ringtone plays. We talk for an hour, or at least try to keep it to an hour. I need sleep and have a somewhat early day tomorrow.
  • 11pm: I tell Alexa to set my alarm for 7:30am. She confirms with, “Your alarm is set for 7:30am tomorrow.” I make sure my phone is set to “do not disturb” and that it is charging. And then I call it a night.
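The braille display described in the 2pm entry follows the standard six-dot braille code: each cell has dots numbered 1–3 down the left column and 4–6 down the right, and a character is formed by raising some combination of those pins. As a rough illustration of that letter-to-cell mapping (not anything the BrailleNote actually runs), here is a minimal Python sketch using the Unicode braille patterns block, where dot n corresponds to bit n−1 of the offset from U+2800:

```python
# Six-dot braille cells: dots 1-3 run down the left column, 4-6 down
# the right. Only a handful of letters are mapped here for illustration.
DOTS = {
    "a": {1},
    "b": {1, 2},
    "c": {1, 4},
    "d": {1, 4, 5},
    "e": {1, 5},
    "l": {1, 2, 3},
}

def to_braille(word):
    """Render a word as Unicode braille characters (U+2800 block).

    In that block, dot n maps to bit (n - 1) of the offset from
    U+2800, so a cell is built by OR-ing its dots together.
    """
    cells = []
    for letter in word:
        offset = sum(1 << (dot - 1) for dot in DOTS[letter])
        cells.append(chr(0x2800 + offset))
    return "".join(cells)

print(to_braille("bead"))  # ⠃⠑⠁⠙
```

A real display drives hardware pins rather than printing characters, and contracted (Grade 2) braille adds many abbreviations on top of this; the sketch only covers the uncontracted letter-to-cell mapping.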

Comments may not reflect Easterseals' policies or positions.




  1. Sydney Palese Says:

    Hi Mario,

    Thanks! And very interesting note.

    Sincerely,
    Sydney


  2. Sydney Palese Says:

    Hi Joan,

    Thanks for the feedback!

    Sincerely,
    Sydney


  3. Joan Clarke Says:

    Amazing and impressive!


  4. MARIO CORTESI Says:

    This was great! Ali is a former student of mine, and I tormented her with math classes and chemistry, all of which she has forgotten, thank heavens.

    Ali should have mentioned that her braille note-taking device basically has only seven keys for typing: three dot keys on each side of the spacebar, plus the spacebar itself. As I recall, there is also an Enter key. I don’t know what model she has, but she can also backspace to delete.

    This is a far cry from the braille writing devices my students used a few years ago. The Perkins Braillewriter was a heavy, metal machine into which thin cardboard paper was inserted. If you made any mistakes, too bad; you had to rub the dots down. It only produced braille, whereas newer devices like Ali’s can store documents and produce them either in print with a standard printer or in braille with a special embosser. Ali doesn’t need the embosser, since she can look at any document the same way those of us with vision do, reading each line as it is displayed.

