What it Was Like to Listen to the 2017 Solar Eclipse
by Beth
WBEZ (Chicago Public Radio) had me come to the studio yesterday morning to talk about using the Eclipse Soundscapes app to experience Monday’s solar eclipse. If you missed it, you can listen to the 12-minute interview online here. I was relieved when host Tony Sarabia opened the interview by saying he’d tried the app himself and found it cumbersome.
It was.
I didn’t want to have to say that, though, and Tony’s opening let me off the hook. “It’s a work in progress,” I said. “And I’m grateful they’re even trying.”
The app was a success in one very important way. It got me outside Monday to be part of the community. There I was, iPhone in hand, alongside strangers in sunglasses and neighbors using DIY pinhole eclipse cameras made from cereal boxes.
As the eclipse neared its 87% peak for Chicago, the delivery guy from the downstairs take-out joint offered to lend my husband Mike his special sunglasses to take a look. Mike could say “Yes!” without feeling obliged to stand by and describe things to me; I was busy swiping my iPhone, or putting it up to my ear to listen. I had to laugh when I told the radio host, “Some people must have thought I was an idiot, listening to my iPhone while they all were looking up to the skies!”
The Harvard solar astrophysicist behind Eclipse Soundscapes is Henry “Trae” Winter, described as a scientist with a penchant for science-engagement projects. He was building a solar wall exhibit for museums when he first noticed that some “accessible” exhibits merely included the item’s name in Braille, while other exhibits, including his own, had no accessibility component at all.
“Winter began to brainstorm an astrophysics project that would use a multisensory approach to engage a larger percentage of the population, including the visually impaired community,” the app says. “The ‘Great American Eclipse’ of August 2017 seemed like the perfect opportunity.”
As the eclipse progressed Monday, I sensed the air getting cooler, and the wind seemed to pick up as well. The app advertised a “rumble map” that was supposed to vibrate and shake to let me feel different features of the eclipse, but I was never able to get that feature working. The sound on the Eclipse Soundscapes app did work, though. Any time I ran my pointer finger over the screen I’d hear a whir that sounded like a low-pitched kitchen blender. When the blender ran faster, the pitch would go up, and that meant the light was really bright there. When my finger slid over the moon, the kitchen blender turned itself off: completely dark.
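For readers who like to tinker: below is a rough sketch, in Python, of the kind of brightness-to-pitch mapping I think I was hearing. Everything in it is made up for illustration (the pitch range, the darkness cutoff, the sample brightness values); none of it comes from the actual Eclipse Soundscapes code.

# A guess at the brightness-to-pitch sonification the app seems to use.
# All numbers here (BASE_HZ, the 0.05 darkness cutoff, etc.) are invented
# for illustration, not values from Eclipse Soundscapes.
import math
import struct
import wave

RATE = 44100          # samples per second
BASE_HZ = 110.0       # pitch for the dimmest audible light (low "blender" hum)
TOP_HZ = 880.0        # pitch for the brightest light
DARK_CUTOFF = 0.05    # below this brightness, stay silent (the moon's disk)

def tone_for_brightness(b, seconds=0.2):
    """Return raw 16-bit samples: brighter -> higher pitch, dark -> silence."""
    if b < DARK_CUTOFF:
        return [0] * int(RATE * seconds)           # finger over the moon: no sound
    hz = BASE_HZ + (TOP_HZ - BASE_HZ) * b          # linear brightness-to-pitch map
    return [int(32767 * 0.5 * math.sin(2 * math.pi * hz * n / RATE))
            for n in range(int(RATE * seconds))]

# Sweep a pretend finger from the bright corona (1.0) across the dark moon (0.0) and back.
samples = []
for b in [1.0, 0.7, 0.4, 0.0, 0.0, 0.4, 0.7, 1.0]:
    samples += tone_for_brightness(b)

with wave.open("eclipse_sweep.wav", "wb") as f:
    f.setnchannels(1)
    f.setsampwidth(2)
    f.setframerate(RATE)
    f.writeframes(struct.pack(f"<{len(samples)}h", *samples))

Run it and the resulting WAV file sweeps from a high whir down into silence and back, which is more or less what a finger crossing the moon’s disk sounded like on my iPhone.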
The app narrated the eclipse’s progression in real time, too, and during my WBEZ interview Tony Sarabia read a snippet of an Eclipse Soundscapes description out loud in his own voice:
Projections of light from the sun’s outer atmosphere called helmet streamers extend in all directions from behind the moon. In contrast to the black, featureless moon, the pale, wispy streamers appear as delicate as lace. The largest streamers have a tapered shape that resembles flower petals.
On air I pointed out that the description he read was “poetic.” Notice how it’s written using things we can touch, like lace and flower petals? The real-time narrations use specialized imagery-description techniques developed by WGBH’s National Center for Accessible Media.
Mike and I were out there Monday for about a half hour. I took iPhone breaks from time to time to eavesdrop on the group next to me discussing where they’d looked for their special sunglasses, how long the line at the Adler Planetarium was that morning, what they’d found on the NASA site, and what they were seeing through the sunglasses they eventually managed to get their hands on.
It was all pretty cool, until a TV news helicopter decided to hover overhead. My little Eclipse Soundscapes app didn’t stand a chance. With all that real-time whirring going on above us, I couldn’t hear a dang thing!
Read part one of Beth’s take on the Eclipse Soundscapes app