Firestarters transcript

Author
Stuart
Category
blog
Date

On Tuesday, 21 June I gave a speech at the Google Firestarters 20 conference. These conferences have been happening a couple of times a year for the past four or five years. The theme of this year’s conference was Artificial Intelligence (AI) or Intelligence Augmentation (IA), and it looked at the impact robotics and artificial intelligence may or may not have on humans as we move into the future.

It was a great conference with only four speakers, and the other three speakers were incredibly interesting people. The event is organised by Neil Perkin at Only Dead Fish and is definitely worth checking out if you ever have the opportunity to do so.

Anyway, here’s the transcript of my speech:

Transcript

Robots robots robots. I’ve come here today to talk about robots. In 2014 I founded a project called Robots and Cake, on the grounds that those are both good things to have, and we work to bring together roboticists, researchers and disabled people across Europe and the world.

So I go around talking to people about robots and working with coders and roboticists on things like interfaces for flying drones remotely, over the internet, and driving robots around museums from my bedroom. And I get to use all these great robots and drones and it’s pretty much the greatest job in the world.

It was interesting thinking about what to say here today because, you know, we spend a lot of time talking to people about how to apply software development principles: agile, rapid revision, iterative problem solving… to health and social care.

But I’m guessing you are all sold on Agile.

But what I can bring you is something pretty interesting that perhaps you’ve not considered before… There’s a group of people that have been living in the future their whole lives.

Disabled people are more aware than most of the exact places where they meet the world and try to interact with it. And they’re more aware than most that the place where the self meets the world can change: the world can be at the end of your fingertips, but it can also be at the end of a lever or a screwdriver, or the other side of a computer screen - hello! Right now I’m here because technology is allowing me to be: I can talk to you because of Google Hangouts - almost like a transporter, it’s jumping me out of my spaceship and onto your planet.

Most of the technologies I use weren’t built with disabled people in mind, but you can see now how important they can be. With a video link, I can beam into a hospital appointment, a job interview, a conference. Here I am on your screen, which would have been impossible just a few years ago. You would never have met me a few years ago. Nobody met me. I was stuck lying on one side in one room looking out of one window and that was the world. There are still people locked up all over the country, all over the world, who no-one can hear from because they just physically cannot get out of their house.

And that sucks for them but it also sucks for you, because some of those people have stolen a march on this new 21st century challenge of the dislocated self, of the distributed self, of physically interacting with the world at a distance using technology. Quadriplegics and other severely disabled people have been living this future for a long time.

It’s not a new concept - You guys extend your bodies all the time and do impossible things: you put on gloves and reach into your oven, you put on goggles and see underwater, you throw your voice down telephone lines, into the air, into space!

You know, I sometimes go around different places in an actual robot. The one I’ve used most is called a Beam, and it’s like a four foot high friendly iPad on wheels. It allows me to physically move around spaces, environments, places I otherwise couldn’t: here I’m using robots to physically access the world and to physically join in.

And now I’m talking, really, about the extensible self.

The networked robot, the remote controlled drone, lets me see things I wouldn’t have been able to see, and do things I wouldn’t have been able to do.

For all of us, technology is the difference between a small universe and a big one.

And here’s a second thing you might not have thought of: sometimes a robot is better. It can look behind itself. It can zip down narrow corridors. It can fly.

It doesn’t need to breathe on Mars or at the bottom of the ocean.

I mention this because when able-bodied people think about disability, they often focus on replacing the broken body part. And don’t get me wrong, hands are great - but not when you want to put them into an oven.

People confuse the method with the goal. My legs don’t work, but do I need to walk to the shops? Or do I just need shopping? Does it matter that I can’t do things in the usual way? I don’t think so! That confuses the method with the goal - confuses the way you do things with what you want to accomplish.

Do you see? I’ve extended myself through the robot like a hand waving a flag out of a window: You can see me! I’m reaching out into the world through this screen.

Do I need to be there to actually be there? No.

I haven’t left this town since I moved here in 2014, but via telepresence I go to art museums in California and South Dakota, and I hang out in the cafeteria at Brown. I visited the Computer History Museum in California, and it was geek heaven! I went to the National Museum of Australia using their museum robot, which is quite different to the Beam - check that out too.

With telepresence it’s easier to go to Australia than it is to go up my road. That’s the thing. That’s not an exaggeration by the way. Our train station has a flight of steps, so that’s that. The end. Even back when I was much less disabled, I had never turned left out of my own street because there was this one curb that was nine inches high and I couldn’t get over it. It didn’t matter what was accessible beyond that. That curb was the end of the line.

When I say this, people immediately say, “well, the train station should be accessible,” and yes, it should! We should fix that. But you know, making individual institutions and buildings accessible is really important, is a great first step, but right NOW, each accessible building is not connected to an accessible world anyway.

It’s like having the first telephone: the signal can’t carry because there are no other telephones to call. In fact you could call this a last mile problem. This might sound obvious but unless every single metre between where I am and where I’m going is accessible, then none of it is. Here’s an example: I used to live in Manchester - I just moved to my amazing accessible smarthome in Yorkshire in 2014.

So for the last decade I lived less than half a mile from the Whitworth Art Gallery, Manchester Museum – loads and loads of wonderful buildings and cultural institutions and all with ramps and lifts… but I never went to any of them because I couldn’t get out of my own house.

And that’s true for a lot of people, in one way or another. Of course telepresence isn’t a solution for everybody, and it shouldn’t replace accessible architecture, but it can add to it. It can be another way in. It’s another way to be flexible.

I had to get to work this morning. I have to live in this world that we have right now. I can’t just put my life on hold until we’ve completely restructured the public environment. I mean, I have tried that but there are only so many box sets available.

[PAUSE FOR LAFFS I MEAN IT]

So the “extensible self” means using technology to move through the places you can’t go. It means your eyes are looking down out of a Parrot AR drone as it flies over the trees. It means your ears are listening to a mouth in New York speaking into a phone. It means your hands are doing surgery with a robot arm. This isn’t new. We’ve been doing this for years, one piece at a time, but now let’s start PUTTING everything together.

Once you start thinking about the ways in which we already act - and sometimes act better and more efficiently - at a distance, you can think about designing better. When you’re paralysed you’re obviously highly motivated to do this, and you’re also deeply experienced, because when you can’t move your body at all, everything is action at a distance.

Every single physical effect you exert on the world must be articulated to, and executed by an external actor, whether that be another person or a smart lightbulb or a voice dictation app.

The lightswitch on my wall may as well be on the moon for all I can reach it.

A lot of adaptations are about “pushing the button” - about, conceptually, replacing the hands I can’t use. But you know that saying, when all you have is a hammer, everything looks like a nail? Well, hands are good at turning knobs, dials, lightswitches… all tools made for hands! But they just aren’t necessary for triggering processes. I don’t actually need to reach the lightswitch on my wall. I can buy a smartbulb from LIFX or Limitless, one that is scriptable, and script it to turn on and off as it gets light or dark. Or I can pull that data - sunrise in my location - freely from the Weather Channel on If This Then That. I can simply hook the lightbulb up to the sunset and it can sort itself out when it gets dark. I don’t need to touch anything or think about it at all.
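The sunset wiring above boils down to one tiny decision. Here’s a minimal sketch of just that decision in Python - purely an illustration, not the actual setup, which used LIFX’s own scripting and the Weather Channel trigger on If This Then That rather than any code of mine:

```python
from datetime import time

def light_should_be_on(now: time, sunrise: time, sunset: time) -> bool:
    """Return True when it's dark: before sunrise or after sunset."""
    return now < sunrise or now >= sunset

if __name__ == "__main__":
    # With sunrise at 06:45 and sunset at 21:30:
    sunrise, sunset = time(6, 45), time(21, 30)
    print(light_should_be_on(time(22, 0), sunrise, sunset))  # evening: True
    print(light_should_be_on(time(12, 0), sunrise, sunset))  # midday: False
```

The point isn’t the code - it’s that once the trigger is data instead of a finger on a switch, anything can supply it.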

Do you see the difference?

You don’t need to imagine all the possible uses for your product. You just need to design your product openly, with enough wiggle room, so that people can adapt it to their own purposes. Sometimes those purposes are radically different from your intention, because some people live radically different lives with needs you can’t possibly understand or predict.

So, I lost the ability to type quite slowly. I went through a whole load of bizarre keyboards and arm rests and so on, but eventually it was all gone. This was in the noughties, so there wasn’t really anything like Google Now, Cortana or Siri. I needed to get online - but how do you use a computer when you can’t type or touch it in any way? (I called my first blog Escapology because getting out of this situation really is like an escapology trick.)

Well, it turns out MacSpeech Dictate (now Dragon Dictate), which was developed for note-taking and dictation, also has a side function where, with keywords, you can trigger scripts and macros stored on your computer. So I worked out all the common things you do on a computer and dictated a little AppleScript to automate each task, then gave them keywords so I could launch the script for “send an email” or “search Google”, and that got me limping along.
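That keyword-to-macro pattern is simple enough to sketch. Below is a toy version in Python rather than AppleScript, just to show the shape of it; the macro name and URL here are my illustrative assumptions, not the original scripts:

```python
# Toy sketch of the keyword-to-macro idea: map spoken trigger
# phrases to small automation functions. (The real setup used
# Dragon Dictate keywords launching AppleScript files.)

from urllib.parse import quote

def search_google(query: str) -> str:
    """Build the URL a 'search google <query>' macro would open."""
    return "https://www.google.com/search?q=" + quote(query)

# Spoken trigger phrase -> automation it fires.
MACROS = {
    "search google": search_google,
}

def run_macro(utterance: str):
    """Match a dictated phrase against known keywords and run the macro."""
    for keyword, action in MACROS.items():
        if utterance.startswith(keyword):
            return action(utterance[len(keyword):].strip())
    return None  # no keyword matched; fall back to plain dictation
```

So saying “search google robots” fires the search macro with “robots” as the query - one keyword, one script, one task automated.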

So now I could Google, and if you can Google you can figure out anything. [light ting on teeth!]

But without Dictate’s afterthought side function, I would have been stuck using the main “accessibility” product designed for people with no hands, which is an onscreen keyboard on which you spell out everything, agonisingly slowly, letter by letter, by scanning through the alphabet in order over and over and clicking on the right letter. Seriously!

Omg

But the point is, this is actually a tremendous discipline for identifying and articulating physical processes, just like wireframing the entire world really.

Anyway, I could itemise every step I took out of that bed, but since I eventually built an entire smarthome, we’d be here forever. But it’s not just practical needs that can be met by extending the body - it’s real, human ones too. So I have to move on to the greatest thing I found in that little room: a TED Talk about drone flight. I saw these little quadcopters flipping and zooming around, and the speed of the feedback from the camera.

And I instantly saw how drones could give me back a sensation of movement: of how they could get me out of my house, my bedroom, my body … just out of there!

I just want to slow down so you can think about that for a sec, because there’s a real split between able-bodied people and disabled people when I tell that story. Able-bodied people go: “oh, cool”, but they don’t really feel it. Disabled people straight away see the appeal of ease, agility, and SPEED.

The idea of agility is mindblowing, really, when you are paralysed. You never move quickly. You never move easily. We call it disability speed - even things like wheelchair lifts go mindbendingly slooooowly. And you’re never, ever, in control. Just forget about being in control of anything. So this is really transformational stuff, to be able to zoom out through a window and into the air. And nobody planned it; nobody expected it; it’s just technology crashing into people and changing the world.

Now, I wasn’t the first person to think all this, and eventually I got in touch with Professor Chad Jenkins and his team at Brown University. They’re working on an extension to the Robot Operating System called EMPOWER; look it up, it’s really cool. They’re the ones that introduced me to the Parrot AR drone. Think about that: I can’t move my body, but I can fly.

Robots. Are. Awesome.

[TAKE AWAY:]

None of these things are a solution for everybody, because there are no solutions for everybody. There are, instead, lots of solutions: an explosion of possibility. But everybody can try to leave room for different approaches, for the solutions you haven’t thought of to the problem you haven’t imagined. The Parrot drone was never designed to help me escape that room, but they left just enough wiggle room so I could use it for that purpose. And there are tons of things out there that just need a little wiggle room. There are lots of right answers. There are lots of ways of doing things. You don’t need to imagine them all. You CAN’T imagine them all.

You’re looking at me and you’re thinking I’m trapped in my body, and yes, it’s true, I am! I totally am. But well, so are you, because you’re trapped in the PARADIGM of the body as the tool that you think you need to explore the world.

So take my hand; and let’s escape to the future together.

END
