The Second Machine Age Keynote Transcript

Author: Stuart
Category: blog and press

On Thursday 22 January I gave the keynote speech at a conference called The Second Machine Age, part of an ongoing series called Digital and Technology Networking Events. The event is organised by the Digital & Technology Fast Stream Forum, and the series discusses key future technologies and how they might impact the UK, and the UK government in particular.

I had a really nice time giving the speech: they laughed at the jokes and clapped at the end, which is pretty much what you want! There were also lots of excellent questions afterwards, and I would love to work with this group again. 10/10 would definitely recommend. :-)

I want to say a special thank you to my speechwriters, you know who you are and you guys are AWESOME!

Transcript of my Keynote

[START]

[INTRO]

Hello there, hello, hi. Thank you so much for inviting me.

[wait]

Robots robots robots. I’ve come here today to talk about robots. Last year I founded a project called Robots and Cake, on the grounds that those are both good things to have, and we work to bring together roboticists, researchers and disabled people across Europe and the world.

So I go around talking to people about robots and working with coders and roboticists on interfaces for flying drones remotely, over the internet. And I get to use all these great robots and drones and it’s basically the greatest job in the world.

[HISTORY]

So let me back up a minute and give you some history of how I got into this thing.

On the 23rd of November 2009, the universe was 1.8 metres long by 60cm wide. That was the size of my universe. I’d just gotten out of hospital and was now completely quadriplegic. Actually, I was turned over to face the other side of the bed every two hours, so you could say the universe was 2.2 metres by 1.5 metres. It is a king-size bed, after all.

I think you’ll agree that’s a really small universe. I was trapped in it for a long time, and I am incredibly lucky to have escaped.

So how did I get from there, immobile in that bed, to here talking to you? Well, with a lot of luck, a lot of help, and a lot of technology. Yep: the appliance of science! I got here with a critical mass of interconnected technologies.

I’m going to take you through a few of them.

So I needed to get online, which meant I needed a computer, but how do you use a computer when you can’t type or touch it in any way? (I called my first blog Escapology because getting out of this situation really is like an escapology trick. I’d like to see Harry Houdini try this!) Well, it turns out there’s software called Dragon Dictate, which was developed for note taking and dictation, but it also lets you use keywords to trigger scripts and macros stored on your computer. So I figured out all the common things you do on a computer, dictated a little AppleScript to automate each task, and gave each one a keyword so I could launch the script for “send an email” or “search Google”, and that got me limping along.

From there I found a bit of software called Keystrokes and a Buddy Button, which was a button I could actually press with the movement I have left in my hand, about 5mm of conscious control, and then I was really cooking with gas because I had most of a mouse.

And there, actually, for a long time, I stayed. And that was because there was no way to persuade anyone that any of my problems were solvable or that there was any possible improvement to be made. Now, not all problems can be solved. I can’t fix my spinal cord. But just because you can’t do everything doesn’t mean you can’t do anything. My legs don’t work, but do I need to walk to the shops? Or do I just need shopping? Does it matter that I can’t do things in the usual way? I don’t think so! I think that confuses the method with the goal, confuses the way you do things with what you want to accomplish.

But the truth is when you’re quadriplegic, unless you can persuade people to help you, your hands are tied. Your feet are tied. Your knees and elbows and shoulders are tied. You are pretty much tied.

So I got my next break when I got an NHS CHC Personal Health Budget and, instead of having to receive whatever pre-determined assistive technology the NHS would allot me (and I could seriously tell you some stories about that lot), they stopped fighting me and started letting me use their resources more effectively. So no more panels, no more forms to fill in, no more weird and obstructive interdepartmental penny pinching, just the whole budget, turned over to me. And you know what, I cost them less money this way, too!

[CASE STUDIES / HOME-LIFE ACCESSIBILITY]

Let me give you some examples of the kinds of things that are possible, with technology and resources. These solutions won’t work for everyone - that’s not what I’m suggesting. What I am suggesting is that technology and resources can be used to create customised and flexible solutions to problems.

It took less than 18 months of a personal health budget for me to get up out of bed and go back to work. I had been on the sick for ten years and was in an end-of-life scenario. I could not even get a wheelchair as, among other reasons, the NHS thought I was too disabled to use one.

One of the first things I did was commission a robot laser arm to come to my house and scan me! With the scan I had a company make me a giant foam seat that totally supported me in an upright position. We put that on the bed and I got myself a whole new wall to look at. With the foam seat, and a hoist, and a physio, and a ton of work, I eventually got myself hoisted out to sit next to the bed in a wheelchair I purchased myself second hand. Unfortunately, the wheelchair was too big to go through the doorway of my tiny bedroom, so it didn’t get me that far, but now I could see out of the window.

So I’m sitting in this room and I can’t turn the lights on myself, so if it gets dark and there’s nobody about, I just have to sit here in the dark until someone gets back, which is rubbish. So I found these lightbulbs, LIFX, that are wifi connected, so I can turn them on with an app. Which is great, except the app is fixed sideways on my iPad and I can’t turn the iPad round. So I can’t use the app. Argh!

But you know what? It’s okay they didn’t think about me, because these lightbulbs are also scriptable. They left the door open a crack: I can write some JavaScript to turn the lights on when the sun goes down. And I can get that data - sunset in my location - freely from the Weather Channel via a web service called If This, Then That. When you lower the barriers, even by just a little, you allow people to solve their own problems. That’s how the light gets in.
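
For the curious, here is roughly what that kind of script looks like. This is a sketch rather than my actual setup: it assumes the LIFX HTTP API, a personal access token and a light group name (all stand-ins), and it stands in for the sunset trigger that If This, Then That handles for me.

// sunset-lights.js - a rough sketch, not my actual setup.
// Assumes the LIFX HTTP API and a personal access token (LIFX_TOKEN);
// in practice If This, Then That fires the "sunset" trigger, but you
// could just as easily compute sunset locally and call this yourself.

var https = require('https');

var TOKEN = process.env.LIFX_TOKEN;   // hypothetical access token
var SELECTOR = 'group:Bedroom';       // hypothetical light group

function setLights(state) {
  var body = JSON.stringify(state);
  var req = https.request({
    hostname: 'api.lifx.com',
    path: '/v1/lights/' + SELECTOR + '/state',
    method: 'PUT',
    headers: {
      'Authorization': 'Bearer ' + TOKEN,
      'Content-Type': 'application/json',
      'Content-Length': Buffer.byteLength(body)
    }
  }, function (res) {
    console.log('LIFX replied with status', res.statusCode);
  });
  req.on('error', function (err) { console.error(err); });
  req.write(body);
  req.end();
}

// Turn the bedroom lights on, warm and dim, fading up over ten seconds.
setLights({ power: 'on', brightness: 0.4, color: 'kelvin:2700', duration: 10 });

If This, Then That’s job in my world is simply to run something like this when the Weather Channel says the sun has gone down where I live.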

[MAIN CASE STUDIES: DRONES AND TELEPRESENCE]

I could give you examples like this all day, but I have to move on to the greatest thing I found in that room, a TED Talk about drone flight. I instantly saw how drones could give me back a sensation of movement: of how they could get me out of my house, my bedroom, my body … just out of there!

Now, I wasn’t the first person to think this, and eventually I got in touch with Professor Chad Jenkins and his team at Brown University. They’re working on an extension to the Robot Operating System called EMPOWER; look it up, it’s really cool. They’re the ones who introduced me to the Parrot AR drone. I’ll show you some video of me flying some of these things later. Think about that: I can’t move my body, but I can fly.

Robots. Are. Awesome.

The Parrot drone isn’t meant for disabled people; in fact, out of the box I can’t fly it at all. But because it’s scriptable, because it’s hackable, I can find ways to use it myself. A couple of months ago I gave a speech at Wired2014, using NodeCopter, built on Node.js, to fly a drone there, from this office, over the internet. NodeCopter is brilliant, and so is the Parrot. But through Robots and Cake I’m working with Kevin Finisterre on Operation Quadricopter. Kevin is building a new drone, one designed to be flown with my head. Hooked up to an Oculus Rift (a kind of games console built into goggles), I will be able to actually experience flight.
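
To give you a sense of what “scriptable” buys you, below is a minimal sketch of a scripted flight using the node-ar-drone library that the NodeCopter community builds on. It isn’t the code we flew at Wired2014; it’s just the shape of the thing: a flight plan in a handful of lines of JavaScript.

// fly.js - a minimal sketch of a scripted flight, not the Wired2014 code.
// Assumes the node-ar-drone library ("npm install ar-drone"); the Parrot
// appears as its own wifi access point and the library talks to it directly.

var arDrone = require('ar-drone');
var client = arDrone.createClient(); // defaults to the drone's usual address

client.takeoff();

client
  .after(4000, function () {
    this.up(0.3);         // climb gently
  })
  .after(2000, function () {
    this.clockwise(0.5);  // slow spin so the camera looks around the room
  })
  .after(3000, function () {
    this.stop();          // hover
    this.land();
  });

Flying it over the internet, the way I did for Wired2014, roughly means running something like this on a machine near the drone and driving that machine remotely.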

None of this would have been possible with closed systems.

The last thing I want to talk about is telepresence. Right now I’m here because technology is allowing me to be: I can talk to you because of FaceTime. Technologies like Skype, FaceTime, Google Hangouts and remote-computing software are giving lots of people science-fiction-like powers - it’s almost like a transporter. But for me it’s crucial. These technologies weren’t built with disabled people in mind, but you can imagine how important they can be. With a video link I can beam into a hospital appointment, a job interview, a conference. So here I am on FaceTime, which would have been impossible even a couple of years ago.

But - and here’s the coolest thing - sometimes I go places in a giant robot. It’s called a Beam, and it’s like a four-foot-high friendly iPad on wheels. The Beam allows me to go places I otherwise couldn’t: I’m using robots to physically access the world, to participate in it. And now I’m talking, really, about the extensible self: the Beam lets me see things I wouldn’t have been able to see, and do things I wouldn’t have been able to do. It’s not a new concept - you guys extend your bodies all the time and do impossible things: you put on gloves and reach into your oven, you put on goggles and see underwater, you throw your voice down telephone lines, into the air, into space! We’ve gone to Mars with a robot. It’s not so different - for all of us, technology is the difference between a small universe and a big one. And sometimes a robot is better. It can look behind itself. It can zip down narrow corridors. It doesn’t need to breathe on Mars or at the bottom of the ocean.

But do you see? I’ve extended myself through this robot like a hand waving a flag out of a window: you can see me! Hello! I’m reaching out into the world through this screen.

I go to art museums in California and South Dakota; I hang out in the cafeteria at Brown. I visited the Computer History Museum in California, and it was geek heaven! A few months ago I went to the National Museum of Australia using their museum robot, which is quite different to the Beam - but check that out too.

With telepresence it’s easier to go to Australia than it is to go up my road. That’s the thing.

(That’s not an exaggeration, by the way. For ten years I never turned left out of my own street because there was this one kerb that was nine inches high and I couldn’t get over it. It didn’t matter what was accessible beyond that!)

So an “extensible self” means using robots to move through the places you can’t go. If you’re able bodied, that’s useful, that’s fun, that’s cool. But if your mobility is restricted, it’s life changing. It’s everything. Because often you can’t go anywhere.

I had kind of short notice on this talk today, or I would have come in a Beam, actually. I open the Beam app on my computer and choose which Beam to pilot. They show up in a list - all the Beams I have access to, in all their locations. There’s a video feed and there are guiding curves on the screen showing me where I’m going, or where the Beam is going, rather, so it’s really easy to drive. It’s an accessibility solution that we didn’t have five years ago - and a technology that most people still have no access to. In fact, I still can’t get most hospital consultants to even take my calls on Skype, so I often can’t get healthcare, in the UK, in 2015.

Making individual institutions and buildings accessible is really important, and a great first step, but each accessible building is not connected to an accessible world. It’s like having the first telephone: the signal can’t carry because there are no other telephones to call. This might sound obvious, but unless every single metre between where I am and where I’m going is accessible, then none of it is. Here’s an example: I used to live in Manchester - I just moved to my amazing accessible smart home in Yorkshire about three months ago. For the last decade I lived less than half a mile from the Whitworth Art Gallery, Manchester Museum, this actual building - loads and loads of wonderful buildings and cultural institutions, all with ramps and lifts, but I never went to any of them because I couldn’t get out of my own house.

And that’s true for a lot of people, in one way or another. Of course telepresence isn’t a solution for everybody, and it shouldn’t replace accessible architecture, but it can add to it. It can be another way in. It’s another way to be flexible. I have to live in this world that we have right now. I can’t just put my life on hold until we’ve completely restructured the public environment. I mean, I have tried that but there are only so many DVD box sets available.

[TAKEAWAY]

None of these things are a solution for everybody, because there are no solutions for everybody. There are, instead, lots of solutions: an explosion of possibility. Everybody can break down problems in this way, and everybody can try to leave some room for different approaches, for the solutions you haven’t thought of to the problem you haven’t imagined. The Parrot drone wasn’t designed to help me escape that room, but it left enough wiggle room so I could use it for that purpose. And there are tons of things out there that just need a little wiggle room. There are lots of right answers. The universe is a big problem. We can solve it and keep solving it, bit by bit, practically forever.

[END]

Tags: presentation, stuart, press, accessibility, hardware
