[From the last episode: We looked at how long messages might need to be broken up into multiple packets to be sent and reassembled later.]
Today we do one of our occasional departures from the narrative to look at a current-event situation. The particular event that spurred this is the recent loss of an updated Boeing 737.
No Stalling!
The biggest issue here is the tension over whether the pilot or the airplane should be in control. There have certainly been circumstances where a particular pilot or other crew member proved him- or herself unworthy of the role. Thankfully, those events have been few.
The answer, too often, is to decide that the machines (in our context, a machine is anything that isn't human or living; that includes electronic equipment like computers and phones) may do a better job than the humans. And, for some tasks, that may be a reasonable conclusion. But when it comes to the big life-and-death decisions, total trust in technology – at least today’s technology – may be misplaced.
In the case of this airplane, the assumption was that the plane itself could prevent the pilot from climbing too quickly and stalling out – and stalling is something that no one wants. So, if it detected the danger of stalling, it could intervene and slow the climb, effectively tilting the nose of the plane down.
It does this with the help of a sensor (a device that can measure something about its environment; examples are movement, light, color, moisture, pressure, and many more). And, if everything is working correctly, it should work just fine. While, during normal day-to-day operation, that may never be needed (since the pilot would never take the plane close to a stall), in unusual circumstances – say, when something else is off and the crew is distracted – it could save lots of lives.
Assuming the technology is working correctly.
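If you like seeing things in code, here’s a minimal sketch of what that kind of stall-protection logic boils down to. To be clear, the threshold, the nose-down command, and the single-sensor setup are all invented for illustration; the real system is far more involved.

```python
# Toy illustration only; not the real system. The threshold, the nose-down
# command, and the single angle-of-attack sensor are invented for this sketch.

STALL_AOA_DEG = 14.0  # hypothetical angle of attack beyond which a stall looms

def pitch_command(aoa_sensor_deg: float, pilot_pitch_deg: float) -> float:
    """Return the pitch command the plane actually applies this cycle."""
    if aoa_sensor_deg > STALL_AOA_DEG:
        # The automation believes a stall is imminent, so it pushes the
        # nose down regardless of what the pilot asked for.
        return -2.0
    # Otherwise the pilot's command goes through untouched.
    return pilot_pitch_deg
```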
Um… Oops!
There has always been cockpit tension over when the pilot and when the plane should be in control. Neither is perfect, and it’s undoubtedly a delicate balance. In this case, the sensor failed, and so the airplane thought that they were in a stall when, in fact, they weren’t. They were simply taking off. The plane tried to avoid the stall by pushing the nose down – killing the climb.
This is when the pilot, realizing that something in the plane isn’t working right, would shut off the automatic feature and fly the plane manually. And pilots know how to override this feature. Or, at least, they used to know. The new updated version was, by all reports, both harder to override and unknown to the pilots. Meaning that, in the heat of emergency, the crew of this particular plane couldn’t override the behavior; they had to fight it. And, ultimately, they lost that fight.
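Picking up the toy sketch from above (again, every name and number here is invented for illustration, not the actual design): once the sensor fails high, every control cycle looks like a stall, and the only thing standing between the crew and a nose-down command is an override they know about and can actually use.

```python
# Same toy logic as before, now with a hypothetical override switch and a
# sensor that has failed high. Everything here is invented for illustration.

def pitch_command(aoa_sensor_deg, pilot_pitch_deg,
                  automation_on=True, stall_aoa_deg=14.0):
    if automation_on and aoa_sensor_deg > stall_aoa_deg:
        return -2.0            # automation pushes the nose down
    return pilot_pitch_deg     # otherwise the pilot flies the plane

failed_sensor_deg = 40.0       # stuck reading: looks like a stall forever

# The crew commands a normal climb, but the automation keeps killing it:
print(pitch_command(failed_sensor_deg, pilot_pitch_deg=5.0))        # -> -2.0

# The older behavior amounted to a switch the crew knew how to throw:
print(pitch_command(failed_sensor_deg, 5.0, automation_on=False))   # -> 5.0
```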
The first, and more minor, place to which this takes me relates to what we in the industry call the user interface (UI), or the human-machine interaction (HMI). (The user interface is the interface between something and the user of that thing. In a car, the pedals and steering wheel are all part of the user interface. You’re probably more familiar with the UI on a computer; if that interface involves graphics, then it’s a graphical user interface, or GUI, pronounced “gooey.”) Yes, the UI needs to be easy and intuitive (even though “intuitive” is often misplaced), but, mostly, we learn how a UI works and then proceed accordingly. In this case, the pilots thought they knew how to override the plane, but they didn’t – because the UI had changed without them realizing it.
Who’s in Charge?
But the bigger issue here, it feels to me, is a shift in the balance of who’s in control. There are numerous examples in the industry where technologists are deciding that they know what their users want better than their users do. And, with the advent of artificial-intelligence-in-everything, it gets even worse.
With the IoT (the Internet of Things: a broad term covering many different applications where “things” are interconnected through the internet), I’ll give a couple of somewhat extreme examples. You are free to argue with the specifics of the examples, but it’s the bigger takeaway that matters. I’ve often made fun of the notion of a connected toaster, since… well, who needs the internet to make toast anyway? But let’s assume that Behemoth Technologies has decided that this is what you want, manages to eliminate all non-connected toasters from the market, and makes sure the toaster won’t work without being connected. (You know, for security, which refers to whether or not IoT devices or data are protected from unauthorized viewers.)
In my made-up scenarios, let’s say that you have been feeling not so good in the belly, and that the old standby – burnt toast – will help to quell the rumbles. I could see one of two things happening:
- The internet consults health information and decides that burnt toast isn’t good for you, so it’s going to turn down the heat regardless of what you want. You’re welcome.
- The internet consults masses of data that show that most people don’t like burnt toast. Clearly you must have made a mistake, so we’re going to do what we think you really want and make regular toast. You’re welcome. In fact, they might even remove the doneness control, since they (think that they) can do it better. (A toy sketch of this follows the list.)
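Here’s that second scenario as a toy sketch. The service, the doneness dial, and the numbers are all made up; the point is just the pattern of a remote service quietly overriding your choice.

```python
# Toy sketch of the "we know better" pattern; everything here is invented.

POPULATION_AVERAGE_DONENESS = 5   # "most people don't like burnt toast"

def cloud_adjusted_doneness(requested: int) -> int:
    """What Behemoth Technologies' imaginary toast service might send back."""
    if requested > 7:
        # Clearly you must have made a mistake, so you get what the data
        # says you really want. You're welcome.
        return POPULATION_AVERAGE_DONENESS
    return requested

print(cloud_adjusted_doneness(9))   # you asked for burnt; you get 5 instead
```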
This idea that, somehow, AI (a broad term for technology that acts more human-like than a typical machine, especially when it comes to “thinking”; machine learning is one approach to AI) – or technology in general – can solve all of our problems better than humans can is flying all over the news these days. And it creates an environment where the harder-to-override airplane technology might be considered a feature.
A Delicate Balance (If Done Right)
Now… to be clear, I am not going to claim that I know what was going on inside Boeing engineers’ heads when this decision was made. And I’m pretty sure that they were doing this in good faith, thinking they were helping. And, in many cases, I’m sure they did help. Just not in this one case.
But it’s that environment, where we assume that, ultimately, technology will improve human lives by removing the humans from the decisions, that’s troubling. Yes, this isn’t a clear black-and-white issue, and certainly within industries like the airline industry, I’m sure there are serious conversations going on about this.
But for consumer IoT items, there’s much less at stake for the user (so you didn’t get your desired toast doneness; boo-hoo!). And there’s lots of money to be made. And it’s tempting for companies to have the machine control as much as possible, not just out of a sense of “we know better,” but, even worse, out of a desire to control our behaviors in a way that ultimately makes them more money.
So, when you look for new, smart technology, make sure that the smarts are working for you, not against you. We have these amazing brains that are by no means anywhere close to perfect, but they do a pretty good job. While we also have some pretty amazing technology, make sure that your technology isn’t cutting your brain out of the loop. Smart technology can save you effort, it can save you time, it can make your life better. Just make sure it’s not taking away your ability to make your own decisions. Despite the Silicon Valley bluster, machines don’t always know best.