Science Fiction Prototyping. Brian David Johnson
The story went like this:
There is trouble at the Piazzi mine on the dwarf planet Ceres 1. The facility’s all-robotic mining crew is behaving strangely. One day a week, they are shutting themselves down completely. No work is getting done. The owners of the mine are losing 2.5 billion euros a week. Something has to be done.
Dr. Simon Egerton, a roboticist and freelance investigator, is brought in to find out what is really happening. Egerton and his bodyguard, Nigel Kempwright, are given two days to uncover the mystery.
“What is all this nonsense about mining bots going to church on Sundays?” Kempwright tugged on his shaggy blond hair. “Are they serious? I mean really. Are they serious? Who’s the bright guy who can’t just reprogram them or something?”
“They tried that I think,” Egerton replied. “It didn’t work. I guess everything they’ve tried hasn’t worked.”
Landing at the Piazzi Mine … was like descending into the mouth of hell … the place looked like a giant worm had burrowed deep into the rock, leaving its eggs along the interior of the shaft to hatch and continue its work. These eggs were the buildings and outbuildings of the mining facility, screwed precariously into the sides of the tunnel, and connected by a maze of spidery catwalks and walkways. Two massive mining chutes dove down into the blackness of the pit.
Egerton and Kempwright emerged from the battered shuttle to the vast loading deck of the Piazzi Mine … The deck was crowded with the tremendous robotic loaders, recklessly filling mega-haulers. Their shuttle was dwarfed by the size of the operation and Egerton and Kempwright were barely visible amongst the giants.
Everything looked normal. Egerton imagined the AI agent sub-systems orchestrating every action of the various bots and machines; reacting to the mine’s changing conditions, updating, correcting, always taking in information and reacting. When Egerton thought of the dirty little mine in that way it was quite delicate and beautiful; hardware and software dancing elegantly together.
“This is our only hotel at the mine,” the security bot said stopping in front of a small shabby trailer. “It doesn’t get much use as you can imagine.”
The automatic doors hissed open and Egerton could see another bot waiting for them inside.
“And what’s that?” Kempwright asked, still standing on the narrow street. He pointed to a massive warehouse directly opposite the mine’s main shaft.
Egerton stepped back out into the street and instantly saw the massive structure. It stood out not because of its size but because it was spotless in a swarm of grime.
“That is the church,” the security bot replied calmly.
The church was larger than he had expected. The calm air smelled of industrial lubricant and electricity. Across the expanse hung a fifty-foot electric-yellow cross, crudely constructed out of two mine support beams. Lined up in front of the cross, in massive, neat rows, was every bot from the Piazzi mine.
Moving down the center aisle, Egerton could see it wasn’t just the ambulatory bots, the ones that moved freely, that had come. It was all the robots. Even the massive diggers had been unbolted from the mine shafts and carried in; the wheel-less, the legless, the immobile; no bot had been left behind. Egerton found himself searching the air for signs of the unseen nano-bots.
Stopping mid-way up the aisle, Egerton knew he had to get back to catch the shuttle. Easing back to the door with quiet reverent steps, he searched the bots for any sign of activity. Their optical scanners were closed, their activity lights dimmed. Nothing moved, nothing stirred. It was so still that Egerton could hear the blood coursing through his ears.
* * *
“Well, then I guess we aren’t understanding each other then because it sounds like you’re telling me that our mining bots now somehow believe in God.” XienCheng’s [the mine’s administrator] jowls shook with frustration.
“It’s not that,” Egerton tried again. “It’s not God. They don’t believe in God. They believe in going to church, in the action of going to church.”
“But that doesn’t make any sense,” XienCheng interrupted.
“Yes!” Egerton slammed his hand down on the table. “Now you’re getting it.”
XienCheng drew back in an awkward fear. “No. No, I’m not.”
“They aren’t doing anything in the church,” Egerton explained. “They aren’t worshiping God or saying prayers or holding a service; they’re just going to church. It’s the action and it’s not supposed to make sense. That’s the point. For some reason they latched onto the idea but now they need it. It keeps them safe.” (Johnson, 2009)
Mystery solved! The robots at the Piazzi Mine were going to church precisely because it was irrational. Like some human behavior, irrational actions can be as constructive as rational decisions. Most people think that robots and artificial intelligences (AI) can only make rational decisions. But we humans make good decisions and bad decisions, and it is because of this that we learn much faster than if we only made “correct” or rational decisions. It is this complexity of learning that allows humans to operate in highly complex environments.
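The idea that occasional “irrational” actions can improve learning has a well-known counterpart in machine learning: the exploration–exploitation tradeoff. The sketch below is not from the book or from the Egerton–Callaghan paper; it is an illustrative epsilon-greedy bandit experiment (arm payoff probabilities and parameter names are invented for the example) showing that an agent which sometimes acts at random, against its current best estimate, ends up earning more than a purely “rational” greedy agent that never deviates.

```python
import random

random.seed(0)

# Two-armed bandit: arm 1 pays off more often, but a purely greedy agent
# that never explores can lock onto arm 0 and never find out.
TRUE_PROB = [0.4, 0.6]  # hypothetical payoff probabilities

def pull(arm):
    """Return a reward of 1.0 with the arm's true probability, else 0.0."""
    return 1.0 if random.random() < TRUE_PROB[arm] else 0.0

def run(epsilon, steps=5000):
    """Average reward of an epsilon-greedy agent over `steps` pulls."""
    counts = [0, 0]
    values = [0.0, 0.0]   # running average reward estimate per arm
    total = 0.0
    for _ in range(steps):
        if random.random() < epsilon:
            arm = random.randrange(2)                 # "irrational" random pick
        else:
            arm = 0 if values[0] >= values[1] else 1  # "rational" greedy pick
        reward = pull(arm)
        counts[arm] += 1
        values[arm] += (reward - values[arm]) / counts[arm]
        total += reward
    return total / steps

greedy = run(epsilon=0.0)     # never explores
explorer = run(epsilon=0.1)   # acts at random 10% of the time
print(f"pure greedy agent:  {greedy:.3f}")
print(f"10% exploring agent: {explorer:.3f}")
```

Because the greedy agent starts with tied estimates and breaks the tie toward arm 0, it never samples the better arm at all; the exploring agent’s seemingly pointless random pulls are exactly what reveal the better option. In the same spirit, the mining bots’ trip to church is useless in itself but keeps their behavior from collapsing into a single rigid pattern.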
Now you might be saying: “But that’s just science fiction!”
But you would be wrong …
The following abstract is from a paper with the impressive title: Using Multiple Personas in Service Robots to Improve Exploration Strategies When Mapping New Environments. This research was the work of three scientists: Dr. Simon Egerton, Dr. Victor Callaghan, and Dr. Graham. [Note: You may have noticed that the name of the main character in Nebulous Mechanisms and one of the authors of the research paper are the same: Dr. Simon Egerton. This was originally done as a homage to Dr. Egerton, as he was the first person to explain to me the work that the team was doing over a pint of beer in Seattle, Washington. When I named the character, none of us realized that the SF prototypes would be so popular and effective, spawning a whole series of stories. More about that in Chapter 7. If it is any consolation—the main character of the stories pronounces his name “Egg-er-ton,” whereas the good doctor pronounces his name “Edge-er-ton.” I am indebted to Dr. Egerton (“Edge-er-ton”) not only for his mind-expanding research but also for being so understanding and having an excellent sense of humor about me dragging his name all over the world.] Published in 2008, the paper explains the emerging research by the scientists, who were using a concept from philosophy and psychoanalysis in their robotics and AI development work.
Abstract
In this paper we describe our work on geometrical and perception models for service robots that will support people living in future Digital Homes. We present a model that captures descriptions of both the physical and perceptual environment space. We present a summary of experimental results that show that these models are well suited to support service robot navigation in complex domestic worlds, such as digital homes.
Finally, by way of introducing some of our current, but unpublished, research we present some ideas from philosophy and psychoanalytic studies which we use to speculate on the