March 21, 2023

Frankenstein vs. the Gingerbread Man

By Leah Brown

In a previous post, we mentioned how the new book by Mark Schwartz, Adaptive Ethics for Digital Transformation, brings up the metaphor of Frankenstein vs. the Gingerbread Man. But what in the world is he talking about, and how does it apply to adaptive ethics and digital transformation?

Metaphors matter when we’re framing ethical issues. In the news and on social media we see a lot of people talk about artificial intelligence or genetic engineering as a sort of Frankenstein’s monster—out of control, sociopathic, and malevolent. 

But, as Mark asks in his book, what would happen if we thought of runaway AI as more like the gingerbread man? Like Frankenstein’s creature, the gingerbread man escapes and runs away, but unlike Frankenstein’s creature, he doesn’t wreak havoc on the surrounding villages. At most, he makes fun of his pursuers until he is at last destroyed…uh, eaten.

But still, what does this have to do with ethics? Let’s look at what Mark has to say in the Introduction to his new book, Adaptive Ethics for Digital Transformation.


Anyway, it’s worth going back to Mary Wollstonecraft soon-to-be-Shelley’s novel Frankenstein to see what it actually says, because it says some useful things for our discussion here.

First, about Frankenstein: Frankenstein is not actually Frankenstein. I mean that Frankenstein is the name of the scientist, not the monster. In fact, there is no monster; Frankenstein’s creature is referred to as “the creature.” That’s “creature”—a word with the same root as “create.” He’s also called “fiend” and “wretch,” just as I am in some circles. And Frankenstein is not a doctor. He’s just referred to as Victor Frankenstein. 

The book is not about a monster who escapes from Frankenstein’s control. Victor is disgusted by the creature’s ugliness and disturbed that he has done something as unnatural as giving birth to an adult, so he runs away, leaving the creature to fend for itself. “Accursed creator! Why did you form a monster so hideous that even you turned from me in disgust?” it asks. The creature—really pretty ugly by anyone’s standards, since he’s made of other people’s body parts sewn together—has trouble making friends and finding love, and swears vengeance on Victor after trying to negotiate with him to create a female companion. 

Frankenstein—Victor, I mean—has turned his back on his child, which was no more considered good form in 1818 than today. He’s refused to take on the obligations of creature-rearing, including his responsibility for the creature’s ethical development. “The wicked are like God—they too do as they please,” says the sacred Tamil Kural of India, and Victor has indeed usurped the role of God—and, incidentally, of mothers. 

We learn about Victor’s moral failings—not the creature’s—as the wretch/fiend goes on his killing spree, much as we learn about our own biases as we examine the activities of our artificial intelligences. To read Frankenstein as the story of an escaped and evil creature and worry that AI might be his cousin is to miss the point.

While Victor remains a morally deficient human being, the creature becomes a better “person” as he learns empathy and kindness by observing other humans. This theme of the creature as Victor’s “double” runs throughout the book, much as the theme of artificial intelligence as our double, subject to the same biases and errors as humanity, runs through discussions on responsible AI. Even a superficial reading of Frankenstein suggests that we should hug our robots and artificial intelligences, not recoil from them. We should teach them to be kind. We should nurture them.

Frankenstein is a very personal book about the responsibilities of creators, written at a time when Mary Wollstonecraft was creating little human beings in the usual, natural way with her kooky but apparently good-looking poet-boyfriend, Percy Bysshe Shelley. One of her babies had already died in its infancy, and she was about to lose two more. Victor is a caricature of a Romantic-era hero, as was perhaps—ahem—Percy himself. 

Frankenstein is a fable about creation—like entrepreneurial creation in our emerging digital world. Long discriminated against for his deformities, Frankenstein’s creature has been unfairly maligned, misnomered, and—you’ll agree—mis-mental-modeled.

The Gingerbread Man

The gingerbread man, unlike Frankenstein’s creature, does escape his creator—the cook. The tale begins when the birthday dessert she’s baking for little Billy jumps out of the oven and runs away. The cook chases the gingerbread man and is soon joined in the chase by her husband, a neighbor, a postal worker, and various other upright citizens, not to mention a dog, a cat, a monkey, and a fox. The gingerbread man not only outruns them but taunts them, turning back now and then saying “Run, run, as fast as you can. You can’t catch me, I’m the gingerbread man!”

At this point there’s some dispute about the historical record. In some accounts, a fox tricks the gingerbread man into riding across a river on his back, and then eats him. According to other sources, the gingerbread man is finally caught and fed to Billy, who eats him one limb at a time (“Ouch! There goes my leg!”). In a related Eastern European story, The Kolobok, the fox tricks the Kolobok by praising his singing. In the German version, The Thick Fat Pancake, the pancake allows itself to be eaten by two hungry orphan children. In any case, the gingerbread man’s virtue—tastiness—is finally realized, to the benefit of humanity.

What if artificial intelligence is not Frankenstein’s creature but an escaped dessert, thumbing its nose at us comical, famished, slow-moving diners? A slice of pizza making fun of the sleep-deprived software developers coding their gradient descent algorithms?

Restatement of the Problem

If you’re following me—rather than chasing dessert—here’s what I’m getting at. Digital transformation requires that we change our ethical assumptions about business, assumptions we’ve long taken for granted, because they are largely inherited from legacy bureaucratic ways of thinking. If you don’t believe me, please suspend your disbelief until the next chapter, when I’ll explain.

Those ethical assumptions structure our everyday ways of acting in a business. Ethical decisions are not generally big-picture choices between good and evil—restraining Frankenstein monsters—but the scads of everyday, small matters that cross our desks or flicker up on Zoom. We struggle every day to manage conflicting imperatives, many of which arise because we are squatting in both the digital and bureaucratic worlds. 

Because innovation is such an important part of the digital world, we are constantly releasing little Frankenstein critters and ambulatory desserts into the world. Let’s not close our minds in fear but rather engage with them. It turns out that they have a lot to teach us—about ourselves.

Intrigued? Read more about the ethics of the digital world in the forthcoming book from Mark Schwartz, Adaptive Ethics for Digital Transformation: A New Approach for Enterprise Leaders (Featuring Frankenstein vs. the Gingerbread Man).

About the Author

Leah Brown is Managing Editor at IT Revolution, working on publishing books for the modern business leader.
