Light and Dark in Daily Deals

Dealfind.com, one of those daily deal Groupon clones that everyone got sick of, often posts questionable deals. Some are merely useless or frivolous (oh hi, Justin Bieber toothbrush), but others are actively deceptive.

One such deal was for a “Crystal Bala Bracelet With Magnetic Hematite Beads.” While the deal page is careful to avoid specific health claims, it does claim that “in Buddhism, the pañca bala, or Five Strengths are critical to the achievement of enlightenment. Now you can keep them close to you every day with the Bala Bracelet.”

How does a mere bracelet help you achieve enlightenment? Well:

“Crystals catch and refract the light every time you move [and] six beads of magnetic hematite polarize the effect of light and dark”

Sciencey yet spiritual! It must work. It’s not quite the magnetic bracelets you see at summer festivals that claim to cure cancer, but it’s still manipulative and deceptive.

Luckily, Dealfind has a forum to clear up any misconceptions about the products, so I dug a little deeper. Here’s my conversation:

Mike (me)

Can you provide a link to the peer-reviewed scientific articles supporting the claim “six beads of magnetic hematite polarize the effect of light and dark”? I’m sure they just got left off by accident. Thanks!

Amy (Dealfind Admin)

Hi Mike,

Thanks for your inquiry.
Our deal page states:

“In Buddhism, the pañca bala, or Five Strengths are critical to the achievement of enlightenment. Now you can keep them close to you every day with the Bala Bracelet. Each of the crystal-encrusted balls represents one of the bala: Faith, Energy, Mindfulness, Concentration and Wisdom. Six beads of smooth magnetic hematite provide the perfectly polarized color choice to offset the crystals.”

For more of a scientific background, please contact Widget Love at 1.800.990.6771.

Thank you!

Mike

I have to call them just to have any idea about whether or not the bracelet does what it says it does? 😦

Can you at least explain what “polarize the effect of light and dark” and “polarized color” even mean?

I want to know more about what I’m getting into before buying into this sca–…er…product. I’m afraid polarizing my dark could have serious medical effects.

Thx!

The above post was deleted shortly after I posted it. Later:

Mike

Oh fiddlesticks, I think my follow-up post failed to go through so I’ll post my question again:

Can you at least explain what “polarize the effect of light and dark” and “polarized color” mean?

Thanks!

Mesha (Dealfind Admin)

Hi Mike,

Thank you for your post.

In this sense polarized means that although the colours range from one extreme to another (both dark and light) they compliment each other and the crystals.

For more of a scientific background, please contact Widget Love at 1.800.990.6771.

I hope this helps! 🙂

Mike

Ah, so it’s saying “there are black rocks and white rocks but they are both rocks.”

Thanks! That clears up everything! I’ll take 50!

That post was deleted too.

Yeah, I’m kind of just being a dick. But trying to sell people bullshit (bullshit capitalizing on the perfectly respectable religion of Buddhism) is also pretty dickish. So screw Dealfind and the dickshit company they promote. It’s just a cheap bracelet, but every penny milked from gullible people through lies is a penny too much.


Why Horror Movies Are Scary, and Why People Like Them Anyway

A while ago, I was contacted by a PR agency that had seen one of my talks about the psychology of horror. A British media company was putting together a Halloween marketing campaign, and wanted advice on how to use scariness to make it more effective. I wrote them the summary below of why people regularly expose themselves to horror. I have no idea if the campaign ever went anywhere, but I figure it makes for an interesting read, so here it is.

Why are horror movies scary?

The answer to this is less obvious than it first appears. It might seem self-evident that scary movies are scary because they have scary things in them. But that just shifts the question to “what makes things scary?” Plus, fear is, by definition, an emotional response to danger. People sitting in a comfortable chair with their friends, munching on popcorn, are in no danger. They know they are in no danger.

So why are they scared anyway?

1) Because horror movies show us things that we were born scared of. Millions of years of evolution have programmed us to be frightened by things like spiders, growling monsters, and darkness. Early people who weren’t scared of these things tended to die, so they never got a chance to be our ancestors. With the survivors’ genes in us, we can’t help but feel the fear that kept them alive.

2) Because horror movies show us things that we’ve learned to be scared of. We may not be born scared of knives, needles, or clowns, but a few bad real-life encounters with them and we learn to fear them pretty quick. Movies can take advantage of the lessons we’ve learned from being scared for real.

3) Because we get scared when people we like are scared. Horror movies show us shots of people being scared just as much as they show us what is scaring them. When we’ve grown to like a character, we can’t help but feel some empathy for them when they appear to be frightened.

4) Because filmmakers exaggerate. No matter how realistic, a scary image on a screen pales in comparison to the real thing. That is why filmmakers need to exaggerate to make up for our safety from real danger. Extra dark settings, disorienting camera angles, anticipatory music, and discordant sounds (think the violins in Psycho) all make a scary image even scarier.

5) Because our bodies tell us we’re scared. For all the reasons above, our brains and our bodies are tricked into thinking we’re really scared. Our heart rates go up, we sweat more, and we breathe faster. These bodily reactions feed back into our conscious experience of fear. Furthermore, horror movies are one of the most visceral types of film. In one study, horror was one of only two genres that elicited a significant and identifiable physiological response (the other was comedy).

So why would people watch something that scares them?

Again, fear is an emotional response to danger. Usually one that makes us want to run away, or at least turn off the TV. Why would we not only keep watching a scary movie, but pay money to do it?

6) Because some people like the rush of being scared for its own sake. Studies have found that the more scared people report being during a movie, the more they enjoy it. For some fans of horror movies (but not everyone), excitement is fun, whether it comes from joy or fear. In my research, people high in sensation seeking—those who say they frequently seek out intense thrills—liked the horror genre more than people low in sensation seeking.

7) Because some people like the relief when it’s all over. The happy moments of a horror movie can be just as important as the horrifying parts. A moment of relief after escaping the bad guy can seem even more positive than it would normally, because our hearts are still beating with excitement. The leftover emotion from being scared can translate into happiness when the source of fear is gone.

8) Because you can control your image by controlling your reactions to a horror film. In my study, even though everyone had about the same “gut reaction” to horror imagery (a negative one), what they said they liked varied a lot. People with rebellious sorts of personalities were proud to say they liked horror movies.

9) Because it helps us hook up. Although they have the same negative “gut reaction” to horror, men say they like the genre more than women do. Research has shown that men and women who react “appropriately” to frightening films—men being fearless and women being fearful—tend to be liked by the opposite sex more. Horror films are perfect for dates.

There you go. Just a few of the many reasons that we’re happy to be horrified.

On Lying

I recently finished reading Sam Harris’s short essay on the topic of lying, which is called, no lie, Lying. In it, he explores the rationality of communicating things that are not true, and comes to the conclusion that it is wrong to lie.

Yeah. Obviously. But Harris goes further than what many people mean when they say “it’s wrong to lie,” arguing that even seemingly justified forms of lying, like little white lies, lying to protect someone, and false encouragement, are all wrong in their own way.

He’s convincing, for the most part. Take false encouragement: the lies we tell without a second thought, like “yeah, I love your blog, you are such a good writer.” It seems harmless, and it would be awkward to say otherwise to someone, but Harris makes a good point: “False encouragement is a kind of theft: it steals time, energy, and motivation a person could put toward some other purpose.”

I’ve always been a big believer that the truth is the fastest route to success, both on a societal level (hence my interest in science) and on a personal level. It would be easy to get carried away with this, becoming one of those people who spouts his opinion whether it’s asked for or not, and is rarely invited to the next party. However, I think it is possible to tactfully express the truth whenever asked to.

I appreciate blunt people. Others may not, but even they can be served well by the right kind of bluntness. If I tell you that yes, you actually do look like a giant turd in that brown dress (like really, brown dress? What were you thinking?), it might hurt at first, but when you show up to the party in a different dress and get genuine compliments rather than awkward false encouragement, you’re better off in the long run.

Harris also makes the point that lying is not only harmful to the people being lied to, but taxing for the liar. Keeping up a lie takes a lot of mental effort, since the lie exists only in the liar’s mind. Every time it comes up, the liar has to check against his memory of previous lies, who knows what, and how the lie affects everything else; he essentially has to store a whole new version of reality in his head, often fabricated in real time. The truth, though, is easy to keep track of; the truth-teller only has to keep track of one version of reality. The real one.

Many of these examples assume the people involved are regular, sane people, who ultimately just want to get along. Where Harris starts to lose me is when discussing situations where this arrangement breaks down. He discusses a hypothetical situation of a murderer showing up at your door looking for a little boy you are sheltering. Should you tell the murderer the truth? Harris argues that lying could have unintended harmful consequences; the murderer might go to the next house and murder someone else, or at best, the lie just shifts the burden of dealing with the murderer to someone else. Instead, a truth like “I wouldn’t tell you even if I knew,” coupled with a threat, could defuse the situation without a lie.

I’d argue that, when facing someone for whom cooperation and rationality have obviously broken down (e.g., a kid murderer), the known consequences of lying (e.g., saving a kid’s life) sometimes almost certainly outweigh the far-fetched unknown ones. Harris later makes this same point on a larger scale, when justifying lying in the context of war and espionage, saying the usual rules of cooperation no longer apply. I think blowing up a city with a bomb and stabbing a kid with a knife are both situations where cooperation has broken down, and both situations where lying can be a tool used in good conscience.

There are no absolute moral principles that work in all situations. Life is too complicated for that. Trying to summarize it in simple prescriptive rules (as many religions have) doesn’t work. So, the rule “lying is always wrong” can’t work. There are extreme situations where the rule breaks down.

Luckily, most people will never encounter such an extreme situation in their daily lives. This is where Harris’s main point is spot on: we should lie a lot less than we do. If everyone told the truth in every normal situation, relationships would be stronger, and people would be happier and more productive. I’ve certainly been more aware of my honesty since reading the book, so it’s fair to say it literally changed my life. That’s certainly worth the $2.00 it costs (buy it here). No word of a lie.

iBooks, eBooks, and Episodic Writing

Apple announced a new version of iBooks for iPad a few weeks ago, focusing on how it can deliver inexpensive textbooks to students. It’s being pushed as a revolution in education, but does the same update have applicability outside of the classroom?

Aside from the (often gimmicky) interactive widgets, electronic books offer another advantage: staying current. The main idea, in the context of textbooks, is that a new edition can be distributed inexpensively, without the need to buy a new 5-pound $300 book every year. I see potential for another use: episodic fiction.

Serial publishing is not new. When advances in technology and economics allowed magazines to be widely distributed in the 19th century, it was popular for authors to release long works in short segments. As magazines shifted their focus away from episodic fiction and television filled that niche, the serialized work of text started to die out (with occasional exceptions, like Stephen King’s The Green Mile). Today, we’re facing more leaps in technology and in the economics of distribution that, I think, have the potential to bring serial fiction back.

Imagine this: you hear about an author releasing a story with an intriguing premise. You download the first “episode,” then every, say, Wednesday, you get a notification alerting you that a new episode is out. Either for a small fee per episode (99 cents seems fair) or a flat “season pass,” you get new content every Wednesday for a few months, automatically updated and waiting for you when you open iBooks.

I’m not sure if this is how iBooks currently works (the new textbook stuff, as usual, locks out Canadians), but they seem to be going in that direction with the “books as apps” model. It’s not unique to Apple, either; the same thing could easily be implemented on any other e-reader with minor tweaks. It’s been attempted, but Apple’s app model demonstrates how streamlined it could be.1 And in a generation that often prefers TV to movies and Twitter to blogs, maybe we’re ready for bite-sized fiction’s big comeback.
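To make the idea concrete, here’s a minimal sketch of how the purchasing side of that model could work. It’s purely hypothetical (iBooks exposes no such API, and every name here is made up); it just models the per-episode fee and season pass options described above.

```python
# A hypothetical model of episodic book delivery -- not a real iBooks API.
from dataclasses import dataclass, field
from datetime import date

@dataclass
class Episode:
    number: int
    title: str
    release_date: date
    price: float = 0.99  # the per-episode fee suggested above

@dataclass
class Reader:
    season_pass: bool = False                     # the flat "season pass" option
    purchased: set = field(default_factory=set)   # episode numbers bought individually

    def can_read(self, episode: Episode, today: date) -> bool:
        if today < episode.release_date:
            return False  # not out yet; this is when the Wednesday notification would fire
        return self.season_pass or episode.number in self.purchased

# A season-pass holder gets every released episode automatically:
reader = Reader(season_pass=True)
episode3 = Episode(3, "Part Three", date(2012, 2, 15))
print(reader.can_read(episode3, date(2012, 2, 22)))  # True
```

The point of the sketch is how little machinery this takes: the hard parts are the store and the notifications, both of which the app model already handles.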

Would you buy a book that updates itself with new content every week? Really, I’m asking, because I have a few stories in the file drawer, and I’m seriously considering an experiment to turn these tumultuous times into something awesome.


1 Note that Apple’s new updates come with a giant catch: a ridiculous license agreement. The main problem is that if you use iBooks Author to create a work, you can only sell that work through iTunes. It’s equivalent to buying a guitar, then finding an attached note saying you can only sell your music through Gibson’s store. Ridiculous. Hopefully this gets changed, or people will realize there are simple workarounds (change one word in the file using different software; ta-da! An all-new work that can be sold wherever you want).

Book Review: Moonwalking With Einstein, by Joshua Foer

Memory is often taken for granted in a world where paper and transistors store information better than neurons ever could. Moonwalking With Einstein shines a much-needed light on the art of memorization. It could have been a dry collection of basic science and light philosophy on the subject, but Foer makes it riveting by telling the story of his own head-first dive into the world of memory as sport.

I had no idea this went on, but every year, there are regional and worldwide memory championships in which people compete to perform seemingly superhuman feats of memory, such as memorizing decks of cards as fast as possible, or recalling hundreds of random numbers. After covering one of these events, Foer became so curious that he began training to participate himself.

What he discovered is that these impressive acts of memorization actually boil down to a few simple tricks that anyone can learn. The book is not a how-to manual, but the tricks are simple enough that anyone can pick them up just by reading about how Foer learned them. I can still recall a list of 15 unusual items (in order) that Foer’s mentor, Ed Cooke, used to first teach the memory palace technique. It’s only a matter of practice and refinement for anyone, no matter how forgetful, to memorize several decks of cards.

This humanization of the extraordinary carries throughout the book. Foer himself keeps a modest tone about his damn impressive accomplishments, emphasizing that he’s just a regular forgetful dude who lives in his parents’ basement. The other memory championship contestants, too, can do amazing things during the contest, but it’s clear that the ability to memorize a poem doesn’t translate to a successful personal life.

In fact, Foer is critical of those who do profit from using memory tricks. His contempt for Tony Buzan, the entrepreneur who makes millions on books and sessions related to memory, comes through every time Buzan’s name comes up. He might as well add “coughBULLSHITcough” after every claim of Buzan’s. More substantially, a tangent on savantism takes a strange turn when Foer begins to suspect that one self-proclaimed1 memory savant, Daniel Tammet, may have more in common with the memory championship contestants than with Rain Man2. When Foer confronts him about it directly, things get a bit uncomfortable.

By wrapping fascinating facts and anecdotes about memory up with his own story, Foer keeps it riveting throughout. This is one of those books that I literally had trouble putting down. Anyone with even a passing interest in the human mind should remember to stick Moonwalking With Einstein in their brain hole.


1 And expert-proclaimed; psychologist Simon Baron-Cohen (yes, relation) studied Tammet and was more convinced of his traditional savantism.

2 The inspiration for Rain Man, Kim Peek, also makes an appearance, and is more convincing as someone whose freakish memory comes naturally.

Book Review: The Hunger Games, by Suzanne Collins

[Very minor spoilers lie ahead]

The Hunger Games tells the story of a teen girl living in a future, post-post-apocalyptic world, where an obviously-evil government keeps the people in line by throwing a handful of them into a televised fight to the death in an outdoor arena once a year. Although it doesn’t happen right away (more on that later), it’s pretty easy to guess that she gets involved in the titular Hunger Games.

The concept may sound like a sci-fi trope, but Collins does a good job of painting a world that feels unique despite borrowing pieces from other stories in its genre. The first third or so of the book is mostly setup for the inevitable beginning of the Games. It could’ve been boring, knowing the story is taking its time to begin, but it stays interesting thanks to the colourful character development, world-building, and writing style.

Then the action kicks into gear, and something odd happens. The writing quality drops the moment the Hunger Games actually begin. It’s as if the latter two-thirds were written by another author (or an author who wrote the first chapters years after the latter ones). What begins as straightforward YA-level prose starts sounding like a teenager’s blog. Rambling tangents come back around with “anyway”, ellipses replace proper punctuation, and there are outright typos. I half-expected sentences to start ending in “lol.”

It’s not too distracting, and makes some sense given the first-person narrator’s age, but the fluctuation in style was a bit jarring.

Anyway, the story itself is about what you’d expect given the premise. There is some mild satire of reality television and some mild violence (but this ain’t no Battle Royale). Some unexpected twists have impact, but some expected showdowns are a letdown. Maybe it’s a further subtle bit of satire to have some of the major plot points happen “off camera,” but it’s anticlimactic storytelling. Despite my pickiness, it’s a good story that’s often hard to put down, and anyone who’s on board with the premise will enjoy it. I’m looking forward to the movie.

This was also the first book I read on a Kindle. I have mixed feelings about that. On one hand, it was a joy being able to sit under a tree in the sunshine (yeah, it took me a while to get around to this review) with the tiny Kindle in one hand, flipping pages with the push of a button. On the other, when I wanted to skim the book while writing this review, I couldn’t. And what if I want to come back to it in 10 years? If technology changes too much, or Amazon bites the dust, or I get another company’s incompatible device, my DRM-infected e-book is lost.

I’ll probably only use the Kindle for cheap books I’ll never want to read again. Hunger Games fits that bill.

On Complaining About Technology

I don’t complain much, but when I do, it’s usually about technology. I have an unfortunate combination of bad luck and high standards when it comes to gadgets. Whenever I buy anything with greater complexity than lettuce, it has some flaw, either minor and only noticeable to my hyper-critical eye1 or a major defect2. At least nothing has outright exploded, though not everyone is so lucky (see: iPhone spontaneously combusts aboard flight). It’s tempting to become a cynical old ass, shaking my wooden stick (not a microchip in it!) and grumbling about how quality control has gone down the stinker and nothing works like it should. I’ve certainly given in to that temptation a few times.

But think of it this way:

We are a bunch of animals. We were crafted by nature to root around in the dirt, find food, then go home and screw. Yet we’ve taken some of that dirt and, with nothing more than our grubby hands and abnormally large brains, we’ve made tubes of steel that can fly us through the air. We’ve burned sand until we have a slab of glass that allows us to have food delivered to us by poking at it. “Technology” isn’t some mysterious black toaster that pops out perfect gleaming gadgets. We’re literally grabbing whatever imperfect raw materials we find lying around the planet, and sticking them into arrangements that accomplish things no other animal can fathom.

Arthur C. Clarke said that any sufficiently advanced technology is indistinguishable from magic. But the people crafting our gadgets (and crafting the machines that craft our gadgets) are not magicians. They are humans, animals, maybe smarter than you or I, but only by a little bit. They’re working with limited time and limited resources. They get tired. Sometimes the best they can do is try pretty hard, and hope it’s good enough.

Thinking of it this way, it’s odd to complain.

Maybe a dead pixel on my phone’s screen “shouldn’t” be there, but a bunch of strangers managed to get me 614,399 working pixels that beam all of humanity’s accomplishments directly into my eyes. I can probably live with it. I can probably manage to enjoy everything that works despite the small parts that don’t.

There are exceptions; technologies that are defective by design (e.g., DRM, planned obsolescence) are inexcusable. The average gadget works pretty well though, and while there’s nothing wrong with striving for perfection, I also need a moment to shut up and revel in the awesomeness of the imperfect magic people have managed to weave.


1 E.g., backlight bleed on my iPad 2, suboptimal battery life on my Kindle, phosphor trails on my TV.

2 E.g., audio cutting out on my iPhone 4S, a computer with a display that occasionally turns to grey fuzz, six dead Xbox 360s.

Review: iOS 5 = Win, iCloud = Fail

Yesterday, Apple released an update to its iPhone/iPad operating system, iOS 5, alongside its new backup/syncing/interweb thing, iCloud.

iOS 5

iOS 5 is beautiful. You can read a complete list of new stuff elsewhere, but some of the new features really improve the devices that iOS powers. Notifications were a useless mess before, but now they’re unobtrusive and easily accessible. Syncing without wires is similarly long overdue; now we just need wireless power and rat’s nests of cables will be a thing of the past.

The split keyboard on the iPad is a nice option, but it will take some getting used to. Another iPad feature that nobody is talking about, for some reason, is multitasking gestures. Having to double-click the home button just to switch apps was getting pretty ridiculous. Now clawing at it with four fingers will do the trick. The iPad and iPhone are much better devices with iOS 5.

iCloud

Unfortunately, I’m less impressed by Apple’s attempts to extend these devices to the cloud.

The promise of iCloud is amazing. You can update something on one device—add a contact, or work on an iWork document, or take a picture, or start a conversation, or buy a song—and it will automagically appear on all your devices.

For some things, this works beautifully. I cleaned up my contacts on my MacBook today, and without doing anything else, or even plugging anything in, they’re cleaned up on my iPad, iPhone, and on iCloud.com.

Photo Stream

For other things, there are seemingly small issues that end up being dealbreakers. One new feature is that all photos are automatically published everywhere, viewable on any device with Photo Stream. Mildly creepy, but I’m fine with it once I’ve opted in. What’s not fine is being unable to delete individual photos after you take them. Seriously. If you accidentally (or purposefully) take a crappy photo, it will be on every device forever. You can turn off Photo Stream and delete every photo, but you can’t delete just that one photo that was meant to be texted then discarded.

This is so ridiculous that it’s almost as if they accidentally released iCloud without it. I’m guessing it’ll be fixed pretty soon, but it’s dumb to have left it out in the first place. Maybe Apple’s just waiting for a scandal to drum up free advertising.

[Image: iMessage fails at carrying across devices]

iMessage

iMessage isn’t technically part of iCloud, but it’s sending messages over the internet, so maybe it should be. iMessage mysteriously detects whether the person you’re texting has an iOS device, and if so, sends them a message via data rather than text messaging. It’s very cool if you don’t have an unlimited texting plan, or drop out of cell phone coverage a lot. Even cooler is that it works on the iPad and iPod touch as well, so you can finally text from them. Since they don’t have phone numbers, you set up an email address to use with iMessage. There is also the promise of being able to start a conversation on one device, but continue it on whatever other device you switch to.

Where it fails is that, from what I’ve seen so far, it doesn’t deliver on that promise. iMessage doesn’t associate your phone number with your email address(es), so if you’re texting between phone numbers on an iPhone, then switch to an iPad, you need to start a new conversation using the email address you set up on the iPad. You could use the email address the whole time, but that defeats the purpose of making it a texting alternative; coordinating this with people would be almost as bad as having to exchange PINs on BlackBerrys. This messy confusion sinks what was supposed to be a simplification of messaging.
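Here’s a toy sketch of what seems to be going on, as far as I can tell from the outside: conversations are keyed by the raw address they were started with, instead of by a unified identity. All names and numbers here are made up; this just models the behavior I observed, and the obvious fix.

```python
# Hypothetical model of the iMessage threading problem -- not Apple's actual code.
conversations = {}  # address a message was sent to/from -> list of messages

def receive(address, message):
    conversations.setdefault(address, []).append(message)

# The same friend, reaching me two different ways:
receive("+1-555-0199", "hey, you there?")            # texted my iPhone number
receive("mike@example.com", "trying your iPad now")  # iMessaged my email address

print(len(conversations))  # 2 -- two separate threads for one person

# What it should do: map every known address to one contact, one thread.
aliases = {"+1-555-0199": "Mike", "mike@example.com": "Mike"}
unified = {}
for address, messages in conversations.items():
    unified.setdefault(aliases[address], []).extend(messages)

print(len(unified))  # 1 -- one thread per person, as promised
```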

Update Oct 15: I’ve managed to unify all my messages, at least with one person. It took some combination of the following:

1) Both of us added all of each other’s iMessage phone numbers and email addresses to a single contact.

2) We made sure our “caller ID” (which has nothing to do with calling) in the message settings was the same on all of our devices (i.e., on an iPhone, it has to be changed to an email address instead of a phone number).

Now all communications show up in the same conversation, which syncs across all devices, just as promised. It’s even smart enough to know that if you see the conversation on your unlocked iPad, it doesn’t need to bother alerting you on your iPhone too. Cool. So it’s possible to achieve the promise of iMessage, but it takes a lot of fiddling and coordination between the two people, and it’s still not really clear how to do it.

iWork in the Cloud

I’m wondering why more people aren’t complaining about this next problem. I think most early reviewers just assumed this wouldn’t be a problem, and didn’t actually try it.

The aspect of iCloud I was most excited about is the ability to work on a document from any device, and have it always be synced up between devices, automatically. Currently, this works wonderfully for syncing a document between iOS devices. So, you can type up something on your iPad, and it will automatically show up on your…uh…iPhone I guess? But why the hell would you want to do word processing on an iPhone?

Yeah. There is no way to have iCloud sync documents with an actual computer. It syncs photos and contacts just fine with OS X, but nooo, documents, the one thing still best created with a large screen and a keyboard, are the one thing you can’t sync with a computer.

I’m sure this feature is coming, but I’m baffled as to why what I think is the most useful application of iCloud wasn’t a priority to get out right away. As it is, iCloud is nothing more than an automatic backup for iWork documents.

But Still…

I don’t wanna get into #firstworldproblem territory by complaining about nitpicky details in the OS of a supercomputer that I can carry around in my pocket. But still, these are some odd omissions in software that is so improved in every other way. I didn’t see these problems addressed elsewhere, so I thought I’d get this up on the internet for other complainers to find. It’ll probably all be fixed tomorrow, and then I’ll be back to blissful Apple fanboyism.

Evolution’s Failures

I think it’s hilarious to imagine evolution’s failures.

Think of how our digestive systems are able to function no matter which way we’re sitting or lying, carrying food to the right place in a peristaltic wave, even if it’s going against gravity. Think of the pre-human who didn’t get that gene. He’s all like, “check out this handstand!”, then as soon as he’s upside-down, all the woolly mammoth he ate earlier is pouring out of his face. He suffocates, dying before he ever had a chance to procreate, and his shitty genes never get passed on. Hilarious.

Thing is, one day that guy will be us.

Evolution is not only biological, but technological. We already pity the people of the past—most of human history—who didn’t expect to live past the age of thirty. Technology has doubled our lifespan just by tuning up our default biological hardware from the outside. Think of what we can do once technology moves inside.

It’s a near certainty that we will merge with technology. We already rely on it, and there’s gotta be a better way of interacting with it than through our fingers. When our brains and bodies are made more of bits and bytes than nerves and leukocytes, the people of today will be the pre-humans.

Looking back, we’ll think that our squishy biological way of doing things was hilarious. “That’s right son,” we’ll say, to our sons. “We had computers we plugged into walls, but our own method of recharging was—hah, it’s so gross, but get this—we mashed up other living things with our teeth then let them slide down our throat. There were actually people who couldn’t find things to eat, and they died. Forever! They didn’t even have a backup.”

And our sons, they probably won’t even understand how (or why) we managed to get through the day.

Evolution makes failures of us all.

The Myth of the Evil Genius

[Image: Joker, by Nebezial]

The evil genius only exists in fiction.

An evil genius cannot exist in reality, because in reality, intelligence and evil are incompatible. A genius acts rationally, and history constantly proves that it is rational to be good.

Genius and evil are two terms that are nearly impossible to define, but most people know them when they see them. Adolf Hitler was evil. Osama Bin Laden was probably evil. Albert Einstein was a genius. Bill Gates is probably one too.

It’s not that evil doesn’t pay; genius and evil both pay, in some sense. Bill and Osama both have mansions, and could probably afford the most expensive bacon at the grocery store (though I guess Osama would pass). The difference is that Bill is living a comfortable life that leaves a trail of advancements and improved lives. Osama is at the bottom of the ocean riddled with bullets, and has left a trail of destruction and ruined lives.

Osama and Adolf did gain power, but was it through genius? I doubt it. They excelled in some areas—charisma, mostly, and probably a good helping of being in the right place at the right time—but I doubt they were geniuses. Not in the sense meant here: extreme mental ability for coming to correct conclusions.

On both an individual and a societal level, it is rational to be good. More often than not, the correct choice between a good option and an evil option is the good option, all things considered. Murdering a person you can’t stand may be easier than altering your own life to get away from him (say, packing up and moving away), but on an individual level, murder will probably leave you in jail or dead yourself, and on a societal level, allowing people to murder willy-nilly wouldn’t be conducive to happiness and productivity.

That’s why the evil genius doesn’t exist. Even if the impulse to do evil was there, a true genius would take a moment, and think “hmm, considering all the consequences, maybe genocide isn’t such a spiffy idea.” If The Joker was really so smart, he’d figure out a way to resolve his Batman problem without blowing up innocent people and getting thrown in Arkham again and again.

Evil cannot result from the cool calculated machinations of a genius. In real life, evil is in the hot passion of an argument when a knife is nearby. It’s in the subtle biases of a politician whose values are misguided. And in that sense, evil is in all of us; luckily we also have an inner genius to play superhero.