Your Brain Will Betray You

People are dumb. If they hear something that is unclear or ambiguous, they will hear whatever they want to hear, or whatever they are told to hear. And I don’t mean they’ll interpret “nice shoes” as a genuine comment when it was meant sarcastically; I mean people will actually hear completely different words depending on what they are expecting to hear. Even you are not immune to this.

Try this. Listen to the song embedded below. It’s Led Zeppelin’s “Stairway to Heaven” backwards. Listen particularly to the section that starts at about 4:40 on the little timer.

Unless you’ve been exposed to this before, you probably heard random backwards gibberish, with maybe a few things that sounded like real words.

Now listen to the clip below:

It’s the same song you heard before. The exact same noises reached your ears, but you probably heard completely different words than you did at the 4:40 mark in the previous clip. To prove it, go back and listen to it again if you’d like. After seeing the purported lyrics, I can’t listen to it without hearing the Satanic message.

Of course, it’s probably not really a Satanic message. We just look for the words we were told to look for in almost-random noises, and our brains make us find them. You could probably do this with almost any song. For example:

And with this in mind, I present to you the most fucked up thing you will see all day:

Edit: More of the same here.
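
If you want to try this with your own music, reversing a clip is easy. Here’s a minimal sketch using Python’s standard-library `wave` module (the file names are just placeholders, and it assumes an uncompressed WAV):

```python
import wave

def reverse_wav(src, dst):
    """Write the frames of a WAV file in reverse order.

    A frame is one sample across all channels, so we reverse
    frame-by-frame rather than byte-by-byte to keep samples intact.
    """
    with wave.open(src, "rb") as r:
        params = r.getparams()
        frame_size = params.sampwidth * params.nchannels
        data = r.readframes(params.nframes)
    frames = [data[i:i + frame_size] for i in range(0, len(data), frame_size)]
    with wave.open(dst, "wb") as w:
        w.setparams(params)
        w.writeframes(b"".join(reversed(frames)))

# e.g. reverse_wav("stairway.wav", "yawriats.wav")
```

Run it on any song, stare at some made-up “lyrics” while you listen, and see what messages your brain decides to find.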

Magnets: Beyond Holding Things to Fridges


Some random but fascinating tidbits that I’ve learned while writing my comps today:

  • There are over fifty known sensory systems that have been identified in living things. Why, then, is a “sixth sense” seen as a far-out impossibility?
  • The genome of bacteria that can sense magnetic fields is only about 4.3 megabytes. All the information needed to create this organism could easily fit in an email attachment. The human genome is about 750 Mb. Bigger than a bacterium’s, but still smaller than Windows XP.
  • Magnetic structures, similar to those that allow the bacteria above to detect magnetic fields, have been found in a 4-billion-year-old meteorite from Mars. This is half a billion years older than the earliest known life on Earth. It suggests that the ability to detect magnetic fields may have been one of the first sensory systems to evolve, and that the ability to do so may have been brought to Earth from Mars.
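
The megabyte figures above come down to simple arithmetic: stored as plain text, a genome takes one byte per base, while packed tightly, the four DNA bases need only 2 bits each, so four bases fit in a byte. A back-of-the-envelope sketch (genome lengths are approximate, and the encodings are my guess at how such figures are usually derived):

```python
def mb_as_text(base_pairs):
    """Genome size in megabytes as plain text: one byte per base."""
    return base_pairs / 1_000_000

def mb_packed(base_pairs):
    """Genome size in megabytes packed at 2 bits per base (4 bases per byte)."""
    return base_pairs * 2 / 8 / 1_000_000

# Approximate genome lengths in base pairs
magnetotactic_bacterium = 4_300_000    # ~4.3 million bp
human = 3_000_000_000                  # ~3 billion bp

print(f"bacterium, as text: {mb_as_text(magnetotactic_bacterium):.1f} MB")  # ~4.3 MB
print(f"human, packed:      {mb_packed(human):.0f} MB")                     # ~750 MB
```

Either way, an entire organism’s blueprint really is email-attachment sized.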

While I still want to get this part of my comprehensive exams over with, it’s actually turning out to be pretty cool. My paper involves the following kickass things: Ghosts, hallucinations, Jesus, pigeon navigation, The Virgin Mary, ESP, psychokinesis, turtle navigation, mental patients, God, airplane crashes, whale suicide, lobster navigation, and now, Martians.

References:

Kirschvink, J. L., Walker, M. M., & Diebel, C. E. (2001). Magnetite-based magnetoreception. Current Opinion in Neurobiology, 11, 462-467.

Book Review: Cell, by Stephen King (Plus a Rant About Braaaaiiiins)

Stephen King has done all the typical monsters: vampires, werewolves, aliens, robots, clowns. Until now, though, he hasn’t done zombies. Cell is Stephen King doing zombies. Nothing more, nothing less.

He does, of course, add some twists to the genre, which I won’t give away here. The twists are done in context though; it’s obvious that King has seen a lot of zombie movies, and any deviation from the traditional zombie is done intentionally. His nods to zombie movies are subtle but effective (e.g. waiting for a tidy explanation of how the zombie outbreak began is missing the point). One twist sorta makes the idea of zombies less scary (for those who have read it, I’m talking about their cyclical nature), but it does keep the story moving in a believable way. The plot unfolds rapidly, almost feeling like a movie screenplay in both its pace and its visual style of writing. The bottom line is that Cell is an enjoyable read and hard to put down; that’s the highest praise I can give a book like this.

[TANGENT] There is one thing I have to complain about. At one point, a character in Cell uses the “humans use only 10% of their brains” myth to explain something. Where the hell did this come from, and why do people continue to believe it? Does anyone really think nature (or hell, God, if you prefer) would create this freakish creature with a head containing a tiny functional brain surrounded by 9 times more useless brain-coloured goo? That makes no sense. Perhaps people really mean “humans only use 10% of their brain at one time”. Closer to the truth, maybe, but the negative connotation is misleading. It’s like saying “computers use only 10% of their programs” because you never have every program running at the same time. If we “used 100% of our brains” in this context, we’d be trying to do everything a human can possibly do at one time (probably ending up paralyzed, babbling incoherently, and going insane trying to deal with all memories from our lives simultaneously rushing into consciousness); or more likely, we’d have some kind of seizure and die instantly, not unlike the computer frying itself if you managed to run every program at once.

I think the main explanation for the perpetuation of this myth is that people want it to be true. They want it to mean that we are using only 10% of our potential, and there’s so much room for us to improve. That 90% holds the solution to all of life’s problems; we can end war, discover the universe’s secrets, and figure out the opposite sex, if only we try hard enough and dip into that 90% potential. Perhaps, though, it’d be more fruitful to realize that we’re already running at 100% (if not more) of what our brains are meant for, and if such solutions to life’s problems exist, they are already within our reach.

Oh, and another reason we want this to be true? Because if a zombie attacks and eats a chunk of your brain, chances are it’ll get a piece of that useless 90%, and you’ll be just fine.

The Impending Robot Revolution


Below is a quote from Ray Kurzweil’s book The Singularity is Near. To put it in context: The singularity is a time when humanity as we know it will suddenly change drastically, due to advances in technology. For example, our brains will be enhanced by nonbiological computers, and we’ll spend half our time in fully immersive virtual reality. Some of the major advances that will lead to this change are what Kurzweil refers to as “GNR”, which stands not for the name of a band with a perpetually delayed album, but for “genetics, nanotechnology, and robotics.” Here is the quote:

“The most powerful impending revolution is “R”: human-level robots with their intelligence derived from our own but redesigned to far exceed human capabilities. R represents the most significant transformation, because intelligence is the most powerful “force” in the universe. Intelligence, if sufficiently advanced, is, well, smart enough to anticipate and overcome obstacles that stand in its path.”

Is it just me, or is that terrifying? This isn’t science fiction; Kurzweil actually believes this will happen in the not-too-distant future, and I’m inclined to agree with him. Yet it sounds like science fiction, and not happy utopian future science fiction, but The Matrix / Mad Max / Blade Runner / oops we destroyed the earth science fiction.

Sure, it could go either way. Maybe the obstacles standing in the path of these superhuman, superintelligent, and presumably supersized robots will be obstacles that overlap with humanity’s: global warming, crime, obesity, premature baldness. But what if their obstacles are us? We with our dull neuron-based brains and squishy bodies?

I’m sure Kurzweil has speculation on how we’ll prevent this from happening (I’m only halfway through the book). I just hope he doesn’t underestimate the human race’s ability to make extremely stupid decisions, or overlook the fact that when it comes to world-altering technology, it only takes a small group of sketchy people to get their hands on it to do great harm. Let’s hope we can overcome that stuff, though, because virtual reality would be kickass, and I do like my squishy body.

Stephen King’s Richard Bachman’s “New” Novel

I saw this book in Chapters the other day, and my eye was drawn to Stephen King’s name. Of course, this is exactly what the publishers wanted my eye to do, because everyone knows who Stephen King is, but fewer know Richard Bachman. The funny thing is, the book’s only author is Bachman, whose name you may be able to make out in tiny letters at the top. It’s only the foreword that is written by King.

Since when does the writer of the foreword get a bigger font than the writer of the novel?

Granted, it would be less forgivable if King and Bachman were not the same person, nullifying any confusion about who wrote the book. Still, weird.

I was also surprised to see King putting out another book so soon after his last one. But it turns out that this was written in the 70s as one of the original “Bachman books”, then never released. King only rewrote Blaze recently, in addition to writing like 5 other novels from scratch. He can write books faster than I can read them.

I can’t imagine the time, motivation, and willpower it would take to write 2 or 3 novels in a year. Actually, scratch that; if I was being paid millions of dollars to live in a fancy house in Maine, and all I had to do was spooge my fantasies into a keyboard all day every day, it would take zero willpower. I’d drop everything and do that in a heartbeat. No, scratch that; in a hamster’s heartbeat.

A hamster’s heart beats over 450 times per minute!

And it’s spelled hamster, not hampster. Where does everyone get that P?

Oh, hey, maybe I should go study for my big set of exams coming up in 2 weeks instead of procrastinating by looking up animal heart rates. Unless anyone wants to offer me a novel deal for enough cash to take a few years off of school and write? I haven’t really written anything before, but I’m sure I’ll figure it out. Anyone?

Overthinking


Us psychology types are constantly reading and thinking about things like logic, experimental design, and statistics. I recently came across a nice little article, Mistakes in Experimental Design and Interpretation, that summarizes a bunch of issues in designing and interpreting science experiments.

I found the last point the most interesting:

Mistake I9: Being Too Clever

Sir R. A. Fisher (1890-1962) was one of the greatest statisticians of all time, perhaps most noted for the idea of analysis of variance. But he sullied his reputation by arguing strongly that smoking does not cause cancer. He had some sensible arguments. First, he rightfully pointed out our Mistake I7, correlation is not causation. He was clever at coming up with alternative scenarios: perhaps lung cancer causes an irritation that the patient can feel long before it can be diagnosed, such that the irritation is alleviated by smoking. Or perhaps there is some unknown common cause that leads to both cancer and a tendency to smoke. Fisher was also correct in pointing out Mistake D1, lack of randomized trials: we can’t randomly separate children at birth and force one group to smoke and the other not to. (Although we can do that with animal studies.) But he was wrong to be so dismissive of reproducible studies, in humans and animals, that showed a strong correlation, with clear medical theories explaining why smoking could cause cancer, and no good theories explaining the correlation any other way. He was wrong not to see that he may have been influenced by his own fondness for smoking a pipe, or by his libertarian objections to any interference with personal liberties, or by his employ as a consultant for the tobacco industry. Fisher died in 1962 of colon cancer (a disease that is 30% more prevalent in smokers than non-smokers). It is sad that the disease took Fisher’s life, but it is a tragedy that Fisher’s stubbornness provided encouragement for the 5 million people a year who kill themselves through smoking.

It’s a nice reminder that sometimes, knowing too much can get in the way of seeing the truth that’s right in front of us, and can even be deadly. If we don’t agree with some conclusion, we can whip out all the “correlation does not equal causation”, “research is still inconclusive”, and “there was no control group” we want, but that doesn’t make the conclusion false. Deep issues concerning statistics and scientific reasoning are important, sure, but sometimes we just need to look past these trees and see the giant fucking forest that’s been there all along.
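
Fisher’s “unknown common cause” scenario is easy to see in a simulation: if a hidden variable drives both smoking and cancer, the two correlate strongly even though neither causes the other. A toy sketch with entirely invented numbers:

```python
import random

random.seed(0)  # reproducible toy data

# A hidden common cause (the "confounder") drives BOTH behaviors.
# In this toy world smoking never causes cancer, yet they correlate.
n = 10_000
confounder = [random.random() for _ in range(n)]
smokes = [c + random.gauss(0, 0.2) > 0.5 for c in confounder]
cancer = [c + random.gauss(0, 0.2) > 0.5 for c in confounder]

smoker_cancer = sum(c for s, c in zip(smokes, cancer) if s) / sum(smokes)
nonsmoker_cancer = sum(c for s, c in zip(smokes, cancer) if not s) / (n - sum(smokes))

print(f"cancer rate among smokers:     {smoker_cancer:.2f}")
print(f"cancer rate among non-smokers: {nonsmoker_cancer:.2f}")
```

In this toy world, forcing everyone to quit would change the cancer rate not one bit, which is exactly why Fisher’s objection was sensible in principle; it just couldn’t survive the mountain of converging evidence in the real case.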

A Diatribe on the Nature of Intellect, and A Method With Which One Can Answer the Query, “Do Lycanthropes Possess Testes?”

Lately, most of my time is taken up reading for comprehensive exams. One of the topics I’m studying is intelligence. An interesting finding in this field is that raw IQ scores have been increasing over the last few decades; this is known as the “Flynn effect”, named after its main discoverer. There is some confusion over whether this is a superficial increase in IQ test scores, or a real increase in what they are meant to measure (i.e., intelligence). This leads to the following awesome quote from one of my books:

“Flynn argues that if the intergenerational gain in IQ scores were “real” (i.e. reflected g), the real-life consequences would be conspicuous. For example, the younger generation with average IQs would perceive their parents and grandparents as intellectually dull or borderline [R-word]. Flynn even suggests that baseball and cricket fans of two or three generations past wouldn’t have had enough intelligence to understand the rules of the game.”

I just find it hilarious to imagine every kid in the world coming to realize that their parents are dumber than they are. They’d get together in the playground and swap stories about how their dad couldn’t find Africa on a map, or their mom gave $30 as a 15% tip on a $100 restaurant bill. “I swear,” they’d say, “my parents are hella intellectually dull.” Then dad would take his son to the ball game, and be all like “Wha? Why’d he just swing at the ball with that there stick? Who’s them guys with the mitts? What’s going on, son?”, eating a hot dog and drooling the entire time.

But hey, maybe it’s not so far fetched. If 80s movies and South Park have taught me anything, it’s that kids are the only ones who really know what’s going on. If Dracula and the Wolfman tried to take over the world, it would be kids who would have to stop them, and not their cognitively challenged parents.

P.S. Monster Squad is finally coming out on DVD!!!!

P.P.S. The quote is from Arthur Jensen’s “The g Factor”, p. 329.

Read This or Else


So I finished my last and only exam today.

The following occurred to me: In grad school, it’s not really writing exams that matters, but the threat of writing exams.

See, marks don’t really matter. In a grad course, everyone is going to get a good mark no matter how well (or poorly) they do on an exam (unless something goes horribly wrong), and marks are barely important for any future endeavors* anyway. What matters is that the students have learned the material taught in the class.

To learn the material, any motivated student will learn all they can possibly learn in preparing for an exam. But see, it’s preparing that makes them learn, not writing. So if everyone thought they were going to write an exam and thus prepared their asses off, but then it got cancelled, they will have learned just as much as if they had actually written it. The threat of writing just needs to be there, and needs to be taken seriously.

It’s like how some beetles will mimic the colouring of bees and wasps, so that potential predators will think they’re all badass and stingy and leave them alone. Or how peacocks puff up their feathers to look gigantic and attract the ladies, even though under all that fluff they’re just a shitty little runt of a bird. They achieve their purpose by threatening to accomplish something that won’t actually happen. It’s exactly the same thing.

Wait, I forget what my point is. Oh…let’s just say…the moral of this post is… “faking your way through life is nature’s way.”

* CONGRATULATIONS! You have witnessed my first usage of the word “endeavor” evor!

Nuking Dreams

This is from a recent issue of Science:

In 2004, a research team led by Pierre Maquet of the University of Liège, Belgium, used positron emission tomography (PET) to monitor brain activity in men playing a virtual-reality game in which they learned to navigate through a virtual town (actually a scene from the shoot-’em-up video game Duke Nukem). The same regions of the hippocampus that revved up when the subjects explored the virtual environment also became active when the men slipped into slow-wave sleep that night.

Well first, that’s pretty neat. I guess dreams aren’t a waste of time after all.

But: Duke Nukem?? In 2004? Assuming they meant Duke Nukem 3D and not the 1991 original, that game is almost 10 years old. By today’s standards, the graphics are horrible and unrealistic. You’d think that they would get better results using a game that resembles real life; and, um, less meaningful results with a game where you walk around a pixely city fighting cartoony 2-dimensional pig-people with a freeze ray.

Of course, if they really wanted a Duke Nukem game to use, that was their only choice. The sequel to Duke Nukem – Duke Nukem Forever – is one of the most hilarious things in the video game world. It’s been in development since 1997, and was scheduled to be released in 1998. It is still not out. It’s now almost 10 years over its scheduled release date. What’s funny is that every few years, news of the game will come out, usually saying that it will be out soon, accompanied by a tiny screenshot. Nobody will admit that it’s been cancelled. The fact that it has “forever” in the title, and abbreviates to “DNF” (i.e., did not finish) makes it even better. Still, there are plenty of games, out now, that have gorgeous graphics which psychology researchers could easily use to simulate real life. Look at Gears of War.

And this, friends, is why spending countless hours playing video games is no more a waste of time than dreaming. It’s pretty much studying for school and my future career as a psychologist.

I’ve now outed myself as a pathetic geek.

To be all intellectually honest, here is the full source of the quote in glorious APA format:

Miller, G. (2007). Hunting for meaning after midnight. Science, 315 (5817), 1360 – 1363.