Tuesday, May 31, 2011

Google Drops Legal Emulators to Appease Copyright Holders

I’m a big fan of retro gaming. Not because I think it’s cool or because the games were especially good, but because there’s such an enormous wealth of games from decades past to dig through. (I’ll admit, there’s a strong sense of nostalgia in there too.)

But other than that, there’s something particularly satisfying about the simplicity behind these old games. From plain text adventure games to 2-D platformers, there’s not much you need to know about the games in order to play them. This way you don’t need to invest a large amount of brainpower into playing a game; you just need to hit buttons and have fun.

I’m not the only one. There’s an ever-growing community of people who are discovering and re-discovering the seemingly endless back-catalog of old games and the entertainment they provide—especially now that most of us carry little computers in our pockets. More about that in a minute.

Big game companies have recognized the potential these old games have for profitability. Dozens of websites feature the ability to play these retro games, from arcade classics to Sonic the Hedgehog and beyond. Flash technology allows us to recreate and play the games of our childhood from any modern computer, albeit with occasional glitches from improper porting.

Game companies, most notably Sega (the makers of that ultra-profitable Sonic series), have finally caught on to this and realized that their entire catalog, even the oldest and most outdated games, is still worth something. Not the hardware or the merchandise—just the programming. These companies are finding ways to re-release these games to the public at affordable prices that rarely exceed $5. Nintendo’s Wii allows you to download and install games from several different consoles, including the Sega Genesis and the TurboGrafx-16.

Being overly nostalgic and a bit rom-savvy, one of the first things I did when I got my G1 phone was find out if I could run an NES emulator on it, and sure enough, I can. Not only that, but I’m running a C-64 emulator, an Atari 2600 emulator, and even ScummVM, an application that assists in the loading of LucasArts video games like The Secret of Monkey Island. Normally I don’t install applications that I have to pay for, but all of this was worth $3 per program, and ScummVM was, and still is, free.

My thinking: Who wants Solitaire when you can play Battletoads while waiting for your to-go order at the local deli? Apparently a lot of other people feel the same way, which is exactly why the emulators are so popular. Actually, a lot of those people are in the industry themselves.

And those people in the industry just got Google to pull most of those emulators out of the Google marketplace, so you can no longer download them that way. However, you can still manually install the emulators on your phone, so this is really just a way for Google to say “Not my problem!” But why would this happen? Why would Google decide to disallow emulators in their marketplace just now?

It probably has nothing to do with the retro-gaming boom. It’s probably not related at all to Nintendo selling old games through Wiiware. I doubt it’s connected in some manner to the brand new Sony Ericsson Xperia Play gameplay-oriented mobile phone. Couldn’t be. Google must have just now decided to pull emulators just because they don’t like them.

Is the problem that emulators are illegal? Nope, because they’re not. No more illegal than a DVD player, even though you can put copied DVDs in it.

Is it that the ROMs are illegal? Possibly, but that still shouldn’t have any impact on whether or not the emulator is available. Since its launch, Google’s market has been praised for its openness. Anyone can submit just about anything, and the best just rises to the top of the ranks. Very few apps have been pulled and the developers banned.

Do the game makers care about ROMs being traded freely on the Internet? Yes, but they’ve historically ignored it since the games haven’t been profitable for years. In other words, trading ROMs on the Internet didn’t translate into a loss of revenue—until recently.

Now, with the popularity of emulated gaming on mobile phones on the rise, these companies want to take back their intellectual property. Nintendo’s got a lengthy writeup on their website explaining their position on the topic, including this section on the legality of the possession of ROMs:

There is a good deal of misinformation on the Internet regarding the backup/archival copy exception. It is not a "second copy" rule and is often mistakenly cited for the proposition that if you have one lawful copy of a copyrighted work, you are entitled to have a second copy of the copyrighted work even if that second copy is an infringing copy. The backup/archival copy exception is a very narrow limitation relating to a copy being made by the rightful owner of an authentic game to ensure he or she has one in the event of damage or destruction of the authentic. Therefore, whether you have an authentic game or not, or whether you have possession of a Nintendo ROM for a limited amount of time, i.e. 24 hours, it is illegal to download and play a Nintendo ROM from the Internet.

I’m a supporter of the point of view that I can own a copy of particular media if I’ve paid for the content or if I’m currently a subscriber to a service that allows me access to the media. Therefore, if I’ve paid for Battletoads, I should be able to put a port of Battletoads onto my phone and play it anywhere I go, right? Wrong, according to Nintendo.

I understand their interpretation. They are pointing to the limitations of court cases which say that a single copy can be made by the owner for the sole purpose of protecting the owner from loss of the media in case the original is rendered unusable. This came about when MP3 rippers became popular, and people were using them to make copies of the CDs they’d buy so that they could protect the original in a bulletproof CD case while the expendable CD-R went in the car, ready to fall out of a visor disc holder and crack on the center console. Of course, most people were using MP3 rippers to copy their friends’ CDs.

As DVDs became more popular, this school of thought expanded to include the same protection for copies of that format of media as well. Therefore, I see no reason why this viewpoint can’t include video games as well. In 1994, my next door neighbors moved away at the same time that my Game Genie and eight of my favorite NES games disappeared forever, including TMNT 2: The Arcade Game. I was the owner of that content; should I not be allowed to download the ROMs and play the games from my childhood without having to pay for them again?

Nintendo’s opinion is that I should pay them for games from my youth. It’s like my parents paying to watch seasons of Mork and Mindy on DVD: Just because it’s nostalgic doesn’t mean you get it for free. And since you didn’t take your original NES cartridge of TMNT 2 and pull all the sprites, music sequences, sound effects, and programming and reassemble them yourself, you didn’t make the copy. And since you didn’t make the copy, it’s illegal.

But hey, those legal CD copies? I had to use tools to make them happen. No such tool exists for ripping a cartridge. Someone else making the ROM takes the place of me using an MP3 ripping program. Therefore, it’s legal if I own the game.

So why is Nintendo so defensive? Here’s their explanation:

The introduction of emulators created to play illegally copied Nintendo software represents the greatest threat to date to the intellectual property rights of video game developers. As is the case with any business or industry, when its products become available for free, the revenue stream supporting that industry is threatened. Such emulators have the potential to significantly damage a worldwide entertainment software industry which generates over $15 billion annually, and tens of thousands of jobs.

Hey, I get it. Nintendo’s making $5 a piece selling Wiiware versions of these games from my childhood, and they want to get into the mobile phone scene as well. If I can just play these games for free with my legally paid-for emulator, then that’s a big problem for them.

The real offender here is Google, who have decided to change the rules. Now emulators are seen as a bad thing, a criminal’s tool. They can do this, since they own the entire system and have reserved the right to change the rules whenever necessary. However, this is just another chapter in the digital rights battle.

There’s a lot of grey area here, and we’re going to need to define the legality much more clearly in the near future. Google probably shouldn’t have dropped the emulators from their market, but it doesn’t change a thing. Only one thing’s for certain: We’ve finally reached a point where retro gaming is as nostalgic as retro-television.

Sunday, May 29, 2011

We Need a New Birthday Song

Music is powerful. It's amazing how few notes it takes to raise goosebumps on your skin, whether it's a really catchy melody or an anxiety-inducing dissonance. For example, the foreshadowing two-note refrain from Jaws, or the piercing shriek of the violins from Psycho. Both can make your hair stand on end—but neither is as disturbing, as morbidly virose, as the Birthday Song.

You've heard it. You know exactly what I'm talking about. It's obligatorily sung by nearly every American many times throughout the year, and as we all know, not everyone has a good singing voice. Couple this with the usual lack of a predetermined starting note so that singers have no idea in which key to sing, or at what tempo to sing it. The result is a cacophony of disparate voices disturbingly serenading the poor bastard who's celebrating another trip around the sun in 10-part disharmony.

And the words aren't always the same. The lyrics feature a fill-in-the-blank like the world's worst Mad Lib wherein the victim's name, nickname, moniker, or relationship title will be inserted. Considering that none of the singers ever coordinate this ahead of time, this moment in the song usually results in awkward laughter as half the chorus switches what they're singing mid-word, embarrassed for having sung "Eleanor" instead of "Grandma."

I cringe every time I hear it. Uninterested in contributing to the creepy crooning, I'm likely to silently mouth the words. This allows me to appear compassionate and social while sparing myself and others from further destroying the sonic spectrum in the room.

The tune is sung, in translation and otherwise, in at least forty countries worldwide, as far apart as Switzerland and China. It's likely one of the most recognized melodies in human history, and many wish to hold onto it for tradition. Because the notation and words first appeared in print in 1912, the song reaches its official 100-year anniversary next year, which gives us just a few short months to find a suitable replacement and send it off into a centenarian's retirement. Some suggestions already exist.

A couple of British folk who penned songs under the name Lennon/McCartney wrote their take on the song as the lead-off to the second disc of the Beatles' eponymous white-sleeved release. "Birthday" was not originally intended to take over party duties as the traditional song, but before they knew it, Beatlemania had pushed the song into heavy rotation at celebrations worldwide. The song was based on an earlier attempt at revolutionizing the birthday song, a 1957 tune entitled "Happy, Happy Birthday Baby." This earlier hit, unfortunately, did not catch on.

Imagine singing that fifty times a year. It's no wonder that we haven't replaced the old-fashioned melody: It's short, it's simple for even small children to sing, and the lyrics are repetitive. Thus, this suggestion by Shake from Aqua Teen Hunger Force didn't catch on:

Even the title, "Spirit Journey Formation Anniversary," would be impossible for small children to say, but at least it features sweet artificial harmonics. Looks like the Beatles' submission is winning so far.

Actually, it was popular already when I was a kid. To pump up the excitement and throw children into a quarter-fueled frenzy at arcade/pizza restaurant combos, the Showbiz Pizza house band The Rock-a-fire Explosion belted out the song with the help of Billy Bob, the creepy animatronic bear.

I remember sitting in a Showbiz Pizza as a small child and wondering how it could be everyone in the band's "birthday too, yeah." I still don't understand that line, but Paul McCartney was on a significant quantity of drugs around the time this crazy idea for a song materialized anyway. Hey kids, let's sing a song written on drugs!

Some restaurants, either to avoid monotony or to increase quirkiness, have penned their own birthday songs. These range from embarrassing to annoying, and don't promise to be suitable replacements. Recently, as I sat in a Longhorn Steakhouse in Brunswick, Georgia, the entire staff piled out of the kitchen not once, but three times to sing this gem to dinner guests (performed in a chanting style while clapping hands):

We can't sing, so here's our thing
Okra, fries, sweet sweet tea
Have a happy birth-a-day!

Although this song wins in my opinion based solely on its brevity, I do hold a soft spot in my heart for the traditional song. As a very young child, too young to remember it myself, my mom performed good night songs for me.

"What would you like me to sing?" she would ask.

"Happy birthday!" I yelled every time, and she'd oblige. "Again!" I would shriek when it was done. My mother assumed that I really, really liked the song, but I feel that it was more of the sleepy imagery of presents and cake that the melody conjured. It's just too bad that the same song doesn't inspire such sweet daydreams instead of the vivid nightmares I now experience.

Saturday, May 28, 2011

Hulu Plus on a TiVo Premiere: Not Worth $8

In 2008 my brother gave me his old TiVo—and "old" is an accurate term, because the thing felt ancient. Fortunately, TiVos remain relevant and usable for a long time, and it was free. We used it without major problems for two years before it decided to give up: the power supply died.

Calling TiVo for support was a good idea. Because we were "long-time customers," they gave us a good deal on a TiVo Premiere, their newest model. I bought it over the phone and had it set up within 4 days.

Because the Premiere (a Series 4 unit) was marketed as having "double the processing power of the Series 3," I was excited. The old box took 7 minutes to boot, and the menus lagged significantly. Unfortunately, the first boot—and every subsequent boot—took no less than ten minutes. Not a good sign.

Comcast is charging me $78 a month for highly compressed video, which is painfully blocky on my HD TV, so when I heard that Hulu Plus had just become available for the TiVo Premiere I felt the urge to cancel my cable. I don't have another device that supports Hulu Plus, so this was my chance to try the service, which, at $8 a month, would be an unbelievable bargain by comparison.

To get the "30-day trial when signing up from a .edu email address," I broke into my old college email account by guessing a password I thought I had forgotten, and Hulu was mine to use. Now I just had to set it up on my Premiere.

First I had to perform a firmware update to get the Hulu interface to appear on the TiVo, which took literally all night. Sometime while I was sleeping it finished and rebooted, so the next day I was ready to jump into it.

Wow, what a laggy interface! Moving the cursor within the system shows a delay between command and action that can't be less than 1000ms, and pressing one direction repeatedly queues commands that, each lagging by at least one second, might not end for half a minute or longer. By this point you've watched your show come and go in the list without being able to stop the queue of actions to select it.
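To illustrate the math behind that frustration, here's a tiny Python sketch; all the timing values are my own illustrative assumptions, not measurements of the TiVo:

```python
# Sketch: why queued remote presses plus per-command lag feel unusable.
# The timing values below are illustrative assumptions, not measurements.

def settle_time(presses: int, lag_per_command: float) -> float:
    """If every press is queued (never dropped) and each queued command
    takes lag_per_command seconds to execute, the cursor keeps moving
    until the whole queue drains -- long after you stopped pressing."""
    return presses * lag_per_command

# Scrolling a list with 30 quick presses at one second of lag apiece:
print(settle_time(30, 1.0))  # 30.0 -- half a minute of uncommanded scrolling
```

A responsive interface would drop or coalesce stale input instead of faithfully replaying every press.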

Found your program? Great, now you can watch it. First, wait 20 or so seconds for the Hulu service to retrieve your show; now sit through two initial commercials. With that out of the way, you can watch the first few minutes of the program before, suddenly, another commercial breaks in.

Want to fast-forward to a funny part of the episode? Well, if you've bypassed a designated commercial break, expect 60 seconds of ads before you can move on. Finally, the Hulu feed comes through, but the audio is missing for the first 5-8 seconds of video.

And what about the quality? Well, I noticed that it was even blockier than Comcast's cable quality, but I expected that from a streaming service. However, I have a 15 Mbit/s connection, so I should be able to handle a 1080p stream. Looking in the menu, I noticed a box that said "SD"—standard definition. Okay, I'll just click on that and change it to "HD," right?

Select a stream rate:

Oh, okay, it's either this or lower quality. Remember, I have a 15 Mbit/s connection, and my highest-quality option is a stream of one.
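For a sense of why that's disappointing, here's a rough back-of-the-envelope comparison in Python; the per-quality bitrates are ballpark figures I'm assuming for streaming video of this era, not Hulu's published numbers:

```python
# Rough capacity check: connection speed vs. assumed typical stream bitrates.
# The bitrate figures are ballpark assumptions, not Hulu's actual specs.

connection_mbps = 15.0

typical_stream_mbps = {
    "SD (480p)":  1.0,   # roughly the top rate the TiVo app offered
    "HD (720p)":  3.5,
    "HD (1080p)": 6.0,
}

for quality, rate in typical_stream_mbps.items():
    headroom = connection_mbps / rate
    print(f"{quality}: {rate} Mbit/s (connection could carry ~{headroom:.0f}x that)")
```

Even under these generous assumptions, the line has more than double the headroom a 1080p stream would need.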

In addition to this, shows not filmed in widescreen came in at the wrong aspect ratio; instead of 4:3, Hulu was streaming me a signal that looked more like 1:1. Not only did this cause giant black bars on the left and right sides of the video, but it also stretched everything on the screen vertically. As of now, I have not been able to fix the aspect ratio.
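To put numbers on how wrong a 1:1 presentation is for 4:3 material, here's a minimal Python sketch (the frame height is an arbitrary example value):

```python
# Correct 4:3 display width vs. an (incorrect) 1:1 squeeze.

def display_width(height: int, aspect_w: int, aspect_h: int) -> float:
    """Width in pixels that a frame of the given height should occupy
    when shown at an aspect_w:aspect_h ratio."""
    return height * aspect_w / aspect_h

height = 480
correct  = display_width(height, 4, 3)  # proper 4:3 presentation
squeezed = display_width(height, 1, 1)  # what the stream looked like

print(correct, squeezed)  # 640.0 480.0
# Squeezing to 1:1 shaves a quarter off the picture's width -- which shows
# up as pillarbox bars on the sides and a vertically stretched image.
```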

I gave up and walked to my brother's apartment where I watched him use Hulu Plus on a PS3. The interface was smooth and high quality, and there was no visible lag. Of course, the same content still came in at SD quality, but that's Hulu, not the media device.

I was highly disappointed in my "brand new" TiVo Premiere. Not only is it severely lacking in the hardware department, but it's late to the Internet television scene. It's no wonder that it took so long for the service to come to TiVo: The programmers probably had a hell of a time figuring out how to make the interface usable, eventually giving up and releasing the best thing the system could handle—which is still subpar, by far.

I would definitely not recommend the Premiere + Hulu combination. It feels much more like an afterthought than a carefully planned business move, an attempt to stay viable in a quickly changing television delivery market. If you're looking for an alternative to cable, please just buy a Logitech Revue and sign up for Google TV. It'll be the best option out there if more of us sign up for it.

Friday, May 27, 2011

Spanish Revolutionaries Create Flash Mobs of Political Activism

In the wake of numerous revolutions around the globe sparked mostly by youth and energized by social networks, Spain’s citizens have been undergoing a cultural revolution in the past few weeks.

Dubbed #spanishrevolution, which you might recognize as a Twitter hash tag, the movement has gained momentum due to outrageous unemployment rates, bank-friendly mortgage systems, and politicians who seem uninterested or unable to deal with the problems at hand. The eponymous tag serves as a meeting point for revolutionaries looking for information related to the current status of the protests.

The hash tag was coined collectively by bloggers in Spain who write politically motivated critique. Because their many readers trust their credibility on these topics, protests can be organized via the hash tag and then disseminated via the blogs. What this amounts to is a large population of tech-savvy youth—45% of whom are unemployed—who are ready to meet in public and get their protest on.

Being unemployed has a two-fold effect here: First, the youth are motivated to protest, since they don’t have a way to generate income, and money is kind of important. Second, they don’t have jobs, so they’ve got plenty of free time! This situation clearly underlines the difference between unemployed youth in Spain and the United States, as well as the relationship between blogging and physical action in the two countries.

Those who are unemployed in Spain appear motivated to go out and accomplish something, whereas a lot of the unemployed youth in the United States are content to complain and live with their parents. The Spanish may not have jobs, but they’re taking to the streets, using their social media-fueled communication to organize and align themselves to meet up in specific areas.

Both countries have a countless number of political bloggers who intend to incite momentum within their populations, but the United States seems to have an overwhelming number of people who merely write about what they’re campaigning for rather than taking further action. I’ve long lamented that we’ve got too many people who are just barely motivated enough to write something and leave the action up to everyone else. The problem is that the ratio of bloggers to activists is far too high.

The Spanish youth, however, are using these tools as a way to quickly organize protests and gather large groups of people to attempt to accomplish something or at least have their voices heard. The result is a flash mob of political activism—which ends up looking something like this:

This event occurred earlier this morning in Barcelona, Spain. As you can see, the police didn’t take kindly to the protesters sitting down where they’re not supposed to, and as we’ve seen repeatedly in situations where those in authority are highly outnumbered, they begin to get violent. They beat the protesters, who threw hardly more than verbal taunts back at them.

Americans stage these kinds of protests occasionally, but generally without success, and it’s not entirely clear whether the protests in Spain are having a major impact. It is true that a large segment of the population is involved in the debate, with most people aligning against the corrupt or incompetent politicians who seem to be ignoring the major problems plaguing the country.

Social networking is probably the best way for protests to organize, and this method has been used in America, though sometimes the cause is never fully realized, as shown in this pathetic protest staged against Scientology by members of the group Anonymous here in Atlanta:

Or how about this ultra-embarrassing pro-cannabis demonstration carried out in Atlanta’s Hurt Park?

Both are examples of how social networking can influence and bring people together to fight for a specific cause, but in the case of the Spanish, there seems to be such a collective mindset that something might actually be accomplished. The role of social networking in this revolution—and especially Twitter—will be memorialized within the moniker given to it.

Let’s think about that for a second; in the history books, there will be references to World War II alongside #spanishrevolution. That’s pretty crazy! Let’s hope they succeed in their efforts instead of just getting beaten in the head repeatedly.

Thursday, May 26, 2011

Gmail's New "Important" Markers are Ironically Not

Anybody who has used their Gmail account over the past few days has no doubt noticed a new feature rolled out by Google: The “Important” marker.

The marker and its accompanying tab aim to help you keep track of your most important emails by allowing you to label them as such and then filtering emails for just the “important” ones by clicking the Important tab.

My immediate reaction to this new feature was, “Wait, isn’t that what the star is for?” For years Gmail has allowed you to click the hollowed-out star next to emails to mark them as significant, and you can use the accompanying Starred tab to filter Gmail to display just the starred emails. Seems like the same thing, right?

It pretty much is; the difference is that Gmail’s “important” markers are automatically placed as the emails roll in. The result is a system that guesses what’s important for you—purportedly by amount of previous correspondence with the sender—but doesn’t always get it right. In fact, it seems to be placing the markers on pretty much everything in my inbox, while missing stuff like an email from my wife about home inspectors while we search for a house.
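If the “amount of previous correspondence” guess is right, the heuristic might look something like this sketch; to be clear, this is entirely my own reconstruction in Python, since Google hasn’t published the actual algorithm:

```python
# Hypothetical sender-frequency importance heuristic -- my own guess at
# the idea behind Gmail's markers, NOT Google's actual algorithm.
from collections import Counter

def build_sender_counts(sent_messages):
    """Count how often the user has written to each address."""
    return Counter(msg["to"] for msg in sent_messages)

def is_important(sender, sender_counts, threshold=3):
    """Mark incoming mail 'important' if we've written to its sender often."""
    return sender_counts[sender] >= threshold

sent = [{"to": "boss@example.com"}] * 5 + [{"to": "wife@example.com"}] * 2
counts = build_sender_counts(sent)

print(is_important("boss@example.com", counts))  # True
print(is_important("wife@example.com", counts))  # False -- exactly the kind
# of miss described above: real important mail slips through the heuristic.
```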

If 80% of what’s in my inbox is considered important, what use is it, really?

You can manually add or delete these “important” markers per email, or in bulk by selecting many of them, but it adds another step to the process, and it puts more tools in the top toolbar. By comparison, my previous method of marking important emails was to click the hollowed-out star next to the intended email:

The new “important” marker requires me to select the email(s) I want to mark by checking the selection box next to each (which you’ll notice is right next to the star):

Then I have to click the plus or minus marker button at the top of the window:

...and then the email is marked as important.

It’s not very difficult, but it adds another step to the process; plus, if emails are coming in already marked, and I don’t want my Important tab cluttered with stuff that really isn’t, I have to manually deselect them as they come in. So why would Google do this, especially since they already had the star system?

The feature is probably aimed at people who get a lot of emails every day in a fast-paced business world. It might help to put the most important emails in their own folder so that you can get that work done quickly. However, it’s kind of like sending potentially important emails into a spam folder, where you’ll have to intentionally dig for the one very important email you can’t find that Gmail improperly categorized.

In my research on the feature, I didn’t find information on the background of its development. In user forums, no one was talking about how much they liked it. Rather, everyone was asking how to make it go away.

New features for popular services, such as the new Facebook profile, tend to be rejected by society at first and eventually accepted, but I think the main problem here is that no one understands why it’s necessary or how it would be useful. Fortunately, Gmail makes its removal easy, and you can always bring it back later.

Because so many people want to know, here's how to remove the important markers from Gmail:

First, click the Settings link at the top of the page.

In the Settings menu, click Labels; next to Important, click hide. This removes the Important tab from Gmail.

Still in the Settings menu, click Priority Inbox; next to Importance markers, click the radio button next to No markers. This removes the markers from all emails and gets rid of the marker-related tools in the toolbar. Don’t forget to click Save Changes!

So there. You don’t have to have it, but you can always bring it back if you like. Note that there is a tutorial video in the Priority Inbox section that explains all of this in greater detail if you really want an explanation of how this whole system is supposed to work.

I tend to trust Google, but they don’t always get it right. We’ll see how this one plays out.

Wednesday, May 25, 2011

Caffeine = Drugs at Work

Does your job drug test? If they do, I’m sincerely sorry. Not only do they not trust you to do a good job despite what you may do in your free time, but they’re spending money (that could have been your raise) to invade employee privacy.

Hey, it’s not that I support drugs in the workplace; it’s just that I don’t care. If you’re doing a good job, then what does it matter? Unless you’re a bus driver, forklift operator, or wood chipper, I doubt it matters if you’ve got some drugs churning in your system. Some companies probably don’t want you to miss work because you got arrested for drug possession, but again, that’s on you, buddy. Do your work correctly, don’t be stupid, and you’ll keep your job.

However, workplaces don’t test you for alcohol, and for good reasons: It’s legal and widespread, and your boss probably uses it too. Don’t be fooled: Alcohol is a drug, and a very powerful dissociative at that. In fact, if you’re a bus driver, forklift operator, or wood chipper, you’d probably be better off not drinking.

This opens up an interesting topic, though. While we’re busy being intolerant of psychotropic drugs—cannabis and LSD specifically—and are wary of sedatives which mostly consist of opiates and opiate derivatives, our culture tends to embrace stimulants. No, not cocaine or methamphetamine, though there are probably many jobs one could do quite well under the influence of those drugs. I’m talking about caffeine.

This makes your brain scream
A rather large segment of our population depends on caffeine to kickstart their brains, so it’s ingrained in our culture to have a morning coffee, or several, and then more in the afternoon, or perhaps an energy drink. It’s not just tolerated, it’s actually encouraged.

“You look like you need some coffee!” your boss might say to you as you droopily sloth at your desk. The brain translates this into "You look like shit," of course, and no one really wants to look like they just slept in a dumpster. Your mother warned you about peer pressure, yet you give in and drink the caffeinated brew. It’s like Native American mushroom tea, but without the second head that pops out of your shoulder and the sudden urge to run at full speed through the woods.

Instead, minutes after ingestion the chemicals make their way through your intestinal wall and into your bloodstream, arriving at your brain within seconds. You’re not tired—at least not in the sense that you need to sleep. You’ve got energy, possibly too much. Your legs bounce up and down with anticipation. You’re talking a mile a minute. Your pupils probably dilate.

Drugs all do basically the same thing: They travel through the blood and into the brain, triggering it to release oxytocin, dopamine, and/or endorphins. Stimulants go one step further and inhibit sleep chemicals in the brain so that neurons go into overdrive while happy fun-time chemicals are released. Therefore, the result of caffeine use is euphoria, pain relief, and increased energy—all the characteristics of cocaine and methamphetamine.

Caffeine is seen as a “safe” drug, one in which the consequences of dependency and the behavioral modifications that come from its use are considered relatively harmless. Not only can we perform our daily tasks with as much clarity as before, but we seem to be able to actually focus more intensely on the task at hand. This makes it an ideal drug for use by bus drivers, forklift operators, and wood chippers.

The consequences of dependency are short-term and mild by comparison to harder drugs like heroin, but very similar to other stimulants:
  • Tolerance builds quickly, leading to increased use
  • Overdose can cause an agitated or aggressive mood, heart conditions, and possibly death
  • Withdrawal triggers intense cravings, abnormally sluggish behavior, and severe headaches
All of these symptoms are analogous to cocaine and methamphetamine use, so the behavioral modifications that come from its use would have to be more severe, right? These side-effects include:
  • Rapid mood swings from excited to upset
  • Intense focus
  • Severe irritation leading to aggressive behavior
Wow, also identical to cocaine and methamphetamine! What makes those other drugs so dangerous then? I’m not exactly sure, but I’m certain of one thing: Using caffeine equals using drugs, and we love it in our workplaces.

In fact, I work for a company with a coffee-themed name, a $1,000 coffee machine, and an unlimited supply of Red Bull. Our employees get the job done, are perpetually alert and focused, and don’t punch each other in the face, but we’re all on drugs. We might look normal, but we’re all amped up, filled with both adrenalin and endorphins; we’re re-upping at three in the afternoon, and we’re crashing when we get home. We might not be using the ones that employers screen for, but we’re all on drugs.

I Want Internet TV, but I Don't Want to be Brainwashed

Last year my TiVo died, and the company offered me a discount on a TiVo Premiere, their newest digital media box. I didn’t really want to drop $200 on it, but it seemed like a bargain and I needed something to watch cable with. Reluctant to get another Comcast box, with the worst interface of all time, I went ahead and grabbed the Premiere.

Fast forward to May 23, 2011: Hulu Plus is now available through the TiVo Premiere. Oh joy! Now I can spend $8 a month on television instead of the $78 that Comcast charges me. The only downside is that they don’t have every show my wife watches, most notably Gossip Girl. (Wait a minute, this might be a good thing...) No matter! I can always download the episodes after they air—assuming some nice person ripped them.

I have no qualms with downloading content that I can legally record with my TiVo, but if I ditch Comcast, the legality here might become more of a grey area. Regardless, I can stream Gossip Girl from the CW website, so it shouldn’t be an issue, right? Sort of. It’s probably still illegal because I’m bypassing the advertisements that pay for the show.

I ran into this exact situation not thirty minutes later as my wife looked through the TiVo for the latest episode of America’s Best Dance Crew. For some reason, the TiVo didn’t record it; looking at EpGuides.com and TV.com for listings of the episode didn’t help. ABDC’s Wikipedia page had the results from the show listed, so I know it aired. Looking for a torrent to download the episode proved fruitless. It’s like the episode never aired, and someone just updated the Wikipedia article anyway.

So I went straight to the source and checked the MTV website, where they advertised “Full Episodes” streaming on their website. Interested in the idea of Hulu, and without an alternative, I chose to watch the full episode of the show straight from MTV.com.

I never do this, so I didn’t know what to expect. However, it was surprisingly just like watching TV. I’m not sure why I was surprised by this, but I guess I expected it to be full of artifacts and lag, buffering randomly in the middle of performances. They did a good job with their web player.

I knew the commercials were inevitable; I just didn’t know how many there would be. Knowing that MTV will sometimes show blocks of commercials six minutes long, I braced for the worst as the first commercial break came on. It was a commercial for Clairol hair products, starring Angela from The Office.

As I watched the 30-second spot, I began sweating, then vocalizing my fear. “Only one commercial!” I yelled. As it neared its end, my fingernails dug into my legs. “ONLY ONE!”

And then, to my dismay, a second commercial began playing. Actually, I shouldn’t say second, considering it was the same commercial again. I watched another 30 seconds of Angela selling hair products. Again, I screamed at the TV: “Please don’t let it be three!!”

It wasn’t. The show came back on. But when the commercial break came back, it was Clairol, back-to-back. More of the same.

Repeat two more times. I watched this commercial, and this commercial only, eight times. I do hate commercials, with their attention-demanding, super-loud compressed audio and fast-paced sales pitches designed to maximize the profitability of the purchased time slot, but this commercial went beyond annoying and became mocking. Here’s the commercial, but please, do me a favor and watch it all the way through to get an idea of what I had to deal with. Now imagine it eight times. If you’re really masochistic, you could even watch it eight times in a row.

Though I enjoyed watching the episode (in which two people I referenced yesterday embraced, coincidentally) the commercials actually had an impact on my decision to watch in the future. They really only had one sponsor? What about all the sponsors that air on cable television? What about the sponsors in the ads on their website? It’s really not acceptable to just spew the same commercial at me over and over again. It goes beyond advertising and feels like the brainwashing it really is.

Internet-streamed TV is definitely the way of the future, but it’s got some bugs to work out. Also, if you want a good pitch for a future-media model, drop by with a suitcase full of cash and I’ll tell you all about it.

Tuesday, May 24, 2011

Lady Gaga Clones will Make Us Feel Stupid in 2020

One of the latest in a long line of shock rockers, Lady Gaga caught the world off-guard in 2010 with her outrageous attire more so than her semi-listenable music. Part marketing scheme and part art collective, the singer uses her body as an experimental canvas for avant-garde fashion—stunning, surprising, and sometimes disgusting the general public. Outfits have ranged from relatively classy dresses to glorified underwear to raw meat.

The world perceives this as a new concept. The truth is that her outfits and approach to the style are relatively original, but the concept itself is not. We’ve seen dozens of artists over the years push the envelope and achieve widespread recognition, including Marilyn Manson, Alice Cooper, David Bowie, and Iggy Pop. More accurately, you could compare her style to Madonna’s (but she’ll be upset if you do).

Yes, in the 80s and 90s, Madonna repeatedly shocked the world by being overtly sexual, performing in underwear and posing in the nude. She was heavily involved in the fashion industry and had a team of people who worked on her image. Those days are over, but only for her.

While the rest of the music industry was off acting like space robots, Lady Gaga jumped onto the scene with Swedish-pop-influenced dance tunes and music videos with a creative, fresh style. She heads a group of artists not unlike those of Andy Warhol’s Factory who constantly brainstorm and create new looks for her. When she appears in public wearing such outfits, reactions are polarized. Indeed, her mysterious persona actually lends her a bit of credibility in this regard, as if she were a living, singing Warhol.

Unfortunately her bold experimentation has given other artists carte blanche to dress wacky, hiding behind the excuse that “Lady Gaga’s doing it.” They have erroneously interpreted this newfound fashion sense to be an excuse for themselves to “push the envelope,” and though this might succeed in some cases, it usually just feels weird.

Lady Gaga put herself on the chopping block and took a chance, inviting ridicule and potential failure in the process. Her progression through increasingly ridiculous costumes is more of a linear trend than an intentional ploy; her collective simply builds on the last piece’s objectives. Being known for her over-the-top style is why she gets away with it.

New Age Elvira
But these other artists are guilty of attempts to put themselves into her position. Sure, they may have designers approaching them to wear their hideous creations and risk-taking fashion experiments, but it’s not quite like the collective that generates the Gaga attire. Probably the biggest current offender is singer Nicki Minaj.

She started out trying the sexual route like Lil’ Kim, morphed into a space robot, and then ended up Gaga-esque, even if her music is vastly different. Because she doesn’t carry that air of mystique, and we know a good bit about her authentic personality, her fashion sense just comes off as ridiculous.

The eye claw
Famed singer—ahem, talker—Kesha, known for a few songs about drinking too much and vomiting all over yourself, has tried the Gaga vibe herself, but to no avail. In fact, fans have quickly picked up on the imitation, not homage, and pointed to fashion experiments that mirror those Gaga already tried.

Lil’ Mama, currently well known as a judge on America’s Best Dance Crew, has progressed from her beloved hip-hop style into an arguably unfortunate one. I'm not even going to describe it. Just look.

Lil' Mama
Yes, pushing the envelope is good; shocking parents is fun, and kids like it. Experimentation leads to breakthroughs in fashion and art. This is one reason for such stupid looking outfits in “high fashion” shows: It’s not because anybody’s actually going to wear that stuff, it’s because no one’s seen it before and some aspect of it will probably catch on and become popular.

When we have an artist such as Lady Gaga trying something different, and it’s her thing, we can all look back on it and laugh in ten years. But if everyone else is doing it too, we’ll just end up embarrassed for the entire era. Remember the 80s?

UPDATE: I watched the latest episode of America's Best Dance Crew tonight. Nicki Minaj and Lil' Mama touched without exploding.

Monday, May 23, 2011

Anthropomorphic Food Sabotages Itself

My mom always says, “I won’t eat anything with a face.” While she’s referring to food that still has the face attached, this is part of the guilt-inducing aspect of meat-eating that causes lots of people to go veggie. The idea that the food you’re currently eating was at one time sentient and capable of higher thinking, self-awareness, reminiscence, and even dreams, is too much for some people to take.

This doesn’t make you a bad person for eating meat, but if you do, you fit into one of three categories:
  1. You enjoy the thought of being the more successful carnivore.
  2. You’re neutral on the subject and don’t care. It’s just the way things are.
  3. You try not to think about it too hard because bacon is just so damn good.
The consciousness of animals prior to consumption is a major hurdle that the meat industry must clear to keep up their profits, since it’s a pretty easy way to lose customers. If their product never had a face, they’d probably sell a lot more of it.

This is why it baffles me when a food company chooses a marketing campaign in which they anthropomorphize their product. Taking a product that has no ability to think and giving it human qualities may increase the humor in the advertisement and therefore its effectiveness, but it also willingly adds this extra burden to the process. Once you’ve seen the product exhibit its human-like characteristics, you’ve got to fit into one of the three categories shown above, or you discontinue use of the product altogether.

Foremost in this category is the branding used for M&Ms for quite some time now. In nearly all commercials for the chocolate candies, a regular M&M is paired up with a peanut-filled M&M for Laurel-and-Hardy-style hijinks. In some of these advertisements there are even direct references to the duo being consumed, leading to sheer terror on the faces of the characters. This commercial features a representative of each of the four main types of M&Ms being produced:

These commercials even go so far as to portray disturbing and reluctant experiments performed on the sentient candies—purely for the pleasure of consuming them. This commercial suggests an uncomfortable situation for everyone involved:

But it gets worse. McDonald’s used to be one of the worst offenders, with an entire cast of disturbing food/human mutants including Mayor McCheese and the Fry Guys, though they’ve since retired the whole lineup, save for the beef-loving clown. However, the most disturbing commercials in the restaurant’s history involved the imagery of Chicken McNuggets eagerly jumping into barbecue sauce—which, as we all know, is the last step before being torn to shreds by dutiful incisors. This commercial specifically drives home the point, as Ronald lovingly recounts the short life of the McNugget bunch and alludes to their doom, all the while patronizing the little victims’ cheerful mood:

This type of marketing may induce a bit of guilt, but it is at the same time intended to be humorous and entertaining. Likely the humor wins out over the guilt, and product sales increase as a result, but when you start to add cannibalism to the conversation, things start to get a bit dicey. I’m referring specifically to the use of pigs on barbecue restaurant signs.

Sure, pigs will eat anything, but this one is a serious traitor who just so happens to be selling out his brethren for consumption by humans. I’d actually feel better about this entire situation if they used a dead cartoon pig with Xs in its eyes rather than the curiously confident pork server carrying a tray of dead pig flesh. But hey, most barbecue customers probably fit into category 1 or 3 anyway.

But the icing on the cake—scratch that; the mouth crammed full of ground beef—is the Hot Pockets Sideshots campaign, which features a trio of brothers describing how delicious they taste just before one of them is whisked away for execution. The remaining pair, who were so confident in their flavor just moments before, stand trembling, paralyzed with fear and fully aware of their ultimate demise:

There are two main things about this marketing campaign that ensure that I will never, ever eat this product. The first is the aforementioned ball-gag-o-beef shoved into each mouth, which makes it look like the little guys are vomiting their internal organs. The second is the anthropomorphic fear attributed to the characters. Unlike the M&Ms, who treat their consumption with as much finality as a shotgun blast to the face in a Bugs Bunny cartoon, or the Chicken McNuggets, who eagerly dive into barbecue sauce as if being eaten is their only joy in life, the Sideshots clearly don’t want to be food. Their horrified expressions at the end seal the deal.

So why give human qualities to products? It’s entertaining. The ultimate purpose is to sell the product, and boring marketing schemes rarely work, so generating interest either by comedic situations or even controversy generally works well. Awareness, more than anything, is important; the more people that know about your product, the more potential customers you have. Just don’t expect me to eat your anthropomorphized food any time soon.

Sunday, May 22, 2011

Dandies: The Original Hipster

Sure, we’re all aware of the hipster culture in our cities. They crowd dive bars, lock up their fixed-gear bikes to every telephone pole in sight, and tend to have an all-around holier-than-thou attitude. Their passion for organic food and fingerless gloves can be annoying, but we’ve got to learn to live with them.

This is not the first era to experience a segment of the population that takes itself more seriously than it deserves, of course. In fact, the hipster can be traced back through several generations, at least as far as the 18th-century dandy.

For those who don’t know, a dandy was (and arguably remains) a man who presents himself with a style that Charles Baudelaire described as the “cult of self.” It is for this reason that I consider the classic dandy to be the original hipster. Let’s look at a few side-by-side comparisons to explore this topic further.

Neckwear

As a male, it’s hard to find accessories that work. Rather than just throw on jewelry—rings, necklaces, bracelets—the hipster opts to utilize clothing to decorate a part of the body that’s often overlooked by the male population: The neck. This amounts to wrapping up in a scarf—even in 90 degree weather.

The dandy was a big fan of a similar sort of neckwear: The ascot. Championed by the gentry of the late 18th century, the ascot became an integral part of the dandy’s attire, and hardly a self-respecting fashionable male would be caught dead without one.

Method of Transportation

Permanently on a quest to “make a difference” and reduce their dependence on fossil fuels, typical hipsters choose the bike as their transportation, but not just any two-wheeled apparatus will do. To emphasize the idea that they don’t need to go very far to accomplish their daily routine, hipsters prefer a fixed-gear bike, or “fixie,” to the 3-, 10-, or 21-speed bikes popularized by the rest of the population. The result is a single-speed, foot-powered device they use to roll from place to place.

Though it was short-lived, a similar fad erupted in the 1820s in which dandies scooted around on a pioneer of two-wheeled transportation called the “dandy horse.” Lacking gears entirely, the dandy horse was a purely foot-powered device that allowed menfolk to glide around town in style—without having to rely on the barbaric custom of riding a horse.

Being Mistaken as Homosexual

Being an eclectic dresser has its consequences. The term metrosexual, used to describe a man who takes his fashion seriously, has made its way into the average American’s lexicon, and it can describe both groups quite well. Due to the hipster’s obsession with hair products, tight jeans, and satchel bags, some people will mistake a hipster for a woman from afar; upon closer examination, once the observer realizes the person is male, the hipster will often be erroneously identified as a homosexual.

Of course, some hipsters are homosexual, and the term dandy has grown to be a somewhat derogatory term for a male homosexual anyway. By the 1920s, dandies had become less tolerated than in their 19th-century heyday, and conservative individuals began to use the word to describe an effeminate man, often called a dandy fop. The subculture came under heavy scrutiny from anti-gay citizens until the cultural revolutions of the 1960s.

Elitist Attitude

Hipsters have been known to make disapproving guttural noises when overhearing someone else talking about their musical tastes. They also tend to hang around in areas where they can be seen using the brand new laptop they just bought, which is clearly better than yours. If you don't know why, they're not even going to bother to explain it to you.

Likewise, the dandy was often characterized by an elitist attitude, being men who cared intensely for their fashion and taste in decoration and lifestyle in general. Dandies were often seen standing in groups near the entrances to theaters and informing customers that they'd "already seen it" and that it "wasn't as good as [obscure play]."

If all of this wasn't convincing enough, consider that American dandies became known as "dudes" by the end of the 19th century. Now picture a room full of Victorian-era hipsters riding their dandy horses to the nearest theater in 90-degree weather while wearing fanciful neckwear and calling each other "dude." Sound familiar?

Saturday, May 21, 2011

Nintendo Always Gets it Right

From the moment my friend’s mom brought home a game called Metroid, I knew I was about to begin a life-long friendship. My friends down the street were a bit wealthier than my family, and they had rushed out and bought the brand new Nintendo Entertainment System as soon as it hit the market in 1985, while I made do with my Commodore 64 at home. The C64 was an excellent, versatile machine in its time, but the Nintendo had the advantage of convenience: Instant load times and two buttons on the controller.

Nintendo made nearly all of the best games on their own console, from the heralded Super Mario Bros. series, to Punch-Out, Metroid, and Kid Icarus. A large cult following grew around the Legend of Zelda games that exists to this day. This console was a smash hit and revived the dying home-gaming market. As a companion for kids who just couldn’t get enough gaming, the Game Boy broke ground as the first seriously good portable game device—a stark contrast to the dismal Game & Watch portable games you’d nag your mom for in the checkout line of the toy store.

Even before we were sick of the 8-bit NES, along came the Super Nintendo, blowing our minds with simulated 3-D action, a much larger color palette, and extra voices in the audio. Immediately, Super Mario World was the front-runner, the enormous leap in technology we hadn’t expected. Other Nintendo-produced games for the SNES included Super Mario Kart, Pilotwings, and F-Zero—all incredible showcases of the console’s simulated 3-D effects. Unfortunately, most of the games made for this console were forgettable third-party attempts, consisting mostly of horrific cartoon tie-ins, sports games, and unplayable racing games. Nintendo and Capcom were the only ones who seemed to get it right, and when they did, the results were phenomenal.

Even before these two legendary consoles arrived, Nintendo was cranking out the hits, first striking gold with a little game called Donkey Kong in 1981. This classic stand-up arcade machine would draw a crowd, being a sure bet for profit in any arcade and marking the first appearance of the face of Nintendo: Mario (though he was known as Jumpman at the time). Other great arcade games, including the original Punch-Out and Mario Bros., also succeeded greatly, pulling in mountains of quarters.

So after fifteen years in an industry with a massive turnover rate, the bubble had to burst, right? Nope; in 1996, Nintendo released the N64, the first home console with a 64-bit CPU. In classic Nintendo tradition, they released the first “true” 3-D video game, Super Mario 64, and revolutionized the platformer genre in the process. With its wacky game controllers sporting an ever-increasing number of buttons, the console opened up new avenues for gaming possibilities; no longer were your choices restricted to merely jumping, running, punching, or kicking—buttons could now control item switching, maps, flying, sharp turning, spinning, and just about anything else a programmer wanted a video game character to do. GoldenEye 007, the first-person shooter Rare developed for the system under Nintendo’s banner, was so popular that it was given a complete re-imagining and released on the Wii more than a decade later. The much-anticipated kart sequel, Mario Kart 64, remains one of the most beloved video games of all time.

Meanwhile, all Nintendo’s previous competitors died off: Atari went bankrupt; the C64 ceased production in 1994; Sega, developers of the incredible Master System and Genesis, with its stiff competition in games such as Sonic the Hedgehog, was chugging along, trying to keep up. Though Sega was clearly Nintendo’s chief competition and had the superior mobile gaming device on the market (the Game Gear), the N64 killed off demand for the Sega Saturn, which was discontinued after just a few years on the market. Sega responded by focusing on its next-generation console, 1999’s Dreamcast, which featured incredible graphics but was short-lived due to new consoles from Nintendo and the long-time electronics industry champion, Sony.

The year before the N64 hit stores, Sony had introduced the PlayStation, and the two consoles choked Sega’s profits. No longer was it Nintendo vs. Sega, but Nintendo vs. Sony. Both companies raced to answer the Dreamcast and bring something new to market. Sony’s PS2 arrived in 2000, and Nintendo launched the GameCube in 2001. While the PS2 knocked its competitor off its pedestal, Nintendo continued to release super high-quality games for their less flashy console, beginning to focus more on the upbeat, cutesy market that Sony had no history of. The results included Mario Kart: Double Dash!! and Super Smash Bros. Melee, the sequel to the N64 surprise hit that broke down the walls between video games. These games pitted Nintendo’s classic characters, from Mario and Luigi to Samus from Metroid and Link from the Legend of Zelda games, against each other in rounds of full-on battle.

At this time, Microsoft entered the market with the Xbox and stole away the first-person shooter market almost entirely with their excellent online multiplayer support. This proved to be a much bigger jab at Sony’s share of the industry, as Nintendo continued to move more toward RPGs and puzzle games and released a new handheld game system called the Game Boy Advance (Sony’s PSP wouldn’t arrive to challenge Nintendo’s handheld dominance until 2004). Enter the seventh generation of home consoles.

As Microsoft released its Xbox 360, improving on much of the same format that succeeded before, Sony released a hulking, steroid-filled console dubbed—predictably—the PS3, designed more as an all-encompassing home entertainment system (complete with a Blu-ray disc player). However, Nintendo made a bold move and revolutionized the controller. The result was the 2006 release of the Wii, which included the unique Wiimote, a stick-looking device with few buttons that featured accelerometers to detect its movement through the air, allowing for an extra level of realistic interaction with the game. The redesign was a sleeper hit, picking up momentum as people realized it was a valuable tool for those unaccustomed to controllers with dozens of buttons, particularly older gamers who were already middle-aged by the time the NES came to be.

Simultaneously, the company released its third hand-held game system, the Nintendo DS, which featured two color screens, one of which was tappable. The buttons-and-stylus method was a success, instantly comfortable for anyone who had already used a stylus on a PDA. Having little-to-no competition in this field, the DS dominated the market.

Meanwhile, with its interactive sports games already popular with a hugely wide demographic, the Wii continuously received sequels that built upon Nintendo’s previous hits: The Legend of Zelda: Twilight Princess; Metroid Prime 3: Corruption; and of course, the Mario games: Mario Kart Wii, Super Smash Bros. Brawl, New Super Mario Bros. Wii, and two of the greatest games ever made, Super Mario Galaxy and its sequel. Each one of these games achieved such an unprecedented level of success and acclaim that it felt like the company might never slip backwards.

To compete with this revolutionary control redesign, Microsoft developed the Kinect, which was released last year, and Sony developed its own wand-like device. Though both seem to be viable alternatives to the Wiimote, Nintendo’s looking forward. They recently released the first glasses-free 3-D handheld console, the 3DS, and are getting ready to make some big announcements about their upcoming console, tentatively known as the Stream.

Little is known about the Stream (which is officially being referred to as "Project Cafe"), except that it will be larger and much more powerful than the Wii (including HD capabilities for the first time), and the controller will more closely resemble the traditional two-hand controller used on virtually every other home console ever made, with the addition of a color touchscreen placed in the middle. Though skeptics abound with every released statement regarding this console, I’ve got no fear. The company has a 30-year history of cranking out the most high-quality games on the market without a single mistake; with sales of the PS3 dropping and the next-generation Xbox nowhere to be seen, I think Nintendo will handle itself just fine.

Friday, May 20, 2011

Obituary: "It's On Like Donkey Kong"

The phrase “It’s On Like Donkey Kong” passed away last night. It was 19 years old.

It was born in 1992 to a young Ice Cube on the hip-hop album The Predator. With the help of producer DJ Muggs, it appeared as the first line of the song “Now I Gotta Wet’cha.” Its siblings, “You wanted that fast buck” and “Now I gotta light that ass up,” didn’t survive, but the phrase itself became well-known within the hip-hop community, enjoying mild fame before living a relatively quiet adolescence.

Content warning: Liberal use of the "N" word by an African-American rapper
The phrase was named after the extremely popular 1981 video game Donkey Kong, a favorite in arcades worldwide. “It’s On Like Donkey Kong” referred to the frenzy that surrounded the arcade cabinets as video game enthusiasts controlled an early portrayal of the famous video game character Mario to save a damsel from the eponymous giant ape.

Good friend Urban Dictionary shed some light on the phrase’s legacy: “It was a phrase used to denote that it was time to throw down or compete at a high level; that something was about to go down. I’ll miss the fool.”

Though it had a bit of trouble as an adolescent in the mid-90s, things picked up in 2003 when it fully developed into a culturally ubiquitous phrase thanks to its use by the character Stifler in the movie American Wedding, though the “biotch” that appeared alongside it didn’t catch on.

(“Biotch” would later confront “It’s On Like Donkey Kong” in an awkward encounter on Facebook after a couple years of abusing anabolic steroids and lifting weights.)

Great fame followed, and the phrase was a favorite of frat boys, high school students, and ironic t-shirts across America. For a significant period it was “the” hot phrase and enjoyed widespread fame, even being used by professional journalists in news publications as prominent as The Boston Herald.

However, things took a turn for the worse on Christmas Day, 2008, when Jetcomx published their article defaming the phrase, calling it silly and irrelevant, even in its original use by Ice Cube. In the article, Micah Nathan lamented that “Donkey Kong isn’t a gangster game. It’s not even close. It’s a mustached Italian guy trying to rescue his girlfriend while a giant ape throws barrels at him.” The article proved to be a crushing blow that the phrase never quite recovered from.

Late in life, the phrase appeared in a few movies including the 2010 rom-com The Switch. The phrase was even offered its own theme song by dark-comedy hip-hop act Blood on the Dance Floor, accepting the proposition as its fame began to wane, but the song never caught on, and “It’s On Like Donkey Kong” retreated from the public eye for several months.

Content warning: Liberal use of sexual innuendo by a tiny Caucasian male
During this time of great struggle, Nintendo, the company that created the phrase’s namesake, began a campaign to raise awareness of the phrase. They filed a trademark request with the U.S. Patent and Trademark Office attempting to receive custody, and booked it for an appearance promoting the new video game Donkey Kong Country Returns. Unfortunately, it seems that it was just too late.

The phrase passed away on May 19, 2011, as E! News’ Jason Kennedy read it off of a teleprompter while referencing a news story about Maria Shriver filing for a divorce from Arnold Schwarzenegger. Exhausted, overused, and no longer culturally relevant, the nineteen-year-old phrase couldn’t take any more and expired peacefully.

It is survived by its siblings “My Bad” and “Chillin’ Like a Villain.”

5 End-of-the-World Prophecies that Didn't Come True

Sorry, you’ve only got one more day to live. At least that’s what the people from www.wecanknow.com want you to believe, as they’ve predicted the apocalypse to begin on May 21, 2011.

Why Saturday, May 21st? Well, it just so happens to be 7,000 years to the day from when Noah first set out in his ark with his menagerie of animal pairs. Duh!

In celebration of the end of the world, let’s take a look back at five of the greatest end-of-the-world prophecies that never came true.

  1. Millennial Return, 1000
The original date of mass hysteria. It only seemed to make sense to the people of the late 10th century that Jesus would reappear on January 1st, 1000. After all, it’s a round number, right? Are people really going to have to wait longer than one millennium for the guy to come back and take the true believers?

In true apocalyptic fashion, caravans of believers roamed the world proselytizing, trying to be as saintly as possible, and ridding themselves of all their possessions. Everyone was so certain that adding an extra digit to the year tally would bring about widespread destruction that they didn’t plan for a single moment beyond that day. As an act of kindness, all criminals were released from prison to live out their last days on Earth with freedom—freedom to live a consequences-free doomed lifestyle for a few more days.

Of course, the year 1000 arrived with no apocalyptic horsemen. A stunned and baffled Christian population went back to work, rebuilding the society they’d temporarily allowed to crumble. Oh, also, a bunch of recently released criminals ran far, far away.

  2. The German Peasants’ War, 1525
Leading an angry and simultaneously frightened swarm of German peasants, a theologian named Muntzer declared his conviction that Jesus would arrive more than 500 years late because he and his peasants hadn’t risen up to destroy The Machine. Apparently excited about the prospect of an early apocalypse, he amassed an enormous army of villagers. These peasants, armed with the power of Christ (and no doubt pitchforks), sparked what is now referred to as the German Peasants’ War.

Peasants having a stick fight
To someone not paying close attention, this squabble might appear to be a civil war, but it was actually a plan to see Jesus and get to heaven. The peasants decided they’d have to kick-start the apocalypse themselves, and they put in a great effort, right up until the German army showed up.

Muntzer had a vision from God in which he caught the cannonballs being fired at his army like a highly lethal game of dodgeball. Instead, the cannonballs took down close to 100,000 peasants, and Muntzer himself was captured, tortured, and decapitated. Oops!

  3. The Great Disappointment, 1844
Baptist preacher William Miller became so obsessed with the book of Daniel that he managed to interpret Jesus’ ETA to be sometime before March 21, 1844. He was apparently a really excellent preacher, because his argument for the impending apocalypse convinced enough people that a movement called Millerism formed, with his followers calling themselves Millerites.

Miller's followers actually had to pry the exact date out of him, since he didn’t want to commit to a specific day. Instead, he gave a one-year window. When the prophesied date came and went, the Millerites rallied together and re-estimated the date of October 22, 1844.

When nothing happened, yet again, some of his followers actually moped around for days, “sick with disappointment,” while townsfolk became enraged and violent toward the Millerites. Miller himself continued to wait for Jesus for five more years before he died, but his prophecy lives on, now known as the Great Disappointment.

  4. The Planets Align, 1982
Scary poster from the 70s
In the first major end-of-the-world scenario that didn't involve Jesus showing up and wrecking the party, some scientists forecast a catastrophic celestial event for 1982: all the planets in the solar system would align, their combined gravitational pull triggering extreme natural disasters on Earth, from massive earthquakes to solar flares engulfing the planet. California was singled out as the most susceptible zone.

Newsweek magazine first reported on the impending end of life-as-we-know-it in 1974, allowing for a full eight years of preparatory panic. In the meantime, a science editor of Nature magazine and a NASA scientist co-wrote a book called The Jupiter Effect in which it was proposed that Jupiter would exact a force on the Earth similar to the tidal changes brought about by the Moon.

As the planets aligned and the world winced a little bit, many prepared for the worst. The planets then resumed their staggered positions around the Sun, and nothing happened. Again.

  5. We Can Know, 2011
In 1992, Harold Camping, a Christian radio broadcaster, added up numbers in the Bible and arrived at a total of 1994; he then wrote a book entitled 1994? suggesting that a new, accurate apocalypse date might be September 6, 1994. The idea stemmed from the belief held by most Christians that it would be impossible to know the exact date of the apocalypse; Camping argued that certain signs in the Bible actually allow us to know, and that a more likely doomsday would be May 21, 2011.

Moving billboard/fishtank
Thanks to his saturation of the media—dozens of radio stations and thousands of billboards—Camping’s prophecy gained steady momentum and attracted a large number of followers, who turned their cars and trucks into moving advertisements. Just as in 999, believers took to roaming the Earth, preaching of the end and giving up their worldly possessions.

After a five-month period of turmoil and suffering on Earth, time will end on October 21, 2011. Enjoy the Rapture—it starts tomorrow!

Thursday, May 19, 2011

He's Just a Clown.

McDonald's, the fast food juggernaut with unbelievable staying power, recently became the target of nutrition advocates who argue that the company's kid-friendly mascot encourages children to eat food that makes them overweight. This isn't the first time the company has been at the center of the childhood obesity debate, and it's not the first time that someone has suggested that Ronald McDonald retire.

Developed in the 1960s and played by the delightfully weird Willard Scott, Ronald McDonald—the character—began as a typical silly clown, a marketing scheme that few expected to stick around, much less morph into the ketchup-and-mustard-colored smiling spokesperson he is fifty years later.

As you can see, the mascot probably turned off a good number of potential customers who suffer from coulrophobia, especially at stores where a life-size replica of the guy sits like an embalmed stiff on a bench outside. Regardless, the dissenters insist that McDonald's unfairly uses the clown to market to children, who then drive themselves to the restaurant every morning for Egg McMuffins and eat lunch there four or five times per week.

I can see their point; after all, kids who can't drive have generally developed very effective nagging skills that most parents are powerless against. There's nothing like a screaming, out-of-control kid who's not getting his way, and plastic hamburgers are excellent ways to shut their faces.

Documentaries like Super Size Me have explored McDonald's nutrition and made convincing arguments that the chain strategically places playgrounds in rural areas that lack a public playground, creating a whining double-whammy against even those parents who are excellent at ignoring their child's unreasonable demands. Also, they give away toys with their food. A parent just can't compete.

But actually, those playgrounds kind of act as a public service, bringing a fun play area to a region that may not be fortunate enough to afford one, and supplying plastic food at the same time. It's two birds with one stone for many parents. In fact, it's actually a lot better than McDonald's supplying a bunch of couches with video games, right? After downing 900 calories worth of food, that kid's gotta burn off something.

I'm actually pretty certain that the clown, the playgrounds, and the toys are not the cause of childhood obesity, but combined with the endorphin-releasing plastic food that kids crave, the whole package becomes a whine-inducer so strong, so powerful, that parents will repeatedly take their kids there despite the obvious health concerns.

McDonald's takes a different approach. Rather than putting the rightful blame on parents for letting their children's weight get out of control, they continue to insist that their food is perfectly healthy in moderation, and even go so far as to say that it's "high quality."

What good would firing Ronald McDonald do? He didn't do anything wrong. He didn't force-feed the kids like the cows that were made into the hamburgers. He didn't drop an f-bomb on the air at any point. He didn't even give inappropriate hugs to little children. McDonald's already quietly retired its entire cast of anthropomorphic characters, including (the very unfortunately named) Grimace, Birdie the Early Bird, Mayor McCheese, Officer Big Mac, the Fry Guys, Hamburglar, Cosmc, Mac Tonight, The Professor, Captain Crook, the McNugget Buddies, Grimace's Irish Uncle O'Grimacey, the Hamburger Patch, the Happy Meal Gang, Bernice, and even Ronald's dog, Sundae. THEY TOOK AWAY THE POOR BASTARD'S DOG. LET HIM KEEP HIS JOB! Sorry.

Ask a little kid: Which part of McDonald's do you like most? Is it:
  • The endorphin-releasing plastic food
  • The toys included with the plastic food
  • The playground
  • The fact that mom takes you there every time you cry
  • Or the beef-loving clown?
My guess is they're not going for the stiff clown with a fixed stare and permanently crossed leg propped up on the bench outside. Actually, McDonald's is probably doing its part to prevent childhood obesity by accelerating the development of the vegetarian lifestyle partially inspired by the disturbing imagery of a beef-loving clown. Let's take a look at some statistics concerning American children:
  • Percentage of children that go to McDonald’s at least once a month: 85% 
  • Percentage of children who are obese: 33%
  • Percentage of children with crippling clown fear: 15%
Actual figures retrieved from semi-reputable sources

Well, that's alarming. Imagine if we removed the clown element altogether. That obesity rate would skyrocket!

McDonald's is not going to drop their biggest marketing device. Ronald McDonald is one of the most recognized characters of all time, and is identifiable in nearly every country in the world, as the Thai Ronald McDonald statue shown above proves. Dropping him would be one of the stupidest moves in marketing history. It might be different if he were dressed up in blackface, but he's not. He's just a clown.