time travel

Wait a minute, Doc. Ah… Are you telling me that you built a time machine… out of an industrial age understanding of the arbitrary segmented concept of time as it pertains to human understanding of the revolutions of our world around the yellow dwarf, main sequence star that we orbit?

Time travel is an interesting concept, and it is a wholly new one. Most science fiction concepts that we have come to know are really just rehashed versions of older ideas. For example, robots are merely re-imagined versions of clay golems, and there are ancient stories from many cultures about space travel and visiting other worlds. Yet, time travel is a completely new concept for our society -relatively speaking- and that has a lot to do with how we humans have come to see the movement of the sun and time in our world.

Forward to the Past
Now, let’s be clear. The concept of moving through time does have some ancient roots. Tales like the Mahabharata, the Japanese story of Urashima Tarō, and the Jewish tale of Honi ha-M’agel all talk about movement in time. The most common tale is of a character who leaves his home, gets into some otherworldly shenanigans, and then comes back only to discover that it is many years in the future. Everyone he knows is dead and he has long been forgotten. These tales, though they involve moving forward in time, are not time travel stories as we think of the modern concept.

When the characters return to their lives they discover that the world has changed, but not in any significant way. The world remains as it always was, but the people are different, and the character discovers that they have suffered not only a mortal death but a second death: the death you suffer when there is no one left who remembers you or your deeds in the world. These are stories more about the tragedy of mortality and the concept of remembrance than about the concept of moving through time. They do not talk about the progress of the world or its people. They focus on the allegorical understanding of mortality and the tragedy/reality of insignificance.

That is because of how our ancestors thought about time and the movement of the heavenly bodies. A lot of ancient cultures perceived time in a cyclical manner. They rose with the sun and set with it too. Every day was an affirmation that the world ran on cycles. All things young would become old, and the world would continue in a series of cycles the way it always had. They ate when they were hungry, worked when the sun was up, planted with the seasons, and slept with the night. It was an existence without an understanding of what 5:00 am meant, or 11:34 am, or 6:45 pm. Those arbitrary numbers meant nothing to them. They judged the day by the passing of the sun or the movement of the people and the animals around them. To them, stories about moving forward in time were more personal, because time was a more personal concept. It was the cycle of your life, which was just a part of a larger series of cycles. When Urashima Tarō is flung into the future, his own cycle is disrupted and he finds himself in a new one. This is a completely different understanding from how we in America think of time today.

Wibbly Wobbly Linear Time
Time is a property of space, but it is also a concept of human understanding. Even today different cultures have different understandings of time. Many Asian and eastern cultures still adhere to a version of cyclical time. Many Mediterranean peoples, like Italians, Spaniards, Greeks, and some Arabic cultures, adhere to what is called Multi-Active Time, in which time is valuable but not as valuable as relationships. Appointments can be pushed and the passing of the clock can be ignored if something or someone more important arises during the day. Most Western cultures, however, especially Americans, the British, Germans, and the Swiss, adhere to linear time. That is the belief that efficiency comes from sticking to schedules. If a bus is meant to leave at 12:02, then it had better leave at 12:02. We run our lives by the ticking of our clocks. We see time as a straight line, from the past to the future, and maybe it is no surprise that the first modern tales of time travel arose from these cultures.

The concept of linear time has its beginnings in the Renaissance, when early clocks began to be produced, but it was not until the Industrial Revolution that the concept really caught on. There is a reason that Greenwich Mean Time is the standard average time of the world. The Industrial Age began in the UK, and it forever changed how we perceive time. -That may also be the reason why one of our most famous time travelers calls the UK home, but that is just conjecture.- The working populace was no longer bound to the sun and the fields, but to the clock and the factory line. The perception of time was also bolstered by the mass production of clocks and pocket watches. Suddenly, it was fashionable to wear timekeeping pieces and have clocks in your own home. The people of London and elsewhere were literally surrounded by reminders of time.

Enter into this atmosphere H.G. Wells. Wells was not the first person to write a modern time travel story, but he was the most memorable. He even coined the term for the device that travels through time: the Time Machine. In his 1895 book, a scientist invents a machine that allows him to travel to a future completely alien to his own world. Wells incorporated other contemporary scientific understanding into his work, most prominently Darwin’s theory of evolution. This is noteworthy because the new scientific understanding of evolution, as well as the measurable progression of technology, also helped propel the human race’s understanding of how time affects our world. The Time Machine was one of the first modern time travel stories because it shows how the world and its people change from time period to time period. Wells is not necessarily concerned with the personal journey of the traveler, but with the journey of time itself as it molds our future and our species.

There were also precursors to Wells’s story. For instance, Edward Page Mitchell was the first person to write about a device that travels back in time, to the 16th century. Yet the one we should focus on is Washington Irving’s 1819 Rip Van Winkle, which is worth mentioning for two reasons. First, it follows the tradition of “man wakes up in the future,” which we talked about with earlier examples. Second, there is an American twist: Rip falls asleep in the British Colonies and wakes up in the United States of America. His son is grown, his friends are dead, and his whole country is different. That last part is the important aspect, because it registers a change in the world. This change is more political than technological, but it still lends itself to an awareness of the passage of time. We hesitate to call it true time travel, but it shows an evolution from cyclical to linear thinking.

Bill and Ted’s Excellent Paradox
Philosopher John Hospers wrote in 1953 that time travel was “logically impossible.” What he was perhaps trying to say is that time travel is hard, and wrapping your brain around it is even harder. Any trip you take to the past would create a paradox, in both time and our understanding of time. What if you go back and kill your own father, or -even worse- learn that he was actually a pretty cool guy before he had kids? These sorts of brain-bending concepts may be why the majority of those original time travel stories were about people traveling to the future.

Time travel stories in the 19th century did examine the past, such as Charles Dickens’ A Christmas Carol, where Scrooge is sent back in time to observe his own childhood. Mark Twain’s A Connecticut Yankee in King Arthur’s Court is also notable, as it sends its protagonist to the past, but Twain was more concerned with lampooning chivalry than with any questions of paradox. Thus, a lot of those early “travel to the past” stories were more about adventure or fancy. For us, the most interesting time travel concept emerged in an odd place: The Defence of Duffer’s Drift. Written in 1904 by Major General Sir Ernest Dunlop Swinton -which is a name more English than meat pies- it details the adventures of Lieutenant N. Backsight Forethought during the Boer War. His unit is attacked in six “dreams.” Each time, Lieutenant Forethought uses the knowledge of the past dream to change his tactics and learn from his mistakes. It was written to promote critical thinking in the British military, but in doing so it also captures an essence of why we tell time travel stories in the first place.

As Hospers pointed out, the notion of time travel is nearly inconceivable from a logical standpoint, and yet we do it all the time: Back to the Future, Bill and Ted’s Excellent Adventure, Terminator, Star Trek, Doctor Who, Futurama, and more tend to deal with going back to the past and creating paradoxes. In a way this new genre has helped us think differently about how time operates and about how we operate in time. Which time period would you travel to? How would you change history if you could? What do you want to see most about the future? These are questions that our ancestors rarely asked themselves, because there were no words and no ideas on which to base them. Time travel literature has expanded our societal understanding. It has challenged us to think in new ways, and that is kind of the point.

The City on the Edge of Understanding
Early time travel tales were personal. They were about people’s lives, because time was a part of us. Then, starting in the early 19th century, time travel became much more cultural. It stopped being about just us alone, and became more about the world we inhabit. After 1887, there was a time travel story published almost every year. After 1950, one or two time travel stories were being published every year. These days there are hundreds of time travel stories published every year. As we have watched our technology evolve, our political landscape grow, and our world change, we have become more aware of the passage of time and the many ways in which it could or should have gone differently. Democracy itself contributes to this, as we continually find ourselves living between regimes and buffeted in the currents of change.

As life imitates art, so does life imitate time travel. These stories have not only come about because of our new concepts of time, but they have also contributed to them. We have become more appreciative of time and the ways in which it ebbs and flows. The “logical impossibility” of Hospers has been conquered in our minds, and replaced with a longing for the past and a desire to know the future. Perhaps that is both a blessing and a curse. We have an appreciation of the past only because of the regrets we live with, the baby Hitlers we could -maybe even should- have killed along the way. Regardless, the concept of time travel is here to stay. It is both a symptom of and a precursor to our modern society, and it is a sign that we have evolved in our thinking, or at least in the way we deal with our own abstract understanding of time.

“The way I see it, if you’re gonna build a time machine, why not do it with some style?”


Revenge of the Sith is mostly Revenge of the Sh… well… you get what we’re saying. However, there is something for which we should give the prequels more credit. George Lucas’ prequels -made between 1999 and 2005- actually give us some surprising insights into our own times, here in the United States. No, we are not talking about the racist overtones of Jar Jar Binks, the rise of the American Order of Jedi, or the fact that Chris Christie is a Hutt. We are talking about the fact that the prequels give us a surprisingly realistic insight into the rise of fascism… yeah, it’s going to be a long one.

…With Thunderous Applause
As Americans we have a sort of general, dictionary-level idea of fascism. We bandy the word around enough that we think we understand what it is, but the truth is that we don’t… and that can be dangerous. The United States, much like the Galactic Republic at the beginning of the Clone Wars, is already in the first stages of fascism. Many people will reject that statement, because they think they know what fascism is, and again… they are wrong. People think of Germany in 1940 and say that is fascism: the marching, the swastikas, the vilifying of the free press, the persecution of minorities, the shouting crazy man who holds rallies… hmm… Well, that is fascism too, but remember that Germany did not turn into that overnight. It was a process that transformed the Weimar Republic into the Third Reich. It was a process that transformed the Old Republic into the Galactic Empire, and if we aren’t careful it could transform us as well.

The word “fascism” comes from “fasces” or “fascio littorio,” a bundle of rods tied around an axe. In ancient Rome it symbolized the authority of the magistrate, and it was used for corporal and capital punishment. Mussolini and his compatriots in the early 20th century picked the symbol as a show of unity and strength, but the truth of fascism is that it is a movement typified not by unity, but by opposition. In Germany and Italy before World War II, it rose in opposition to Communism and Liberalism. That is important to remember, because even though it was called the “National Socialist German Workers’ Party,” the Nazis were not socialists. It’s a purposely misleading name, like the “People’s Republic of China” or “Award Winning Director JJ Abrams.” The Nazis despised the idea of socialism. Fascism is the opposite of socialism. It is about rationing resources for the select and worthy few, the “right” people.

At the start of A New Hope, Palpatine disbands the Imperial Senate, ridding the Empire of the last vestige of liberal democracy. That is important, because fascism is a reaction against liberal democracy. The Emperor gives direct control over the systems to the governors and moffs, the strongmen of Imperial society. They were the wealthy, the connected, and the loyal. In essence they were the “right” sort of people. Fascism thrives in societies of strongman machismo and totalitarian one-party rule, but that doesn’t happen overnight. Military coups and juntas are quick and bloody, but electoral authoritarians come to power gradually. They chip away at democratic institutions until they are toothless or non-existent. Fascism isn’t an overnight event. Despite Padmé’s on-the-nose proclamation at the announcement of the formation of the Galactic Empire, she did not suddenly wake up that morning and find herself in a totally new government. She had been living under fascism for a long time; she just never realized it.

The Prequels We Don’t Talk About
Robert Paxton is one of the most acclaimed authorities on the study of fascism and authoritarianism. We are giving you a very brief version of his argument here, but if you are interested in learning more you should check out his writings. Paxton claims that there are five stages of fascism, and they are worth talking about:

  1. Intellectual Exploration, where disillusionment with popular democracy manifests itself in discussions of lost national vigor;
  2. Rooting, where a fascist movement, aided by political deadlock and polarization, becomes a player on the national stage;
  3. Arrival to Power, where conservatives seeking to control rising leftist opposition invite the movement to share power;
  4. Exercise of Power, where the movement and its charismatic leader control the state in balance with state institutions such as the police and traditional elites; and
  5. Radicalization or Entropy, where the state either becomes increasingly radical, or slips into traditional authoritarian rule.

In Star Wars we see all five of these stages. Intellectual Exploration happens in Episode I, when Chancellor Valorum is ousted with a vote of no confidence after it is shown that the Galactic Republic has no way to enforce its laws against the Trade Federation’s blockade of Naboo. This also shows stage two, Rooting, where deadlock and polarization in the Senate give rise to Palpatine, a strongman leader who becomes a player on the galactic political stage. Arrival to Power happens when the Clone Wars begin. Palpatine uses the existential threat of the Separatists to rally the Senate and the people. The other politicians -Jar Jar Binks- invite Palpatine to take emergency powers in order to control the rising threat and benefit themselves. The fourth stage, Exercise of Power, happens throughout the three-year conflict of the Clone Wars. Palpatine rules in balance with the existing laws, while slowly chipping away at them in the name of security. He uses the fear of the people to reduce their rights and make the Galactic Republic into a more militaristic society. We even see the Jedi transform from peace-loving monks into battle-hardened generals.

The final stage, Radicalization or Entropy, is what Padmé remarks on that day in the Senate. Palpatine has successfully won the Clone Wars. He has convinced the people of the evils of the Jedi and the justification for their eradication. He has scapegoated them, and in that zeal of victory he makes the people fear the all-powerful menace of the Jedi. So, with a promise of a greater future, greater protections, and the rebirth of the galaxy, he proclaims the formation of the Galactic Empire. Yet that Empire had been growing all along. Radicalization comes when the populace fully accepts the myth of the fascist state. The Galactic Empire radicalized many of its citizens, but it also forced many more into complacent acceptance of the new status quo. Palpatine accomplished this through a slow series of changes that resulted in a larger radical shift in galactic politics. It was done through cunning and patience, but also fear.

Fear Will Keep the System in Line
This brings us to the second part of fascism: excluding people and scapegoating others for your troubles. In Germany those troubles were heavy economic woes brought about by the Great Depression and the harsh penalties inflicted on the German people after WWI. Jews were the easy target for blame and resentment. They were culturally apart, but they also tended to be fairly affluent. In a way they were like the Jedi: different and enviable. Palpatine blamed the Jedi at the formation of the Galactic Empire and used them as another threat to help unify the people’s hatred. He also “dehumanized” non-humans -such as Wookiees- who became slave laborers and second-class citizens. The human population of the Empire was given a place of high regard. They were special, the “right” type of people. This is a page from the fascist playbook.

The Nazis also blamed the international community. Hitler and his ilk isolated Germany from the world and transformed it into a militaristic society. At its core, a fascist state is a cult of personality, often centered around a populist nationalism that embraces a rebirth myth. Fascism feeds on people’s fears, but also on their anger. Fascists often walk an odd line between playing the victim and being the bully. It is a philosophy meant to convince the people that they have been unfairly treated by their enemies, and that only the strongman, the great leader, is capable of saving them. That means the people can trust no one but him. This is often an appealing lie that removes blame and shifts responsibility.

People were drawn to the Nazi cause because it offered the German people an alternate explanation for their woes. It wasn’t their fault they were poor. It was an international Jewish conspiracy that kept the German people down. In reality, the German people were the master race, the “right” people. Hitler gave them an attractive myth to latch on to. The Third Reich wasn’t conquering the world; it was retaking it and placing the Aryan race back on top where it belonged. Hitler was making Germany Great Again.

That is the appeal of fascism. It is seen as a return to some imagined past, or some fictional right of heritage. We like to think that we -in the United States- are somehow immune to this phenomenon. We look around us and say, “We’re not in danger of becoming a fascist country, because no one is out there hanging swastikas and iron crosses. No one is talking about rounding up the Jewish people. No one is goose-stepping or doing the Hitler salute in the streets…” but that’s not fascism, that’s Nazism. Italian fascism had its own trappings, which were different from those of its German counterpart. The most effective symbols of fascism in any country are the familiar ones, the ones that can be twisted to mean something new, because those symbols will resonate with the people of the country. In this country our fascist symbols could take the form of an ultra-obsession with the flag, or a near-fanatical devotion to the National Anthem. They could be symbols as simple as a red hat, or polo shirts, or monuments to slavery… and that’s the point. If you think it cannot happen here, then you would be wrong, because it very nearly did once before.

Not So Long Ago in a Place Not So Far Far Away
The United States was never immune to the dark side of fascism. In July of 1942, a Gallup poll showed that 1 in 6 Americans thought Hitler was “doing the right thing” to the Jews, and a 1940 poll found that nearly 20% of Americans saw Jews as a national “menace,” more than any other group in the US, including Germans. One third of Americans believed that there would be “a widespread campaign against the Jews,” and 12% of Americans were willing to support it. The German-American Bund held a rally at Madison Square Garden in New York in 1939, to a crowd of 20,000, complete with swastikas, American flags, and goose-stepping. Their leader attacked President Roosevelt, calling him “Frank D. Rosenfeld” and referring to the New Deal as the “Jew Deal.” Then there was also William Dudley Pelley, a radical journalist from Massachusetts.

He founded the Silver Legion of America, or the Silver Shirts, a truly American fascist movement that ticked off almost every single box of American fascism. He opposed Roosevelt, believed in isolationism, ran for president, and was ultimately arrested for sedition. He also believed in UFOs and the sort of spiritualism that lets you travel to other planes of existence. In essence, he was a laughable figure that no one could take seriously… except that people did, and people said the same thing about Hitler right before he took power. Fascist movements hardly ever start as mainstream. They are fringe groups: laughingstocks, failed painters, or even reality TV stars who are thrust into power and bolstered by conservatives as a quick way to gain power and/or oppose a rising liberalism.

In the 1930s and ’40s American fascism began to grow for a few main reasons: fear of communism, the economic depression, a distrust and fear of the Jews, and a reaction to Roosevelt and his New Deal, which many perceived as dangerous socialism. So in 2018, is it so hard to believe that fascism can grow again? People don’t fear communism anymore, but they do fear the effects of globalism. Maybe there is no Great Depression, but you do have a lot of downtrodden people who feel forgotten by the government and the mega-wealthy of our time. Also, a modern-day fascism is not going to target the Jewish people as its scapegoat. It will target more vulnerable people, like immigrants and Muslims. It is worth noting that Hitler was praised by Christians, like American pastor and presidential advisor Frank Buchman, who said in 1936, “I thank heaven for a man like Adolf Hitler… who built a front line of defense against the anti-Christ of communism.” Ultimately, modern-day American fascism will be a reaction to progressive policies and politicians, such as Obama and policies like Obamacare. Yet, most importantly, they will not call it fascism or Nazism. They will call it something new… like Trumpism.

Yes, American fascism is already here. We are not Germany in 1941, but what if we are Germany in 1928, or Germany in 1932? What if we are the Galactic Republic right before the Clone Wars? We have a misguided belief that our institutions, our checks and balances, will save us, but according to Robert Paxton they have already failed us. The Third Reich didn’t spring up overnight. Hitler came to power -without being popularly elected- because the Weimar Republic was plagued with corruption and ineffectiveness. Its institutions failed long before Hitler became chancellor. The Old Republic Senate was ineffective and no longer represented the will of the people, even as one of its own worlds was being starved and occupied by an army. It failed long before Palpatine came to power.

That’s No Moon… It’s a Dictator
If fascism is taking hold in America -and it is- then our systems have already failed us. We have an electorate where the overwhelming majority of people support issues like gun control and healthcare, and yet our politicians do not act or make laws in accordance with that majority. Instead they act with lobbyists, and we all just take it for granted. We laugh it off and say, “well, you know politicians.” That’s the same complacency that has allowed fascist states to flourish, both on this planet and others. Donald Trump was not popularly elected. He was a joke at first, but it’s not funny anymore. He has the full support of the American Nazi Party and the KKK. According to a recent study by George Washington University, over the last five years white nationalist and neo-fascist movements in the US have grown by 600% on Twitter, outperforming ISIS in number of followers and in number of tweets.

This is where we would usually end this article with some cutesy and corny Star Wars metaphor or bad joke, but we don’t want to. You may laugh off this article or the things we are warning about. You may say we are being paranoid or hysterical, but remember this: the majority of German citizens in the 1930s were not part of the Nazi party. They were just ordinary people who got swept up. They were as smart and as real as you are right now. The truth is that if you -the person reading this- had lived during that time, there is a good chance you would have gotten swept up too, assuming you were the “white”… err, “right” type of people. There is a good chance that you would have been sent off to die for the glory of the Fatherland and the Aryan race, and truly believed it was justified. That is Stage 5 Fascism. That is Radicalization. America is not there yet, but that does not mean we are not at one of the lower stages.

The real question is which one?

America First

The Orange in Chief continues to use a phrase with questionable historic significance: “America First.” He has used it in terms of national security, business, and economics, multiple times both on the campaign trail and while in office. For people of the modern age -and maybe even for Trump at his advanced age of 71- the phrase seems innocuous at first, maybe one part oblivious and one part obnoxious American, but standard enough. The problem is that it is not. The slogan “America First” has a long and problematic history, and to many in our great-grandparents’ generation it would have sounded like a dog whistle going off in the dead of night. Let us explain…

The First America First Committee
The America First Committee may sound like a club that promotes making sure the United States is always listed as “America” in online forms, because they are sick of scrolling all the way down to the U’s… but it’s not. The AFC was founded in 1940 and featured several high-profile members, including Henry Ford, Charles Lindbergh, and future US Supreme Court Justice and Boy-Who-Lived, Potter Stewart. Moreover, the committee hosted prominent politicians, businessmen, and celebrities in over 450 chapters across the USA. Some of its funding came from members’ dues, but the bulk came from private investors such as the manufacturer William H. Regnery, the founders of Sears-Roebuck, and the Chicago Tribune. Now all of this sounds well and good except for one small issue…

The America First Committee was… well… they were kind of pro-Germany, or at the very least anti-Roosevelt. In public they believed that America should:
A) Not Get Involved in World War II;
B) Stop Sending Supplies to England and France; and
C) Close and Defend the American Border Against Everybody… especially Jews.

Yeah… Now, to be fair, it is not like the America First Committee set policy on Jewish refugees leading up to the Second World War, and 83% of Americans at the time were against accepting refugees. However, Charles Lindbergh, the group’s most visible personality, gave a series of lectures and rallies with the intent of keeping America from helping the British beat back the Nazis. On September 11, 1941 -less than 3 months before Pearl Harbor- Lindbergh gave a speech in Des Moines, Iowa, suggesting that the Jewish communities advocating for the war were un-American. He continued, saying that Jewish “ownership and influence in our motion pictures, our press, our radio and our government” was the “greatest danger” to America. The speech was labeled anti-Semitic, and Lindbergh himself was labeled “pro-Nazi,” which may not have been entirely untrue.

You see, Charles Lindbergh was very beloved in Nazi Germany, as he was an international aviation hero, and it didn’t hurt that he was blond, blue-eyed, and white. The Nazis even let him get a sneak peek at their state-of-the-art air force, the Luftwaffe, though in fairness Lindbergh did report what he saw to the US government. Unfortunately, he also showed a great admiration for Germany and the white race in general. He once said, “We can have peace and security only so long as we band together to preserve that most priceless possession, our inheritance of European blood, only so long as we guard ourselves against attack by foreign armies and dilution by foreign races.” Even after America entered the war, FDR was convinced that Lindbergh was a Nazi and personally barred him from serving in the war, and Old Lindy wasn’t alone on that list.

Avery Brundage, the former chairman of the U.S. Olympic Committee and a prominent member of the AFC, was notoriously anti-Semitic. He had even prevented two Jewish American runners from participating in the Berlin Olympics. Aviator Laura Ingalls, another public face of the AFC, was arrested by the FBI about a week after the Pearl Harbor attack and convicted of being a Nazi agent. Henry Ford was similarly problematic. His anti-Semitic views were a matter of public record. He wrote a book titled The International Jew: The World’s Foremost Problem, which was a bestseller in Nazi Germany. He hated immigrants, but especially Jews. He blamed Jewish bankers for everything from the Great War to the Great Depression. He was also uncomfortably pro-Nazi. The Ford subsidiaries in Germany were very profitable, and Ford had a great love for German efficiency and manufacturing. Ford was even mentioned in Mein Kampf and received the Grand Cross of the German Eagle from Nazi officials in 1938. It was the highest honor given to any foreigner and represented Adolf Hitler’s personal admiration. Even after war broke out in Europe, Ford kept his relationship with Germany alive and kept his factories there running, even after they stopped making cars and started making tanks and bombs. It is also almost certain that Ford was aware, in one fashion or another, that his German factories were being worked by Jewish slaves.

Realistically, the America First Committee may not have been a publicly anti-Jewish, pro-Nazi organization, but a lot of its more prominent members definitely were.

Make America First Again
Though we are aware that Donald Trump is not a student of history… or science… or grammar… or business… or really anything useful, someone around him probably is. There should have been someone there to tell him about the uncomfortable correlations between racism, Nazism, and “America First,” and yet he still continues to use the phrase. Of course, the real problem is that there are people around him who do get the connections, and who like them. Steve Bannon and Stephen Miller are given a lot of credit for reviving this particular slogan, and we would bet dollars to racist-donuts that at least one of them is aware of the history… Not that Donald Trump is a stranger to discrimination-first policies, himself.

In 1973, Donald Trump and his father were accused of violating the Fair Housing Act by discriminating against minority renters. The case was quietly settled in 1975, without the Trumps having to admit any wrongdoing. However, in 1978 the Department of Justice renewed its case against the Trumps for housing discrimination, saying that they had violated their agreement. The Donald was accused of the same thing in 1983 by the New York Times, and was penalized in 1992 for removing an African American dealer from one of his casinos at the request of a wealthy client. In 1989, Donald Trump bought full-page newspaper advertisements advocating the death penalty for the Central Park Five, five minority teenagers accused of assaulting a jogger in Central Park. They were later exonerated thanks to DNA evidence. He insulted Native Americans in a Congressional hearing in 1993, and in 2000 he funded a series of anti-Native American advertisements that included images of syringes and cocaine. In 2011, he started the “Birtherism” movement, a campaign to discredit President Obama’s citizenship and Presidency. And this was all before he was a candidate and called Mexicans “rapists,” offered an insinuated defense of people marching at a Nazi/KKK rally, tried to ban entry by Muslims into the country -on two separate occasions- and all the rest of his unending list of racial injustices, discriminations, and insults hurled at minorities and others since he rode down that golden escalator… but he likes people from Norway.

It is also worth noting that Donald Trump’s father, Fred Trump, was arrested at a KKK rally in 1927. So maybe the America First apple doesn’t fall too far from the AFC tree. Regardless of whether or not Trump understands the racial undertones of his America First slogan, he certainly embodies them, both in office and as a private citizen. In fact, with his rhetoric and his isolationist and racial policies, we would say he takes the dream of the AFC one step further. At least the America First Committee had enough good sense to publicly cut ties with its more anti-Semitic members -and that one woman who turned out to be an actual Nazi spy- but Donald Trump is both public and proud about his tweeted policies. So regardless of whatever dimwitted defense his son tries to come up with, we all need to be very clear on one thing: Donald Trump is a racist.


To many there is a certain romance surrounding the Confederacy. Maybe it has to do with our affinity for the underdog, or the American rebel spirit? Maybe it is rooted in some notion that the antebellum South somehow represents a simpler and more genteel time? And maybe to those people, the statues of Southern generals and statesmen are historic relics from a hundred years ago that stand as reminders of the Myth of the Lost Cause. However, not everyone has that feeling when they gaze upon Confederate statues, nor are the statues as old as you may think. To many, they represent a shameful time in our nation’s history. They whitewash the motives of the South and memorialize a rebel country that literally went to war with the United States. And to many more people around this country, these statues represent something else entirely: ever-growing symbols of fear, racism, and hatred.

That Belongs in a Museum
According to the Southern Poverty Law Center, there are roughly 700 Confederate statues in the United States, spread over 31 states, which -if you’re counting- is 20 more than the 11 states that actually seceded from the Union during the Civil War. There are even Confederate statues in Washington DC, which would probably leave Abraham Lincoln scratching his head… or at least it would have if a bullet hadn’t passed through that very same head, fired from the gun of a Confederate sympathizer. In total, there are estimated to be over 1,500 Confederate symbols across America, including highways, schools, and parks. Now regardless of your opinion on the subject, you have to -at least- acknowledge that so many Confederate statues and symbols erected around the USA is a little odd, especially in states that weren’t even part of the Confederacy. There has to be something more going on than just simple historical remembrance, right?

Let’s first examine another symbol of the Confederacy that often comes under fire: the Confederate flag -which wasn’t even the actual flag of the Confederacy, but that’s for another article- In 1956, Georgia redesigned its state flag to include the Confederate battle flag, and in 1962 South Carolina began flying the Confederate battle flag over its state capitol. The thing that really strikes us is that those years seem a lot more recent than we expected. 1956 was the year of Elvis Presley and Marilyn Monroe. 1962 was the year of John Glenn and the Cuban Missile Crisis. Those things are history, but they are not ancient history. They are not 1865 Civil War history. So maybe -just maybe- it is also no coincidence that the Civil Rights Movement began in 1954, and by 1962 that movement was in full swing. In fact, Lyndon Johnson was only two years away from signing the Civil Rights Act of 1964, and there was a lot of backlash against the idea. Maybe even enough backlash to change a state flag or two. There is no way around it: Confederate statues and symbols -erected mostly in the 20th century- were created as direct responses to periods of intense racial strife.

We place statues in open spaces and public areas so they can be revered and looked upon. So, what else are people supposed to think when they gaze upon statues of Robert E. Lee, or Stonewall Jackson, or even Jefferson Davis standing tall and heroic in public squares? And yet, these are men who actively and violently rebelled against the United States, and all so they could keep the enslavement of other humans as a legal institution. In any other context, do those sound like men deserving of reverence?

Some people like to say that these monuments are about remembering our history, but there are appropriate places to do that. They are called museums, and they work pretty damn well, because they help place events and people within the historical contexts of their times. Heroically posed statues don’t do that. In fact, they tend to do the opposite. There is a reason why Germany doesn’t have powerfully posed monuments of SS officers, or highways named for Himmler or Goebbels. So we have to ask ourselves, “What were the real reasons behind creating these monuments?” Statues do not help you remember history. They help you glorify it, and in this case, that history is the cause and the leaders of the Confederacy. People like to say that it’s “heritage not hatred,” but when you look at Confederate statues and symbols, the simple truth is that they are part of a “heritage of hatred.”

KKK, Why’d It Have to Be KKK
After the end of the Civil War, the people of the South did erect some memorials to the Confederacy, but these were small markers of personal remembrance, meant to honor soldiers who had died fighting. It’s the sort of thing a widow would understandably create to remember her lost husband. There were no heroic monuments or statues created to honor the Lost Cause. Even the famous slave owner Robert E. Lee wrote in 1869, “I think it wiser not to keep open the sores of war but to follow the examples of those nations who endeavored to obliterate the marks of civil strife, to commit to oblivion the feelings engendered.” The former commander of the Confederate forces knew that such symbols would only serve to divide a nation that was still trying to heal from the Civil War. Ultimately, he would likely view our monuments today as a way of keeping open an old wound and causing further civil unrest. So how did these contentious statues come to be?

North Carolina erected only 30 Confederate memorials between 1865 and 1890. Then, between 1890 and 1940, it erected 130 more. In fact, Confederate statue construction surged during the early part of the 20th century, with the majority being erected between 1900 and 1920. It is also worth pointing out that this surge coincided with the implementation of Jim Crow laws and the biggest revival of the Ku Klux Klan in American history, which boasted almost 4 million members by 1925. Not coincidentally, this trend began almost immediately after the 1896 Plessy v. Ferguson case, in which a black man dared to sit in a whites-only train car. The decision upheld in that case began the practice of “separate but equal,” and following it, Confederate statues began to be mass-produced on the cheap. -Sometimes even looking suspiciously like mass-produced Union memorials- Any town or county that couldn’t afford a statue could ask for help from the United Daughters of the Confederacy.

The UDC was an organization of Southern women who fought to “preserve” the memory of the South’s Lost Cause Myth. Founded in 1894, they were instrumental in raising funds to erect many of the early 20th century statues found around Southern states. By World War I their membership had reached over 100,000, and their influence was greater still. The UDC often promoted textbooks in public schools, like Susan Pendleton Lee’s 1895 textbook, A School History of the United States, which sought to romanticize the antebellum South, casting it as an idyllic place of manners and gentry. In their revised histories, the Civil War was fought for states’ rights, not to preserve the institution of slavery. Their textbooks also often proclaimed that “the evils connected with [slavery] were less than those of any other system of labor.” They took a very typical stance of the time: that the institution of slavery had Christianized and civilized the “African savage.” They also argued that the Ku Klux Klan was necessary “for protection against… outrages committed by misguided negroes.” This sort of racist whitewashing of history and defamation of African Americans is the same impulse that moved them and others to erect so many Confederate statues around our country.

Eventually, the mad rush to create Confederate monuments tapered off during the Second World War, but that was not the end of it. A new craze of erecting Confederate statues began in the late 1950s and peaked in the 1960s. As you might guess, this coincided very closely with the Civil Rights Movement. Once again these symbols were used to remind “uppity” black Americans of their supposed place in the racial hierarchy. These statues were powerful symbols for white people, in both the South and the North, meant to remind those marching in places like Selma where they came from and who was still in charge.

This… This is History?
History does not happen in a vacuum. Both of these spikes in Confederate remembrance coincided with times of great racial strife in this country. Yet, it didn’t stop in those turbulent times. Racism was not magically solved at some nebulous point in the past, and even today monuments are still being erected to the Lost Cause Myth of the South. Iowa dedicated 3 Confederate monuments, all after 2000. Across the country, 32 Confederate memorials were erected in the past 17 years. So, why are we still building memorials to a war that ended over 150 years ago? Why is the North building these monuments at all? Unfortunately, we know the answer to that question, and it has nothing to do with the war, or even the past.

If we really want to “honor” our history, then we need to start by admitting what it truly is, both the history of the war and the monuments we have erected since. Confederate statues are nothing more than propagandist symbols erected to remind the descendants of freed slaves who is still in charge. They are objects of power and terror created during times of racial strife. They try to create a romantic and noble image of a war that was fought over the bondage of human beings. That’s not a heroic symbol of the past. In fact, it’s a not-so-subtle form of terrorism. If we want to remember history, we should do it in museums or in classrooms. We should strive to learn from the past, not to create symbols that will only serve to repeat it.

Whatever the Civil War once was, whatever you believe it stood for, you need to know these statues represent something else. They are not about remembering old battles or even honoring old generals. They are reminders of racial superiority. These statues’ only aim is to rewrite a shameful time in our nation’s past, and to let any who gaze upon the stony visages of Lee, Jackson, or Davis know who is still in charge… even after they lost the war.

Resolute Desk

It’s the Fourth of July, and while many of you may be thinking about fireworks, or barbecues, or Bill Pullman speeches, we here at The NYRD have been doing a lot of thinking about desks. -Yes, we are the kind of people who spend our vacations thinking about office furniture- However, we’re not talking about just ANY desk; we’re talking about the Resolute Desk. It is a desk you know and admire, even if you have never heard the name before or seen a terrible Nicolas Cage movie. So, sit right back, grab a hot dog, light a sparkler, and let us learn you some history.

A Three Hour Tour
Our story begins with the HMS Resolute, a British arctic ship in service to Queen Victoria in the 19th century. We know… stay with us. The Resolute was dispatched as part of a five-ship expedition in April 1852 to search for Sir John Franklin. You see, Franklin had left Britain in 1845 looking for the fabled -and non-existent- Northwest Passage through the Canadian Arctic, and after 8 years his cable bills had started piling up. So the British Royal Navy decided that maybe it was time to go looking for him. The expedition was met with mixed results, in that they never found Franklin, and the commander, Edward Belcher, abandoned four of the five ships. The Queen wasn’t too happy about that last part, so Belcher was court-martialed -though acquitted- and never received another naval commission.

Our story doesn’t end there. The HMS Resolute continued to drift in the ice where she was encased. In 1855, she was discovered by an American whaler named James Buddington and the crew of his whaling ship, the George Henry. -Now, whaling is not necessarily a worthy profession, but we’ll look the other way on this one- Buddington and his crew freed the Resolute -because: free ship- and sailed her back to Connecticut. Upon learning of the Resolute’s recovery and return to non-frozen land, the British government called the whole thing a wash and relinquished all rights and ownership over the vessel.

In the early-to-middle 19th century, America and Britain had a tense relationship. Even after the War of 1812 -where they sort-of, kind-of burned down our White House- tensions continued to flare for the next several decades. The British were never crazy about the Monroe Doctrine. We also inadvertently got tangled up in a Canadian rebellion, there was some dispute over borders in Maine, more disputes over borders in Oregon, and assorted other problems that made relations between the two Atlantic countries a bit problematic. It was so bad that Congress even talked about going to war with the United Kingdom… again. However, that all changed with the Resolute. Congress voted to set aside $40,000 to refit the Resolute and sail her back to London. In 1856, a United States Navy captain presented the refurbished ship to Queen Victoria as a gesture of peace and a way to ease tension between the two countries. The gesture worked… at least for a little while, and the relationship between the UK and the US improved.

A Desk Made for a President
Lady Franklin -Sir John Franklin’s wife- wanted to use the Resolute to go searching for her husband, who at that point had been missing for 11 years. Queen Victoria refused, as he was almost certainly dead, and she feared that any damage done to the Resolute would also damage American-British relations. So the HMS Resolute was kept in safe harbor until she was broken up in 1879. From her timbers, Queen Victoria had four desks made: a lady’s desk, two small writing tables, and the Resolute Desk. The first three desks have found their way to museums, and the Resolute Desk was presented by Victoria to President Rutherford B. Hayes in 1880. -Little known fact: The B is for Bigalow-

The Resolute Desk was seen as another gesture of peace, and as such the desk has been used by every President since 1880, except Johnson, Nixon, and Ford. It started its tenure in the second-floor office of the White House, and when the Presidential offices moved to the newly constructed West Wing in 1902 -back when it still had that new Martin Sheen smell to it- the desk remained in the second-floor residence as a personal desk for use by Presidents. In 1952, Truman moved the desk to the new broadcast room, which is kind of like an ancient form of a podcast room. He used it when making radio and television broadcasts. When President Kennedy was elected, he moved the desk into the Oval Office in 1961. After he “left” office, Johnson selected another desk to use in the Oval, and the Resolute Desk spent a year in a Kennedy Library traveling show before being moved to the Smithsonian in 1966. That is where it would have stayed if President Carter hadn’t requested it be placed back in the Oval Office in 1977. Every modern President, except for George H. W. Bush, has since used the Resolute in the Oval Office.

The desk has been modified twice in its lifetime. The first modification was requested by Franklin Delano Roosevelt. He wanted a privacy panel created for the front, because he was a little sensitive about people seeing his legs… for some reason. Unfortunately, he did not live long enough to see the modification completed and had to make do with putting a trash can in front of the desk to hide his lower regions from view. The panel that was put on the front of the desk was carved with the Presidential Seal of the time, which means it is one of the last existing seals to show the eagle’s head pointing toward the gripped arrows instead of the olive branch, indicating war over peace. It was Truman who changed the seal, moving the eagle’s head to point at the olive branch at the end of World War II. The second modification was a base created to raise the desk up… because people nowadays are taller.

The inscription on the desk’s plaque reads: H.M.S. ‘Resolute’, forming part of the expedition sent in search of Sir John Franklin in 1852, was abandoned in Latitude 74º 41′ N. Longitude 101º 22′ W. on 15th May 1854. She was discovered and extricated in September 1855, in Latitude 67º N. by Captain Buddington of the United States Whaler ‘George Henry’. The ship was purchased, fitted out and sent to England, as a gift to Her Majesty Queen Victoria by the President and People of the United States, as a token of goodwill & friendship. This table was made from her timbers when she was broken up, and is presented by the Queen of Great Britain & Ireland, to the President of the United States, as a memorial of the courtesy and loving kindness which dictated the offer of the gift of the ‘Resolute’.

It is also worth noting that, despite what conspiracy websites and terrible Nicolas Cage movies tell you, there is no twin Resolute Desk sitting in Buckingham Palace or any other British royal residence.

Legacy of the Resolute
The HMS Resolute is a legacy of peace. The ship itself was given as a gift of peace, and the Resolute Desk was then given as another gift of peace. Since its arrival in the US, the Resolute Desk has become a symbol synonymous with the power and office of the President, and that is important. It is a symbol of peace -despite the seal on it indicating war- and it should be a reminder to anyone who sits behind it that the United States always works better when it is making allies instead of enemies. -Also, it once made for that cute JFK picture, but we’re getting off track- That symbolism of peaceful relations is something worth remembering, and it should be something that every occupant of that desk takes into consideration.

After all, the President of the United States does not sit on a throne. The symbol of his power is not a giant golden chair located in the center of some great hall where he issues decrees and proclamations with absolute authority. He is not a king. No, this symbol of the American Presidency is a desk, an object meant for work. A desk is a place where people sit, and write, and read, and make compromises. It is a place that a dedicated professional uses to accomplish an important task. It is the place of an American President, not the throne of an American King, and that is something we should all keep in mind as we celebrate this patriotic holiday.

Civil War

“Why was there the Civil War? Why couldn’t that one have been worked out?”

This question has never been asked before. It took the biggest best genius and the most successful President of ever to actually think to ask such a meaningful and poignant question. But, of course, it makes sense that Donald Trump would be the one astute enough to ask the hard questions of history, questions that college professors, historians, and third graders would never think to ask. So we, here at The NYRD, will attempt to do our civic duty and be the Google that Donald Trump doesn’t seem to have. So, why did we “Civil War?”

Slavery. It’s Slavery.
Case closed, right? It was slavery. The Civil War was fought over slavery. Any idiot not sitting in an oval-shaped office in some whitish building would know that, right? The quick answer is: “Yes,” and the longer answer is: “Yes, but…”

Slavery was certainly the main catalyst, despite what Confederate reenactors may tell you. In fact, South Carolina’s Declaration of Secession mentions slaves or slavery 18 times in one form or another. In contrast, it only mentions the word “Union” or the words “United States” 16 times combined. A lot of South Carolina’s grievance can be distilled down to this line: “But an increasing hostility on the part of the non-slaveholding States to the institution of slavery, has led to a disregard of their obligations, and the laws of the General Government have ceased to effect the objects of the Constitution.” Mississippi wrote: “A blow at slavery is a blow at commerce and civilization.” When Texas seceded, they wrote: “That the servitude of the African race, as existing in these States, is mutually beneficial to both bond and free, and is abundantly authorized and justified by the experience of mankind, and the revealed will of the Almighty Creator, as recognized by all Christian nations…” Which is pretty screwed up.

As you can see, Mr. Trump, it took some hard investigative digging, but we were able to come up with a passable answer: slavery. -If anyone asks you in the future, just say slavery- However, we do need to acknowledge that the Civil War was not just an open-and-shut case of the North saying, “Slavery is bad,” and the South going, “God doesn’t care.” As the quotes above suggest, there were other factors at play: states’ rights versus federal power, economics, and even religion. So, this leads us into that “Yes, but…” territory we were talking about earlier.

It’s Still Slavery, But…
On the eve of the Civil War, some 4 million African Americans were enslaved in the Southern States. This represented a significant economic factor for Southern elites and their plantations. The ruling class of the South needed to keep slavery in order to keep their wealth. Meanwhile, Northern States had abolished slavery one by one, mostly because it was no longer financially reasonable. Industrialization and a wave of immigrants -specifically from Ireland and Germany- made slavery obsolete in the North. Abolitionist movements grew in the cities and urban centers, and publications like Uncle Tom’s Cabin and court battles like the Dred Scott case further influenced the politics of the region. All of this drove a division between Northern industrial states and Southern slave states.

As America grew, Southern States and their congressional representatives wanted to expand slavery into the new western territories, which Northern States opposed. It is worth mentioning that this was not so much about morality as it was about political power. Creating more slave states would tip the balance in Congress toward the South, but creating more free states would tip the balance toward the North. Slavery became the lightning rod of American politics, and by the mid-1800s everyone was forced to pick a side. There were attempts at reconciliation, like the Missouri Compromise, where Missouri was allowed entry into the US as a slave state only when Maine was admitted as a free state, but they often didn’t last long. Ultimately, the question of slavery was irreconcilable. As Abraham Lincoln said, “I believe this Government cannot endure, permanently half slave and half free.”

Lincoln was an anti-slavery northerner. Our first Republican President –Most people don’t even know he was a Republican. Does anyone know? Lot of people don’t know that– won without a single southern electoral vote, and once in office he made it clear that the institution of slavery would not be allowed in the western territories. On top of that, many Northern States and abolitionists disregarded laws like the Fugitive Slave Law and others that were passed as further compromises to keep balance between the North and the South. States like South Carolina saw all of this as a violation of their states’ rights. Yet, we need to acknowledge that at the core of these issues was slavery. It was a financial, class-based, and even religion-based institution that was ingrained in the antebellum culture of the Southern States.

Think of the Civil War like a McDonald’s hamburger. First of all, we didn’t really want it, and it wasn’t really good for anyone’s health. -The Civil War had a terrible calorie count- Yet, in the end, it doesn’t really matter what sort of bread or other dressing we try to apply to hide the real meat of the issue. Slavery, the meat-byproduct patty, is always at the center of it. It’s the main reason we buy a hamburger, and slavery is the main reason we had the Civil War. Without that cause, it would have just been a lot of salad and State-Rights-Mystery-Sauce.

Jacksonian Promises
In a lot of ways, the tensions of the 1800s mirror today’s tensions. Today we don’t argue over slavery, but over other deeply divisive issues -one of which is still race relations. Our country is split, not along northern and southern borders, but along class, gender, and economic lines. Today’s “Civil War” is more about the city versus the country, or the coasts versus the center, or Progressives versus Conservatives. Mr. Trump, it is a little ironic that you so often paint yourself as a Jacksonian figure. -Most people don’t even know he was a Democrat. Does anyone know? Lot of people don’t know that- Jackson was an embattled and impatient President who rode a populist wave into the White House, and wound up hurting the American economy and hardening relations between whites and other races, specifically Native Americans… Hmm.

Yet, like Lincoln, you find yourself faced with a divided nation, and since you like to equate yourself to Jackson, it is worth wondering how you would have negotiated away our Civil War. What would you have done if you had been in power instead of Abraham Lincoln? Would you have let the South keep their slaves? Would you have allowed 4 million human beings, and their descendants, to remain in bondage just to keep a tense peace? Would you have allowed slavery to expand into the West? Would you have let the Southern States secede? Would the United States of America, today, be bordered on its south by the Confederate States of America? What would the world and our nation look like had you been at the helm?

You claim Andrew Jackson -and by proxy, yourself- would have handled the Civil War better, but how? Jackson was a man who pledged support and troops to Aaron Burr when Burr tried to build his own empire. He invaded Spanish territory without authorization and nearly started a war between Spain and the United States. He started a banking war that strangled American business interests. He was a proud slave owner, and even placed advertisements for the capture of a runaway slave. He forcefully removed 125,000 Native Americans from Georgia, Tennessee, Alabama, North Carolina, and Florida, forcing them to march the Trail of Tears in defiance of a Supreme Court ruling. He was responsible for the largest holocaust in American history. Is that who you wanted in charge in our nation’s darkest hour? Is that who you claim to be, Mr. Trump?

Anyone who does not know history is doomed to repeat it, and if there is one thing you have made abundantly clear, it is that with you in charge we might be doomed to repeat a lot of it.


Words matter. We’re not just saying that because we’re eloquent writers… and stuff. No, we’re saying it because: words matter. Language is a social contract that exists between all of us and with that agreement comes a certain amount of trust. We judge people based upon how they speak. We tell and read stories to entertain and inform. We trust language as something firm in our lives, and we also tend to believe the things we see written in headlines, in the news, and by our own government. That is why ideas of “Alternative Facts” can be so scary, and why propaganda has been so effective throughout our history.

Nobody in History Ever Lied
The origin of the word “propaganda” can appropriately be traced back to the Catholic Church. The Sacra Congregatio de Propaganda Fide, or Sacred Congregation for Propagating the Faith, was established in 1622. Its objectives were to convert pagans and propagate the Catholic faith. From that came the word we know today, but it is worth mentioning that the word itself did not take on a sinister meaning until later in history. It is also worth mentioning that just because the term was coined in 1622, that does not mean the art of deception and promotion did not exist before that time.

For instance, Julius Caesar wrote and published the Bellum Gallicum between 58 BCE and 49 BCE. These were Caesar’s own firsthand accounts -told in the third person- of his many victories in the Gallic Wars. It is very likely that many of his writings suffered from at least some embellishment, as the real purpose of the documents was to influence and win favor with the common people of Rome. Caesar knew that if he could get the commoners to love him, then the Senate could do nothing against him, especially when he eventually marched into Rome at the head of an army and was declared dictator. You see, leaders -whether on Twitter or by other means- have been exaggerating their accomplishments throughout history, not just for narcissistic reasons but as a tool to control others.

That is worth remembering when someone like Donald Trump plants people in the audience of his news conferences with the explicit purpose of laughing and applauding on cue. That is worth remembering when someone like Donald Trump inflates his own importance and victories on social media. That is worth remembering when someone like Donald Trump refuses to believe, or even acknowledge, the existence of his own words and failures. That is worth remembering when someone like Donald Trump believes that the rules do not apply to him. Propaganda is the art of making the opinion of the powerful the reality for all, and that is worth remembering too.

There Has Never Been Any Propaganda in War
The tactics of propaganda are not necessary when despotism reigns. In a time before democracy, dictators and supreme leaders did not need to convince their people to do anything. The people did it because they had no choice. However, in a place like Ancient Athens, propaganda became a way of life. The citizenry was actively engaged in the game of politics and conscious of its own interests. That meant that in order to sway the strong-minded citizenry, the Greeks used everything from games, to the theater, to the assembly, to religious festivals to extol their ideas or denigrate those of their opposition. These machinations and subtle manipulations became a simple way of life, and they have continued into modern democracies ever since.

Propaganda has become even more important in today’s world, where democracy is now the dominant form of government. The most obvious examples come in times of war. During World War I the US Government created the Committee on Public Information, to keep the American people committed to fighting the Kaiser and his “evil” German hordes. During the Great War the CPI first used facts, but quickly started embellishing those facts to make American victories seem more impressive. They fed stories to the newspapers, such as one claiming that “the First Division to Europe sank several German submarines.” The story was easily proven false when reporters interviewed the commanding officers of the division and learned they had not even encountered any German submarines. The CPI used newspapers, posters, and new technologies like radio, telegraphs, and even the movies to promote their pro-war agenda. They had scores of “four minute men” who were trained to go to social gatherings and talk favorably about the war in conversation. However, this heavy-handed campaign backfired, and the American public became highly critical of the obvious propaganda tactics of the organization. It was about this time that the word propaganda also came to have negative and sinister undertones.

During the Second World War, many Americans came to associate the term with fascist regimes, such as Nazi Germany and Imperial Japan. So, in World War II the government instead subsidized the Writers’ War Board. It was an independent agency that expressly promoted government policies through art, literature, and even comic books. The WWB made movies with big celebrities, sold war bonds, and created pro-American posters to support the war effort. Officially, the US Government took the stance of having no propaganda, but the civilian-led WWB has been called “the greatest propaganda machine of all time.” It was so good that -as a nation- we still internalize a lot of World War II through the images and ideas that were first created by the WWB. In essence their propaganda became part of our history and our culture.

However, even after the war, concepts of propaganda could still be found in most civilian life. If you are a child of the 80’s or 90’s -as we are- you may remember the Ad Council, and it might surprise you -as it did us- that it was established in 1942 to drum up support for the war. The Ad Council then moved on to domestic issues for the government and private agencies. They became a form of domestic propaganda, working on everything from those famous “War on Drugs” commercials to their “Campaign for Freedom,” which promoted the War on Terrorism. However, domestic propaganda does not only come from the government. It also comes from companies trying to sell their products, or even news agencies trying to boost their ratings. In 1898, we even invaded Cuba -in part- because William Randolph Hearst and Joseph Pulitzer sensationalized the sinking of the USS Maine to sell papers. In some ways, their actions were no different than Fox News declaring that there is a “War on Christmas” -which, considering that Christmas advertisements start going up on October 15, there clearly is not.

Words Don’t Matter
Everything we have mentioned so far has been pretty obvious propaganda. Most people would look at the examples of World War II, the sinking of the Maine, commercial advertisements, or even Reagan’s War on Drugs, and probably agree with us. However, it is the subtle manipulations that tend to be the most effective and most sinister. Word usage and word choice can change your brain. Words like peace and love can strengthen our frontal lobes and promote cognitive functioning, while negative words can activate our fear centers and produce stress hormones. So labeling something a “War on… Anything” will put most people into fight-or-flight mode. However, labeling a law that expands government surveillance and reduces civil liberties the Patriot Act will put people at ease. Arguably it is also catchier than “The Government Taps Your Phone Act.”

We tend to think of propaganda as grand campaigns of misinformation, but the truth is that it doesn’t have to be splashed on posters or on your TV screen to manipulate how you think and feel. Words are powerful. They shape our perception of the world around us, and changing words to have different meanings or falsely labeling actions or laws can have an incredible impact on people, even as we know it is happening. These days, the manipulation of language has never been so apparent as during the infancy of the Presidency of our current President Infant. Donald Trump is a man who labels his opponents “Crooked Hillary,” or “Lying Ted.” He -despite his limited vocabulary- is actually a master at using subconscious secret code words. Words like “get” or “achieve.” He uses props as a tool of distraction, because of course he does. Remember, Donald Trump is a reality TV producer. He knows how to put on a show.

Now you have his surrogates on TV openly lying about easily disprovable facts and then calling them alternative. In our opinion, this is perhaps the most dangerous and chilling thing that has come out of the Trump White House. Calling lies by any other name is how reality starts to warp. Remember, words have power, and the phrase “Alternative Facts” is already trending. We laugh at it now, but it is entering the lexicon like a slow moving virus. If it gains ground then it will give Donald Trump and his team a safe and reliable place to hide their lies. We cannot let that happen. We cannot play their propaganda game. We need to call things what they are, and a lie is a lie.

Donald Trump Lies for the Good of the Nation
According to The Oxford Companion to American History, the word propaganda is defined as “the deliberate attempt by the few to influence the beliefs and actions of the many through the manipulation of ideas, facts, and lies.” In the end, even the word propaganda is a work of propaganda. It is a word used to downplay what is essentially a campaign of manipulations, lies, and falsehoods. In the past, we have justified certain actions as propaganda, because we understand their end game. The Catholic Church wanted to propagate its religion. America wanted to support its efforts in the World Wars. Businesses and advertisers want to sell you things. However you feel about those goals -right, wrong, or indifferent- at least we understand them.

The problem with Trump’s new alternative facts is that they only seem to serve one end goal: the ego of Donald Trump. They do not serve a national good or even a bottom line. They are all about Trump and how he wants us to perceive him. This is extremely worrying. When propaganda only serves the needs of one person’s ego you get countries like North Korea or Zimbabwe. Remember, Julius Caesar used propaganda not for the good of Rome, but for the good of himself, and a few short years later the Republic of Rome was never the same. We are not necessarily condemning or condoning the practice of propaganda itself. To an extent it is a fact of modern life, but when it is used to prop up the perception of one individual above all others, especially a -now- powerful world leader, then that is when we start to worry.

So it is worth remembering that these are not alternative facts, and we refuse to call them such. History has shown what happens when you stop calling a lie a lie.



Well, 2016 has come and gone and it has been filled with some ups, and a lot of downs.

Here at The NYRD we have been hearing a lot about this “social media” thing that has been sweeping the world wide web, and decided, “Golly gee, maybe we should check that out.” So we thought it would be fun to post this year’s events as a typical Facebook feed, and quite frankly we are amazed no one has ever thought of doing this before… ever.

In doing so, we have come to realize that this “Facebook” thing is truly the wave of the future. Unfortunately, we also quickly realized that we had created one of the most depressing things to appear on your Facebook feed since your Uncle Elliot started his vlog. Yet, without further ado, we give you 2016, in Facebook form.


Happy New Year, everybody. We will be back in 2017.


electoral college

It’s time to pack your bags, get your books, and load up the car, because we are off to college. No, we’re not talking about the type of college where you sit in a classroom, live in a dorm, and get up to outdated stereotypical 90’s hi-jinks. We are talking about the Electoral College. We can only assume that there is less drinking… though maybe not this year. Our Electors have been in the news a lot recently, but before we judge them on their actions or inaction it will probably be beneficial to go back and look at the system as a whole, from a historical point of view.

Keg Stands for Democracy
According to Article II, Section 1, Clause 2 of the US Constitution: Each State shall appoint, in such Manner as the Legislature thereof may direct, a Number of Electors, equal to the whole Number of Senators and Representatives to which the State may be entitled in the Congress: but no Senator or Representative, or Person holding an Office of Trust or Profit under the United States, shall be appointed an Elector.

It may not have escaped your notice that there is nothing in there about popular election. That is because the Framers were not crazy about the American population voting directly for the President of the United States. Instead, they saw the President being elected more like how the Pope is elected, through the College of Cardinals -Go Fighting Cardinals!- Electors were meant to be the most knowledgeable and informed individuals from each State, and they were meant to select the President regardless of state or party loyalties. Before we go any further, you need to understand that this process was established not because the Framers thought the American population was stupid, -Well, everyone but Hamilton anyway- but because the Framers were dealing with different issues and fears that we don’t even consider today.

We have talked before about how our Founding Fathers were more concerned about issues we don’t even think about anymore. The Electoral College was set up because the Framers were dealing with thirteen colonies all jealously guarding their own power and fearful of a federal government. Our system was therefore meant to be a balance between states’ rights and federal authority. It was believed that if the President was elected through a popular vote, the public would not have enough information to make an informed decision. After all, at the time the population of the US was 4 million people, all spread along a thousand miles of Atlantic seaboard. The fastest form of communication was a man on a horse. So the Founding Fathers believed that people would just end up voting for the “favorite son” of their own state, and nothing would get accomplished, or the vote would always go toward the states with the most people. So the Electoral College was created as a way to safeguard the rights of smaller states and assure the governors and legislators of all the states that they had a say in picking the President.

Now, you may still think it is a stupid system, but remember that you are looking at it through 21st Century eyes. When the Constitution was written, the world was a different place. Back then, the President did not have the kind of power he has today. In fact, until the 1930’s the President’s power was limited. Aside from a few exceptions, such as Lincoln and Roosevelt, Congress was seen as the more powerful entity. It is also worth mentioning that people like Washington hated the idea of political parties. Madison and Hamilton believed they were inevitable, but thought they would still be amicable toward one another. They created the Electoral College to be a tool of states’ rights, not for the benefit of political parties. The Framers did not anticipate the hyperpartisan world of 2016, and they did not foresee America being split by red and blue states.

Learning in College
Here is the thing: the system never really worked, even in the beginning, and the cajoling and backdoor politicking it encouraged had some pretty poor consequences. The Electoral College had a hand in the Election of 1824, where John Quincy Adams was elected over the more popular Andrew Jackson, and it may even be -at least- partially responsible for getting Hamilton killed. By the 19th Century it was pretty clear that the Electoral College needed to be changed, and it was. After the Election of 1800, and the rise of political parties, the 12th Amendment empowered the Electors to cast only one vote for a political ticket, instead of the two individual votes -for President and VP- they originally cast. Also, electors became selected by the voters, as opposed to the state legislators. By mid-century all the states were voting for their electors, making it a permanent tradition in US elections, but still not technically a law. Currently, 29 states have laws that force electors to vote based upon the popular election result, making the electors all but honorary positions.

As you can see, the Electoral College has never been static. It has been shifted and amended to deal with many new aspects of the growing nation, but it is still not the same as a populist election. Even during the debacle of 1800, the idea of moving to a popular vote system was not really considered. The horrors of the French Revolution tainted the idea of populist rule for a lot of the founders. In fact, even as early as 1788 people like Alexander Hamilton were rapping about the dangers of a populist movement: “The process of [electoral college] election affords a moral certainty, that the office of President will never fall to the lot of any man who is not in an eminent degree endowed with the requisite qualifications. Talents for low intrigue, and the little arts of popularity, may alone suffice to elevate a man to the first honors in a single State; but it will require other talents, and a different kind of merit, to establish him in the esteem and confidence of the whole Union.” Hamilton envisioned the Electoral College as a place where the most qualified political thinkers gathered and had a serious discussion over who was best suited to be President, so as to prevent demagogues from riding into office on a swell of ridiculous promises made to an overly-zealous electorate. In France, that sort of populist movement led to the guillotine, but in America it has now led us to something far more dangerous and with much worse hair.

It is also worth mentioning that slavery played a part in the continued existence of the Electoral College -because of course it did. This is America and our demons haunt every institution we own- At the nation’s founding, James Wilson from Pennsylvania proposed and argued for a direct vote over the Electoral College, but James Madison of Virginia argued against it: “The right of suffrage was much more diffusive in the Northern than the Southern States; and the latter could have no influence in the election on the score of Negroes.” In other words, the Northern states had more free white men of voting age than the Southern states did, and a direct election would result in more Northern victories. However, counting non-voting slaves as three-fifths of a person -even though they were not allowed to vote- gave slave states an advantage when it came to the number of Electors they received. Thus, the slave state of Virginia became more powerful in the Electoral College system than the free state of Pennsylvania. That might also be why four out of our first five Presidents were from Virginia.

Final Exams in History
So what is the point, professor? Well, think about this: four times in our history this system has put the unpopular candidate in office over the popular one. By almost every metric the Electoral College is broken. It does not even protect small states or low population areas from the power of big cities and large states. If anything it encourages candidates to spend most of their time campaigning in just a few swing states, while neglecting the larger country. If we had direct elections, then candidates could not afford to miss the “fly-over” states any more than they could afford to miss New York or Los Angeles. Even worse, the system depresses voter turnout. Voting Republican in California or Democrat in Texas feels like throwing your vote away, because it is. That is bad. People who don’t show up to vote in national elections don’t vote in local elections either, and those are arguably more important. In a direct system, every vote would matter, no matter where you live, and that is a lot more incentive to go to the polls.

After the 1800 election, the 12th Amendment irrevocably changed the way we elect our President. Among other things, it openly acknowledged the influence of political parties and empowered them to select one candidate for President and one candidate for Vice-President. This idea ushered in the possibility of a populist President. It laid the groundwork for the Electoral College we know today, not Hamilton’s idea of a room full of thoughtful electors, but just people nominated by political parties to rubber-stamp the predetermined election results. It created the idea of a popular vote in all but practice. Perhaps even more ironic, in 2016 the Electoral College system functioned exactly opposite to how Hamilton intended. It did not prevent the election of a populist demagogue, but instead ensured it. If we had been using a direct vote system Donald Trump would have lost by over 2 million votes.

But this article is not really about Donald Trump. No, it is about Hamilton and Washington and Madison and Adams and all the rest. We have to remember that our Framers empowered us with the ability to change the Constitution as we saw fit, because they may have gotten the Electoral College wrong but they still knew what they were doing. They lived in a world of 4 million Americans spread across thirteen colonies with only a few dirt roads connecting them. They could not envision the rise of the Internet, or modern transportation, or cable news networks, or even political parties -which took less than eight years to fully form. Yet, they gave us the tools to amend our founding document because the world changes and our needs inevitably change with it. That was why we ratified the 12th Amendment, and maybe that is why we need to change the Constitution again to do away with the Electoral College.

Our Friends over at Shortcut have clued us into this amazing and interactive history of online gaming. So check out their work and take a tron cycle ride through the world of games online. From dial-up bulletin boards to augmented reality headsets, online gaming has leveled up a lot since the 1970s. These advances wouldn’t have been possible without hardware breakthroughs and the evolution of the internet itself. Our thanks to the people at Shortcut for sharing this graphic and letting us share it with you.


Today marks the arrival of Marvel’s premier magic user, Doctor Strange, but you don’t need to look into the pages of a comic book to find examples of wizardry and magic. No, in fact one of the most famous sorcerers is a man you may have never heard of, and unlike the newest Marvel hero to be played by Benedict Eggs Cumberland Batch, this strange doctor actually existed. So join us as we delve into the wild, weird -and historical- world of John Dee, the royal magician of England.

An Ancient One
Dr. John Dee was born on July 13, 1527, and lived for 81 years. During that time he was a mathematician, a cartographer, a physician, an astronomer, and a philosopher, but he was also a man who dabbled heavily in astrology, alchemy, divination, and all manner of the occult. It has been speculated that Dee was the inspiration for Faust, the magician who made a deal with the devil, and even Shakespeare’s Prospero from The Tempest. Both tales tell of a man who sought occult knowledge and power at all personal costs, and though that is not a totally accurate assessment of John Dee, it is also not a completely unfair one either.

The 16th century was generally not a great time to be a scientist, and Dee was above all a scientist. It was a time of transition when the Catholic church still viewed things like math and science as possible heresy. So for Dee, dabbling in alchemy was just as dangerous as dabbling in algebra. In 1555, Dee was tapped to cast horoscopes for Queen Mary and her -then imprisoned- half-sister Elizabeth… because there was no psychic hotline back then. Dee predicted that Elizabeth would take the throne and have a long reign and Mary… not so much. As you might imagine, the sitting queen at the time was not too pleased with that prognostication and had Dee immediately thrown in jail. He did eventually exonerate himself to the royal court and to the Catholic church, even earning a close friendship with the priest who administered his “religious examination,” which sounds worse than the SATs. In 1556, he even presented Queen Mary with a plan for a national library in order to preserve books. That plan was rejected, because… We don’t know why. Either way, he instead focused his attention on amassing a personal library, which became the largest in England at the time.

In 1558, Queen Mary died and Elizabeth took the throne as Dee predicted. One of the new queen’s first acts was to name Dee the Royal Astrologer, and she basically gave him a royal pardon to conduct and work any sort of magic/science he might want to. Basically, anything he did was labeled as “white” magic and was considered sanctioned by the crown. In return, Dee became one of her most trusted advisors. His knowledge of cartography and navigation laid the groundwork for English voyages of expansion, and it is said he even coined the term, the “British Empire.” Dee became a celebrity scientist -the Neil deGrasse Tyson of his day- a man whom everyone believed had all the knowledge of how the world and the cosmos worked. Only one person disagreed: Dee himself, and it drove him further into the practices of witchcraft.

The Orb of Agamotto
It is speculated Dee became frustrated with his failure to grasp all the “secrets of nature,” and the definitive mechanisms of how the world and the universe worked. So he turned to a tome called the Steganographia, an infamous 15th century “black” magic manuscript that promised to help him finally accomplish his goals. Dee spent several years unsuccessfully trying to contact angels through a practice known as “scrying.” -Where he got to look at the top cards of his library, and then put any number of them on the bottom of his library and the rest on top in any order… Oh wait, that’s the wrong type of Magic– Eventually, he claimed that the angel Uriel appeared to him and gave him a crystal ball, which he could use to directly contact angelic beings, but doing so left him drained and unable to remember the conversations he had with the heavenly creatures. So he brought in another scryer who had greatly impressed him, Edward Kelley.

By all accounts Kelley was practiced at the art of conducting seances and other spiritual contacts. He was also a known counterfeiter, and very possibly a conman, but in a very short time he became Dee’s closest advisor and companion. Thus, Kelley would commune with the angels, while Dee recorded the sessions and the words that were spoken. Many of these were written in the “Enochian” language, which Dee claimed was an angelic language. Dee described hundreds of conversations with biblical angels, monstrous creatures, and divine women of grace and beauty. The scrying sessions became an obsession for Dee. He believed that with Kelley’s help he was unlocking the secrets of the universe for the betterment of mankind. Unfortunately, Jane, Dee’s wife, believed that Kelley’s intentions were not as pure.

During this time Dee and Kelley fell into favor with a Polish nobleman, Albert Laski, who was invited to sit in on some of these scrying sessions. This, unfortunately, also meant that the Elizabethan court began to suspect Dee of conspiring against the crown and dabbling in things that were unnatural. He was almost certainly being spied on by agents of Elizabeth. Laski, however, persuaded the pair of magicians to come to Poland. Dee was dubious of the trip, but some “prompting” from the angels and Kelley eventually convinced him to go. Unfortunately, what Laski had failed to mention was that he was broke and out of favor with the Polish court. There was no warm welcome for them in Poland, and in 1583, Dee, Kelley, and their families began a nomadic life of traveling around Europe giving audiences and seances to various nobility, including the Holy Roman Emperor. However, Dee was never fully trusted by the aristocracy of Europe, especially the Pope, who labeled him a heretic. Others just believed -as some still do today- that Dee was a spy for Queen Elizabeth, and thus the families were never truly welcome in any one place.

Return to the Sanctum Sanctorum
John Dee should have listened to the warnings of his young wife. In 1587, while in Bohemia, Kelley had a sudden “revelation” from one of the angels that proclaimed that Dee and Kelley needed to share all possessions, most importantly their wives. At this point Kelley was the more trusted of the two by the courts of Europe, especially by the Holy Roman Emperor. He had gained renown as an alchemist of some talent and worth, and claimed to have the secret to the Philosopher’s Stone -or Sorcerer’s Stone, as Rowling calls it for American audiences. Dee, on the other hand, was blinded by his belief in the scrying sessions and his overwhelming desire to gain the knowledge of the angels. He agreed almost without question to an act that -at the time- was an extreme form of sin and black occultism. Almost immediately after the wife-sharing incident Dee broke off his association with Kelley and returned to England, but nine months later a son was born. It was very likely the progeny of Edward Kelley and Jane Dee.

Things only got worse upon returning to England. There, Dee found his home of Mortlake in disarray. The estate was vandalized, his famous library was ruined, and most of his valuable scientific instruments were destroyed or stolen. Kelley continued on in Europe as an alchemist, gaining further favor and fame in the Holy Roman Empire -and he was eventually made a Baron… before being imprisoned and executed some years later. Dee, however, found little fame, fortune, or even a quick death in England. Due to his occultism, London and the royal court were no longer a hospitable place. Perhaps in recognition of all he did to help her secure her throne and establish the British Empire, Elizabeth did appoint Dee Warden of Christ’s College in Manchester, but he had little control or real renown in that position. After Elizabeth died, James I took over the throne and treated Dee with even more contempt and less support. Dee spent his last few years penniless. He died in Mortlake in 1608.

Dr. Strange Tales
Dr. Stephen Strange is the most powerful sorcerer in the Marvel Universe. He often consults occult books, communes with spiritual powers, and casts all manner of spells that would have gotten him imprisoned or worse in 16th century England. In some ways he is the foil of Dr. John Dee. Strange’s hubris and lust for knowledge took him from an egotistical surgeon to the protector of the world. Dee’s journey for knowledge did nothing but pave his path to hell with golden intentions.

In our world the life of a historical magic user is often less glamorous and less magical than a comic book character’s, but Dr. John Dee was a scientist without limits. His study of the occult was not so much occultism as it was a way of furthering his scientific interests and inquiries. He did not see himself as practicing magic, but conducting research into the science of angels, astrology, and alchemy. Over the centuries since his death, John Dee has become somewhat of a legendary figure in the occult circles of modern life, and writers from Marlowe to Shakespeare have used his example as a cautionary tale of great men who seek unsafe knowledge. Yet, the life of John Dee was more than an allegory of power. He was a brilliant man, a scientist, and one of the chief architects of the Elizabethan era. But then again, who knows?

By the Mystic Moons of Munnopor, maybe out there in the multiverse John Dee still exists and laughs from his hallowed sanctuary as some sort of real-world Sorcerer Supreme.

Star Trek

Star Trek is turning 50 this week. The classic franchise that has always been about future people doing future things in a Galaxy far far… oh wrong one… where Kirk, Spock, McCoy and all the rest boldly go where no split infinitives have gone before. The Original Series spawned thirteen movies, four more TV shows –plus one more coming in the Fall– and has become a cultural touchstone. The series’ message of hope for humanity and its ability to tackle weighty matters through classic science fiction storytelling has become a staple of the franchise, unless Jar Jar Abrams is in charge. Over the past five decades Star Trek has had its stumbles and flops -aka The Final Frontier– but it has always given us more than enough quality to make up for the bad.

However, it has also given us something else, incorrect predictions about our future. By the very nature of a show like Star Trek, it had to make some assumptions about where humanity was heading. That means through backstory, set details, and other clues Star Trek has predicted some strange things for our present world. Some of them were not far off, some of them were very far off, and some were just strange. So in honor of fifty years of living long and prospering, let’s take a look at 50 years of predictions that Star Trek has made about our own time.

1968 Orbital Nuclear Weapons
According to “Assignment: Earth,” the United States of America launches a nuclear weapons platform into orbit above Earth. In the real world this didn’t happen, obviously. A nuclear weapons platform in orbit would have unbalanced the Cold War and possibly ignited a global war. In the episode it was done so that the Enterprise -which had traveled back in time- had something to contend with and use as a lesson to show the “primitive” 1960’s humans that nuclear weapons are bad. It is also worth mentioning that the episode aired on March 29, 1968. So we’re also hoping that no one on the writing staff had government clearance enough to know something we don’t.

1986 Transparent Aluminum
During another time traveling escapade Kirk and crew travel back to 1986 to steal two whales… because reasons. However, in order to accomplish that Scotty gives a manufacturer the blueprints to design transparent aluminum, which is basically a tougher form of glass. Scotty needs to manufacture the material as a tank for the whales… again for reasons… so he gives the formula to the humans of 1986. This whole thing was treated basically as a gag for the movie, The Voyage Home, but it is worthy of this list because in 2015 the US Naval Research Lab actually invented transparent aluminum. So Star Trek was right, they were just 29 years off.

1987 The New York Times Closes
Another throwaway line from The Voyage Home claims that the New York Times Magazine closes its doors in 1987, as one of the last newspaper magazines of its time, which is a pretty ballsy statement considering the movie came out in 1986. Maybe the producers just didn’t like the New York Times. It is also worth noting that they were not completely wrong, just a little too early. Newspapers and news magazines are closing up shop quicker than ever these days thanks to the Internet; however, the New York Times Magazine is actually still in production.

1992 Eugenics and Genetic Engineering
The most glaring prediction in Star Trek was its prophecy of the widespread use of genetic engineering by the year 1992. That is the year Khan Noonien Singh… KHAN!!!… rises to power in the Middle East and the Eugenics Wars begin. According to Space Seed and The Wrath of Khan, humanity created a race of augmented humans, called Augments. These genetic supermen rose to power in various Middle Eastern and Asian countries in the 1990s… because Bill Clinton… maybe… Khan at one point held power over a “quarter of the world.” The Eugenics Wars were a series of conflicts between the various Augment dictators of some forty nations. Normal humans eventually rose up and overthrew the Augments in 1996, condemning most of them to die as war criminals. Khan and 84 of his followers escaped Earth aboard the cryogenic sleeper ship SS Botany Bay.

As you can tell, none of this actually happened. In fact, the biggest news in genetic engineering in 1992 was that China became the first country to introduce a virus-resistant tobacco plant. With the mapping of the human genome, the benefits and risks of human genetic engineering are still being debated in the science community today, but we are no closer to actually creating genetic supermen than Kirk is to successfully resisting the temptations of any green-skinned woman.

1994 Cryonics and Cryogenics
Speaking of cryogenics… According to the Star Trek: The Next Generation episode The Neutral Zone, by 1994 cryogenics are so widespread and safe that people are willing to have themselves frozen at the time of death, and even stored on satellites until cures for their diseases can be found sometime in the future. As you may have guessed, we have not quite perfected cryonics or cryogenic preservation for humans. The closest we have come is being able to keep human embryos in cryogenic stasis. There is, however, that portion of people who freeze their brains when they die, Walt Disney style… which our lawyer has reminded us to tell you is actually a myth.

1996 Life on Mars
In the Star Trek: Voyager episode Future’s End, it is briefly implied that scientists discovered ancient microscopic Martian life in 1996. The episode was filmed several days after NASA’s announcement in August of 1996 of possible fossilized evidence of microscopic life in a Martian meteorite. However, that claim has never been fully confirmed, and as of this article there is still no solid evidence of life ever existing on Mars.


2001 The Millennium Gate
Another Voyager episode, 11:59, depicted the construction of the Millennium Gate. Construction began in 2001 and it was completed in 2011 as a way to commemorate the beginning of the 21st century. For some reason it was built in Portage Creek, Indiana, and was a tower 1 kilometer high and 3.2 kilometers wide. The building was a self-contained biosphere with its own ecosystem and over six hundred stores for shoppers to enjoy. It was covered in solar panels and eventually served as a model for the first Martian colony. The Millennium Gate became a national landmark on par with the St. Louis Arch or the Empire State Building, and it could be seen from space. This marvel of modern engineering was -of course- never actually built. The tallest building in the world is currently the Burj Khalifa in Dubai, standing at only 830 meters. It is also doubtful that anyone will be looking toward it as a model for a Martian colony.

2002 The Nomad Interstellar Probe
According to the Star Trek: The Original Series episode The Changeling, in 2002 Earth launched the Nomad probe, our planet’s first interstellar probe, with the mission to seek out extraterrestrial life. Of course, in typical Star Trek fashion this comes back to bite Kirk and crew when the probe encounters an alien intelligence, gains sentience, and goes on a killing spree. As of 2016 we have yet to launch the Nomad, but Voyager 1 entered interstellar space in 2012, making it the first man-made object to leave our solar system. And currently there are talks about the Starshot project, which might be able to propel a series of small probes to Alpha Centauri within a single human lifetime.

2015 Planetary Baseball League
In the Star Trek Universe, by 2015 baseball had become such a popular worldwide sport that Major League Baseball was supplanted by the Planetary Baseball League, which included teams from across the planet, such as the London Kings, the Crenshaw Monarchs, and the Gotham City Bats. -Most likely that last one was meant as a Batman joke- One of the most notable players is Buck Bokai, who breaks Joe DiMaggio’s 56 consecutive game hitting streak in 2026. In 2032 the Yankees win the World Series, and the last World Series is officially held in 2042, after people’s interest in baseball fades. It is almost humorous that Star Trek created a world where baseball became anything but an American sport, especially since the last time baseball was played in the Olympics was in 2008. As Star Trek predicted, the sport is growing less popular, but we doubt it will ever have enough fame to actually get a professional team from cricket-loving London.

2018 Sublight Propulsion
We suppose this one might be true, but it seems doubtful. In Space Seed, it is said that by 2018 sublight propulsion makes cryogenic sleeper ships obsolete. This could be true, considering that “sublight” is literally any sort of propulsion that goes slower than lightspeed. We have some pretty ingenious forms of propulsion in space, including light-sails and ion drives. However, the bulk of our propulsion is still done through chemical rockets, and we still do not have an engine that could get us to another star system in a shorter time than it would take to make the trip using the -also still fictional- cryogenic sleeper ships.

Other Future Predictions
Star Trek also has a few predictions for the coming years including:

  • 2024: Ireland Reunification – Northern Ireland becomes part of the Irish Republic, which could happen thanks to Brexit.
  • 2024: French Political Strife – France becomes unsafe for tourists thanks to battles between “Neo-Trotskyists” and “Gaullists.” Ironically, -and chillingly- France is currently facing similar declines in tourism thanks to recent terrorism.
  • 2024: Sanctuary Districts – Sanctuary Districts are set up in major cities across the US and the homeless and poor are separated from the rest of the population and put into ghettos for the destitute and jobless. This is the strongest evidence to show that in the Star Trek Universe, Donald Trump was elected President.
  • 2026: World War III – The Third World War lasts until 2053 and results in nuclear genocide, population cleansings, and the near destruction of most world governments… Thank you, President Trump.

So we can look forward to that, but -all in all- Star Trek has been an amazing and sometimes weird ride through history, science, and imagination. Despite the fact that their history and our present don’t always line up, we can still take the lessons of Kirk, Picard, Sisko, and the rest and apply them to our time. After all, warnings of a fictional World War III might be the best way to prevent it from actually happening. Our Earth has not suffered through or created the same things as Star Trek’s Earth, but that does not mean we cannot share in their sense of hope for the future. We may not have had the Eugenics Wars, but who knows what the future might hold? One day we might have space travel, Starfleet, the Federation… and maybe even a London baseball team.


You may have heard people say that “Race is a social construct,” similar to language, national boundaries, or Hogwarts’ Houses, and much like Hufflepuff, the concept is one mired in identity, economics, and power. Understanding the history of the labels that we wear and assign is about understanding the history of shifting social classes, politics, oppression, and even slavery. Make no mistake -in the end- race is and always has been a social construct, but it is one of the biggest and most heavily reinforced collective ideas in the history of humanity.

A Slave to History
Slavery was not a new concept in the world by the time Europeans settled on the American continents. Ancient Romans, Greeks, Sumerians, Egyptians, and others had kept slaves for centuries and passed their tradition on to the cultures that followed. However, slaves in these times were not delineated based upon the color of one’s skin. Instead, being a slave often meant that you were a prisoner of war, captured by pirates, or just in any circumstance where you were not recognized as a “citizen of the nation.” In fact, wealthy Romans often kept Greek slaves as highly sought-after tutors and house servants, because in antiquity slaves were valued for their intellectual abilities as well as their physical attributes.

At the time, slavery also existed in similar forms among the native populations of some Pacific Islands, Africa, and the Americas, but especially among white Northern Europeans. Warring tribes would often take prisoners from their defeated neighbors and force them into varying degrees of servitude. The word “slave” even comes from the word “Slav,” because during the Middle Ages -when the English language was taking its modern form- some of the most common slaves were prisoners from the Slavonic tribes captured by the Germans. They were often sold to Arabs, meaning that it would not have been uncommon for Middle Easterners to have white slaves. The French Crown even enslaved its own people, filling its war-galleys with French Protestant rebels who were forced to row the mighty ships into battle. However, all that changed with the introduction of colonialism.

By the late 15th and early 16th centuries, Portugal had begun to open up trade with the nations of Sub-Saharan Africa. Initially, Europeans were more interested in African ivory, diamonds, and other riches, but they also purchased the African prisoners captured during wars between African nations. Thus, when the Portuguese began building the colonies of São Tomé and Príncipe and setting up Caribbean sugar plantations, it was African slaves they relied on to do the bulk of the work. Native American populations often died of illness or were able to escape and disappear, knowing the land and the local tribes. African slaves were seen as ideal because they were more resistant to European diseases and were strangers in the New World. This led to an influx of African workers in the Americas not just for Portugal, but for England, Spain, and other growing colonial powers.

However, this also led to a growing moral dilemma for the Christian nations of Europe. Originally, slavery was justified because Africans and others were non-Christians. In Spain it started with the Inquisition, where non-Christians were determined to be less than human. Others rationalized the practice of black slavery by using a passage in the Bible about Ham, who committed a sin against Noah. His black descendants were condemned to be “servants unto servants.” However, as more and more missionaries and pastors converted free and enslaved Africans alike, the religious rationale found itself on shakier ground. After all, how could one be expected to enslave another human who worshiped Jesus? In 1667, Virginia created a law that stated that Christian Africans could be kept in bondage, not because they were heathens, but because they had heathen ancestry. It was believed that God had marked them as “mongrels.” From that point forward slavery started to be about race, not religion. Blacks became something less than human in the eyes of powerful whites. Where once indentured white servants worked side-by-side with black slaves -often intermingling and marrying- after the 1600s laws were created that prevented whites and blacks from intermarrying or creating mixed “race” offspring.

White Makes Right
We are not claiming slavery was ever okay, but before the age of colonialism slavery was more of a local matter. Yet, with the discovery of the New World, it became big business. Suddenly, the dehumanizing of Africans was a matter of profit, and that meant governments, businesses, and the powerful white men of the world had a vested interest in making sure the myth of race became solid fact in the minds of all Europeans and Americans. It was a matter of profit that white people thought of African slaves as entirely different biological entities, beings who were unlike them or their wives or their children. After that it became only a matter of time before classifications were applied to anyone else who was not “white,” such as Asians, Natives, Indians, Muslims, Jews, Italians… wait what?

The term “white” is a purposely nebulous term. It does not actually define any type of ethnic or national group. “White” was created basically to mean “normal.” Anyone who was non-white was the “other.” They were not normal by the standards of the established white power structure. Jews, for instance, -despite being light-skinned- were often considered something less than white. As far back as medieval times, Jews were demonized as witches and forced to flee countries in the face of Christian prejudice. Before the 1800s most immigrants to the United States were from Northwestern Europe: England, Ireland, France, Germany, etc. By the end of the Civil War and well into the 20th century, America started seeing more immigrants from Southern and Eastern Europe: Greeks, Italians, Russians, Poles, etc. These people, despite their complexion, were still seen as non-white. They had odd customs and spoke different languages. Italian Americans were even lynched in 1891 in New Orleans. Despite initial antagonism, Italian Americans and most European immigrants have since been accepted into the “white” power structure. This is partly due to their assimilation, but also partially due to the mass of Latin American, Indian, and Asian immigrants that arrived during the mid to late 1900s. In comparison, Italians and Poles no longer seemed so strange, so they became “White,” which at least was a more generalized and benign classification than the word some Americans used for white people before… oh… 1940 or so.

We don’t use the word “Aryan” anymore, for obvious reasons, but we did. In fact, to a lot of European Americans it was a source of pride and a bestselling 1907 book. Make no mistake, the word was very much tied to racial superiority even before the Third Reich. Funny enough, we do still use the word “Caucasian,” which is less “goose-steppy” but no less self-aggrandizing, inaccurate, or meaningless. Caucasian comes from the Caucasus, the area that borders Europe and Asia. That is not where all white people live, nor where all white people originated. In 1795, Johann Friedrich Blumenbach picked it as a term to represent white Europeans because he wanted to underscore the beauty of the white-skinned. The region also carries a lot of mythological overtones, featuring in parts of the story of Jason and the Argonauts. So, really it is just another way to say that “white” people are better than the rest, but that idea of biological superiority is as scientifically false as the myths of the Caucasus.

The Science of Prejudice
Science is not bigoted, but scientists and thinkers can be, and that has played its part in the myth of race. The idea of polygenism started with philosophers in the 1700s, like Blumenbach and Immanuel Kant. Pseudo-science like phrenology developed around the same time as a way to prove that other races were intellectually inferior to white people. It was also used to justify the subservience and “timidity” of black slaves. Pieter Camper in 1770 measured faces and declared that Greco/Roman statues -the “ideal” human- had a 90-degree facial angle, Europeans an 80-degree angle, Blacks a 70-degree angle, and orangutans a 58-degree facial angle. Thus, he believed that he had established the hierarchy of mankind.

After phrenology was debunked, the 20th century turned toward eugenics. Once again, pseudo-science became popular as the rich and elite white population justified their own status through biology. It was also used to explain why white people could never be allowed to “pollute” their gene pool with black DNA, lest the children inherit undesirable genetic traits like “criminality” and “pauperism.” Apparently being poor or crooked was a genetic trait in the 1920s and 1930s. It also led to the sterilization of undesirable populations. Those who were believed to be mentally impaired, black, Mexican, or Asian were coerced or forced into sterilization in the United States, so that their genes could not corrupt the “American race.” Thankfully, eugenics and sterilization fell out of favor after a man named Hitler became the poster child for the movement. Yet, even up to the 1970s, as many as 25% to 50% of Native American women had been sterilized.

For the record, most individual humans vary from each other genetically by about 0.1%. 85% to 90% of that variation has to do with your family and genetic heritage. Only 10% to 15% of that variation has to do with what continent your ancestors originated from. That means an Irish American could be closer -genetically- to a Kenyan American than to someone in Ireland. “Race” does not exist, biologically speaking, and even if it did, how do you differentiate between “Black” and “White?” After all, African Americans have -on average- about 16.7% European DNA. At what percentage does someone stop being “Black,” and start being “White?” 40%? 50%? 80%? Or does it really have more to do with our social perception than any actual biological makeup?

Fade to Black
The ideas of “Black” and “White” are so impossibly vague. The only difference between the two is that our society values one over the other. For instance, it is common for people to point out that Barack Obama is half-white, but would that get pointed out so frequently if he was a convicted drug dealer? No, because we have been conditioned by centuries of social reinforcement to believe that “race” exists, and since we cannot define it in precise biological terms we instead define it socially. Black is associated with “criminality,” “pauperism,” and “low intelligence.” Yet, the idea that one complete subset of the population is preconditioned to be, act, or do certain things is both ludicrous and scientifically untrue. If you don’t think so, then talk with Neil deGrasse Tyson and see what his take on the stereotype of black intelligence might be.

Race is such a deceptive and insulting word. It implies something biological that is not true. Elves, Dwarves, Faeries… these are races. They have night vision or +2 Strength, but humans of varying skin color have no different advantages or disadvantages over one another, besides the normal delineations between one human individual and the next. “Race,” plain and simple, is a social construct. It was created by wealthy white men to justify an economic system of slavery and reinforced by bad science and a prejudicial power structure afraid of losing social and economic status. It only has the power and truth that we choose to award it, which means, much like Faeries, if we stop believing in it, maybe it will finally lose its power.


So, our time on the road is coming to an end. As we make our way back to the Big Apple we find ourselves reflecting on our weeks of traveling across America. Sure, we could dwell on all the fun we had, all the interesting sights we have seen, or on all the things that went wrong. Yet, that has not been our real takeaway from this experience. Ultimately, our journey has always been about the people. We have met people from New York to New Orleans, from Chicago, IL to Fayetteville, NC, and for the most part everyone has been kind, caring, and amazing. It has not mattered whether we were in a blue state or a red state, Americans have proven to be truly special people, and that is something worth remembering in today’s climate.

Gateway to Understanding
The news media has this tendency to cast everything in a bad light. We get it, good news doesn’t sell. However, sometimes this leads to a skewed perspective on the world. Talk to anyone who does nothing but stare at their Facebook feeds all day long. “The world is coming to an end.” To look at all the bad news and to watch what is going on out there, you might think these are the end times. After all, violence is up, the economy is out of control, Donald Trump is being elected President. Surely the four horsemen are not far behind. Yet, that’s not the America we found out there, and it’s also not the first time we have thought like this either.

Traveling across the nation is also about traveling through our history. We spent some of our days visiting Native American sites, Civil War battlefields, and even Dollywood. When walking through such historic places it is almost impossible not to find yourself reflecting on the good and the bad of our collective history. That was especially apparent in St. Louis, where the Dred Scott case took place. For anyone not familiar, Dred Scott and his wife, Harriet, were slaves who sued for their freedom. The case was lost, but the Scotts were eventually granted their freedom by their masters. The Supreme Court case made national headlines at the time and proved to be a polarizing issue in the lead up to the Civil War. The court ruled that the Scotts were property and had no right to sue under the US legal system, but the legal battles raised awareness of the issues surrounding slavery and the decision helped galvanize support for the Emancipation Proclamation.

Similarly, while we were touring the famous St. Louis courthouse and learning about the case, one of our team picked up a newspaper. It was a souvenir reprint of the original newspaper that was published on the day the St. Louis Arch was completed. The main story was -of course- the towering new monument, completed in 1965. Yet, once you got past that fluffy and inspiring article, a further examination of the paper showed nothing but stories of political scandals and news of Vietnam. After all, 1965 was not exactly a calm year for the United States. So, picture holding that newspaper in your hand on the day it was printed. Despite the main story, it would not have been hard to look at all the scandal, war, conflict, and violence and believe that, “The world was coming to an end.”

Road Tripping Across History
That is kind of our point. Maybe we always think the world is coming to an end? Maybe that’s how it always goes.

In hindsight, we know that the world didn’t fall apart in 1965, and that slavery eventually did end, as did the Civil War, World War I, and the Cold War. It may seem that we live in a crazy world of turmoil, politics, terrorism, Pokémon, and heaven knows what else. It may seem that this is the worst time the world has ever faced. There are school shootings, and the government is coming for your guns, and taxes are out of control, and liberals are running the White House, and conservatives are running the Congress… But when you really look at history you begin to realize, that may just be the human condition. After all, wouldn’t you have felt the same if you lived through the Civil War? Didn’t they feel the same when Pearl Harbor was bombed? Didn’t people feel like the world could end any moment if the Russians ever dropped that bomb? And don’t even get us started on the 80’s -that was just one agonizing fear sandwich of a decade.

Our point is that during the Dred Scott case all the slave-owning white people probably thought the world was turning itself on its head, in the same way many evangelical Christians probably feel about Marriage Equality. When it was finally decided, all the abolitionists probably screamed about the “backwardness of the country.” During the Civil War Abraham Lincoln had people who hated and blamed him as much as he had people who loved him, much like our current President. The Korean War, the Vietnam War, the War in Afghanistan, the Iraq War… and maybe, we are a country doomed to repeat our own history. Maybe we are a people doomed to repeat our own panic attacks, because we always think our time is worse than anything before. Maybe it is time we all took a step back and looked at the world in comparison to our ancestors.

Two Steps Back, One Look Forward
The truth is that the world and America are doing pretty well. According to ForeignPolicy.com, “combat deaths are the lowest they have been in 100 years.” We’re smarter than ever before. We’re living longer than ever before. Violent crime is way down. The number of people living in poverty has been cut in half in the past two decades. 22% of the world is getting energy from renewable resources, the US deficit has been cut by nearly 50% since 2009, and our taxes are among the lowest in the developed world. We have smart phones, and the Internet, and Netflix, and the ability to travel to anywhere on the globe in hours. We have rockets and robots on Mars, and even robots that have left our solar system. Our medical care is better than in any time period before, and childhood death is so low that any incident has become unthinkable, which is something that was not true even 100 years ago.

Objectively, any person from any other time period would look at our world and claim that we lived in paradise, and yet all we see is the darkness. We tend to focus on the bad because of our perspective bias. We don’t know any other time or any other place, so we have no way to compare our personal experiences to those of someone else from another era. Couple that with the fact that we tend to look at the past with nostalgia. We see our childhood through rose-colored glasses. We like to think it was amazing, and the world was better back then. Yet, while we were happy and content playing stickball -or whatever- there was racism, and war, and sexism, and poverty. We look at the past as better, but it really wasn’t. So we believe the cable news networks, or the nostalgic listicles, or the orange political candidates that tell us the world has “gone to hell,” and that “America was better in the old days,” because we feel as if it is true. It’s not.

Listen, we are not saying there are no problems still in dire need of fixing, because there are. Climate change is real, systemic racism still runs rampant, and terrorism is one of the defining problems of our times. All we are saying is that maybe we can try and dwell on the good sometimes too. On our trip, we met a lot of Americans of all religions, races, and political persuasions. We learned that they are all good and decent people, reasonable in their beliefs and their respect for life. We sometimes tend to construct these bloated ideas about other places, or we demonize other people, and we never bother to verify those assumptions with our own experiences. Well, we here at The NYRD have done just that, and we can reassure you, life is pretty sweet, but there is no place like home.

New York, here we come.


During our journey across this great nation we decided to take a random pilgrimage to visit the Tomb of Lincoln. In order to enter you have to answer the Riddle of the Three-Headed Mary Todd and defeat the vengeful ghost of Stephen Douglas… in a standard format debate. Still, it was worth it to gain entry to see the actual tomb of Lincoln, where his body finally came to rest. We say finally because the matter of Lincoln’s body is one of those strange and quintessentially American tales.

A Procession of Mourning
Our sixteenth President died on April 15, 1865, and he eventually came to rest in Oak Ridge Cemetery in Springfield, Illinois, but getting there was a bit of a pain. You see, back before television it was determined that the people of the United States should have one last chance to say goodbye to their hero President. So Lincoln was extensively embalmed, and on April 21 loaded onto a train that was adorned with a giant picture of him. The train and Lincoln then made a 1,600-mile journey to cities such as Philadelphia, Boston, New York, and hundreds of other smaller stops. The train carried distinguished passengers, including the body of William Wallace Lincoln, who died from typhoid at age 11 and was being moved with his father to be buried in Springfield.

At each stop the corpse was unloaded, placed on a black carriage, and paraded to a spot where the public could arrive to view it for several hours or even days. Unfortunately, there was a flaw: refrigeration had not yet been invented. Embalming only does so much, and by the time Lincoln reached New York and went through a marathon viewing, the dead President was not looking so well. The corpse was exposed to the air for 23 hours. By many accounts, when Lincoln reached Springfield he was looking like something more fit for a horror show than a somber funeral.

At least he finally reached Springfield to be laid to rest in Oak Ridge, and you would think that would be the end of the story.

Grave Indecencies
On November 7, 1876 -election night- a band of grave robbers attempted to exhume Abraham Lincoln and hold the corpse for a $200,000 ransom -about $4 million in today’s money- and the release of a fellow counterfeiter. Their plan was foiled by the Secret Service, a group which Lincoln created. They did, however, manage to get the lid off the President’s coffin. So after that, the coffin traveled to a number of secret locations between 1876 and 1887. It had to be opened multiple times to confirm Lincoln’s identity, which sounds like a fun job.

It was Robert Todd Lincoln, the President’s oldest son, who finally suggested surrounding the coffin with a 10-foot steel cage and encasing it in cement, as if they feared the undead wrath of the Great Emancipator. In 1901, the body finally came to rest in its current tomb in Oak Ridge Cemetery, accompanied by Mary Todd and the majority of their children. Then, after 1901, the tomb started receiving a lot of curious and patriotic visitors. So, the Egyptian-like tomb was expanded in 1930 to add more rooms for visitors, and in 1960 it became one of America’s first National Historic Landmarks.

Visiting Lincoln
Finding the tomb of Lincoln is not hard, trust us. It towers above everything in the cemetery and is decorated with ornate statues depicting heroic recreations of the Civil War and other moments in the President’s life. When you walk inside the massive monument you realize that Lincoln -much like the ancient Egyptians- was buried with a friendly tour guide. A very nice man is employed solely to sit in the tomb all day and direct visitors about which way to go.

When you enter the tomb you follow the circular path around to see various statues of Lincoln at different parts of his life, as a debater, a soldier, a lawyer, and finally as the President. The path ends at the tomb of the man himself, a giant marble slab bearing his name and surrounded by flags. The body itself is interred 10 feet below the floor, but you can still feel as if Lincoln is in the room, silently judging the poor decisions that led to the election of Donald Trump. The flags arrayed around the coffin are of the state that the President lived in and the states that his current descendants live in.

That’s it. Then you stand there awkwardly, somewhat thankful for all the concrete and steel between you and the vengeful embalmed zombie corpse. After you are done you can find the bust of Lincoln that sits in front of the tomb and rub its nose for the good luck that Lincoln never had. Then you get in your car, get lost in a bad neighborhood of Springfield, and eventually find the highway and move on.

All we can say is that the impressive monument and the strange story of Lincoln’s final burial are a true testament to the man’s mystique and prestige as a President. During the hard times of the Civil War, Honest Abe was very much like Batman. He was the hero the United States needed, but not the one it deserved. He was a silent guardian standing vigil during our darkest night, and we suspect that if it wasn’t for all that concrete and steel keeping him in check, he might still be.

As for us, on to our next adventure…

Until next time, keep watching our adventures on Snapchat at thenyrd.

July 4th is coming up and that means barbecues, fireworks, and an annual re-watching of the Jeff Goldblum/Bill Pullman classic, Independence Day. Yet, even though July 4, 1996 is a historic date in humanity’s contact with extraterrestrial life, it is not the only entry in the history of alien invasions on this planet. People have been seeing little green menaces for years and, unfortunately, not all of them have been welcomed to Earth with a Will Smith-sized fist.

Mars Attacks
In order to understand humanity’s fascination with aliens we should really turn the clock back to Percival Lowell, who in his 1895 book, Mars, proclaimed that the Martian surface was covered in canals created by an advanced Martian civilization. Lowell was not the first person to talk about these “canals.” That honor goes to Giovanni Schiaparelli, who first observed the “canali,” or channels, through his telescope. However, Schiaparelli stopped short of attributing them to any sort of civilization or sentient construction effort. Lowell, on the other hand, wrote three books on that very subject and captured the imagination of the public with the possibility of alien life. It wasn’t long after that H.G. Wells -the British Roland Emmerich- published War of the Worlds, taking the idea of a Martian civilization to new and London-destroying heights.

Suddenly, beings from the sky no longer seemed as friendly or as inviting. The populace was given images of alien walkers parading through Europe, blowing up landmarks, and generally being rude house guests. In response, the people of Earth were suddenly seeing Martians everywhere. In 1897, Alexander Hamilton, a farmer from Kansas -and not the founding father/rapper- reported the first incident of a UFO cow abduction and mutilation. Hamilton told of witnessing an actual alien craft that took his cows and left them butchered. The story first ran in the local newspaper, but was eventually picked up nationwide. It wasn’t debunked until 80 years later, when an elderly Kansas woman admitted that she had heard Hamilton bragging about how he had made the whole thing up. Yet, the damage had been done. In the popular subconscious, Martians and little green men were on Earth and they had a taste for beef.

On October 30, 1938, Orson Welles -no relation to H.G.- and his Mercury Theatre troupe performed an updated version of War of the Worlds as a fake newscast on the radio. The broadcast began at 8:00 pm, but this being the golden age of radio, most Americans were listening to the ventriloquist Edgar Bergen and his dummy “Charlie McCarthy” on NBC and only turned to CBS at 8:12 pm after the act was over. That means they missed the announcement at the beginning of the show that marked the production as a “fake broadcast.” So, what American listeners found when they switched the channel was what sounded like an extremely convincing emergency newscast. As many as a million Americans believed what they were hearing -obviously forgetting that it was the night before Halloween. Panic broke out, especially in New Jersey, where the first “alien walker” was said to have landed. One woman in Indianapolis was even reported as running into a local church where services were being held and yelling out, “New York has been destroyed! It’s the end of the world! Go home and prepare to die!”

Battle: Los Angeles
The biggest twist in the story of War of the Worlds is that each version -the Wells and the Welles version- was in some way prophetic of events to come. H.G. Wells wrote about a massive war that would engulf Europe and destroy its cities. More than a decade later the planet found itself fighting just such a war, The Great War. Orson Welles’ War of the Worlds told of a foreign enemy invading and wreaking havoc on a peaceful and isolated America. Less than five years later, the United States would be embroiled in World War II after just such an attack at Pearl Harbor. Even Steven Spielberg’s version seemed to predict Tom Cruise’s religious views, but in the end that has always been kind of the point of good science fiction. It often works as a reflection of ourselves and the tensions in our society. Maybe that same concept is also why we are most vulnerable to stories and hoaxes during times of turmoil.

At 3:16 am on the morning of February 25, 1942, the skies over Los Angeles lit up with anti-aircraft fire. When all was said and done, the military had fired more than 1,400 rounds, and eight people were dead, five from falling shrapnel and three from heart attacks. Yet, no aircraft wreckage was ever found and there was no indication that anything had been attacked -other than by falling shrapnel. A picture published by the LA Times showed search beams focused on a pattern of light, possibly emanating from the bottom of some massive craft. This is what became known as the Battle of Los Angeles, and to this day people still claim it was an alien spaceship that triggered the air raid response.

World War II saw an increase in UFO or “Foo Fighter” activity. That was partially because of the stresses of war, partially because air superiority was so important -and everyone was looking up for possible threats- partially because of possible Nazi super-weapons, but mostly because of a time traveling Dave Grohl. However, most Foo Fighters have been explained away over the years and the Battle of Los Angeles is no different. Experts seem to agree that the air raid was triggered by a weather balloon that was sighted by a nervous sky watcher. The massive response was actually understandable. It had only been 79 days since the Japanese attacked Pearl Harbor, and 24 hours since a Japanese submarine had surfaced near Santa Barbara and shelled the oil fields there. The city was on high alert, America was expecting another attack, and it only took one sighting of a balloon to set a match to the tinder. Most experts believe that the balloon probably popped and sank into the Pacific. The famous photo of the alien craft, on the other hand, can be explained by lens flares that had been “touched up” by a photo artist at the Los Angeles Times, a common practice before the invention of Photoshop.

The Day the Earth Stood Still
The sad truth is that Earth has never been invaded by aliens and we will probably never get a chance to use snappy one-liners as we casually defeat them with a computer virus that infects their oddly MS-DOS based computer systems. However, that does not mean we have stopped looking. The modern SETI effort to search for signs of intelligent life elsewhere in the cosmos began in 1959, and NASA would later run its own program. In its time it went through several funding problems and eventually became a private endeavor, but it is still alive and kicking today, most likely manned by a Hawaiian shirt-wearing geek playing office golf while listening to R.E.M. On August 15, 1977, the Big Ear radio observatory at Ohio State University received a 72-second transmission coming from the direction of the constellation Sagittarius. The signal was 30 times more powerful than the average radiation from deep space, and Jerry Ehman, who was reviewing the data printout at the time, circled the anomaly and wrote “Wow!” next to it. This became known as the Wow Signal, but it was never duplicated or found again, and no link to an alien civilization was ever established.

SETI is not the only tool humans are using to look for aliens that can be seduced by Jeff Goldblum’s chest hair. The Kepler spacecraft is a telescope that NASA is using to identify extra-solar planets, and it’s been pretty damn good at its job so far. It has currently identified and confirmed 1,284 extra-solar planets. It also may have inadvertently identified an alien megastructure. In October of 2015, Kepler discovered an odd intermittent signal around the star KIC 8462852. Kepler identifies planets by plotting the dimming and brightening of stars as planets pass in front of them. However, the dimming discovered at KIC 8462852 is irregular and random, and cannot be satisfactorily explained by most known natural celestial bodies. There are still some possibilities, such as a swarm of comets, but the discovery still has most experts asking questions rather than finding answers. Listen, we’re not saying it was aliens, but… Also, don’t strap on your flight suit and 1990’s aviator glasses just yet. The star in question is 1,500 light-years away from the Earth, which means the 8462852ians have a long way to go before they mind control our President and throw Mr. Data across a room.
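For the curious, Kepler’s transit method comes down to simple geometry: a planet crossing the face of its star blocks a fraction of the star’s light roughly equal to the square of the planet-to-star radius ratio. Here is a minimal, illustrative sketch of that math (the radii below are round textbook values, not actual Kepler data):

```python
# Transit method: a planet crossing its star blocks part of the star's
# disk, dimming it by roughly (R_planet / R_star) squared.

def transit_depth(r_planet_km, r_star_km):
    """Fractional dimming caused by a transiting planet."""
    return (r_planet_km / r_star_km) ** 2

R_SUN = 696_000      # solar radius in km (rounded)
R_JUPITER = 69_911   # Jupiter radius in km (rounded)
R_EARTH = 6_371      # Earth radius in km (rounded)

# A Jupiter-sized planet dims a Sun-like star by about 1%...
print(f"Jupiter-like: {transit_depth(R_JUPITER, R_SUN):.4%}")
# ...while an Earth-sized planet dims it by less than 0.01%,
# which is why Kepler needed space-based precision to see it.
print(f"Earth-like:   {transit_depth(R_EARTH, R_SUN):.4%}")
```

An irregular, aperiodic jumble of dips -like the one at KIC 8462852- is exactly what this simple one-planet model cannot produce, which is why that star caused such a stir.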

Still, according to recent findings it is becoming more and more likely that aliens existed, at least at some point in the history of the universe. This comes from astronomer Woodruff Sullivan, who not only wins the Best Name in Astrophysics award, but recently published a paper basically proclaiming that aliens existed… at some point. Don’t get too excited, because he didn’t get visited during the night by little grey men with big eyes. No, he argues this all through math and with the help of the famous Drake Equation. This equation was first created by Dr. Frank Drake as a hypothetical way to determine the odds of extraterrestrial life in the universe. It takes into account things like the average rate of star formation, the number of planetary bodies around stars, the amount of planets that might be able to host life, etc. What Woodruff Sullivan basically claims -do his friends call him ‘Woody’ or ‘Sully’?- is that a lot of these factors are actually becoming known to us through science. With the rate of extra-solar planetary discovery and our ever increasing knowledge and catalogs of stars and the rate at which they form, we are filling in a lot of the factors that Drake himself could only estimate, and the numbers are looking very much in favor of the existence of extraterrestrial life.
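For those playing along at home, the Drake Equation is just a chain of multiplications: N = R* × fp × ne × fl × fi × fc × L. A quick, heavily hedged sketch of the arithmetic (every value below is an illustrative guess for demonstration only, not a number from Drake or from Sullivan’s paper):

```python
# Drake Equation: N = R* * fp * ne * fl * fi * fc * L
# Every input value below is an illustrative guess, not a measured fact.

def drake(r_star, f_planets, n_habitable, f_life, f_intel, f_comm, lifetime):
    """Estimated number of detectable civilizations in the galaxy."""
    return r_star * f_planets * n_habitable * f_life * f_intel * f_comm * lifetime

n = drake(
    r_star=1.5,        # new stars formed per year (roughly measured)
    f_planets=1.0,     # fraction of stars with planets (Kepler suggests ~all)
    n_habitable=0.2,   # habitable planets per star with planets (guess)
    f_life=0.1,        # fraction of those where life arises (pure guess)
    f_intel=0.01,      # fraction where intelligence evolves (pure guess)
    f_comm=0.1,        # fraction that becomes detectable (pure guess)
    lifetime=10_000,   # years a civilization stays detectable (pure guess)
)
print(n)  # with these particular guesses: ~0.3 civilizations at any moment
```

The point is that the first couple of factors, star formation and planet frequency, are now measured rather than guessed, which is exactly the kind of progress Sullivan’s argument leans on.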

Life among the stars is an exciting and scary prospect, and that is kind of the point of all this. Humans have been wondering what might be out there since Giovanni Schiaparelli aimed his telescope at the red planet. The rest has been pure human imagination. You see, the existence of alien life, hostile or friendly, is as much about our own feelings and ideas as it is about any actual science involved. Much like Wells and Welles we project our own ambitions, fears, and motives on what we think alien invaders should be. In 1942 they were the Japanese, and in 1897 they were cattle rustlers, because those were things that we feared during those eras of our history. So aliens may exist in the constellation Sagittarius, or the Wow Signal may have been radioactive comets. Aliens may exist around KIC 8462852, or it may be a swarm of comets -come to think of it, comets kind of explain a lot of things- but even if there is life out there we will probably never meet it. Our aliens are the ones we see in films that like to blow up national monuments, not because they are strategic targets or because of their military value, but simply because Roland Emmerich knows that stories about alien invasions are more about us than about them.

Image courtesy: http://www.huffingtonpost.com/jason-apuzzo/the-time-a-ufo-invaded-lo_b_6749734.html

May 25th is Geek Pride Day. Geeks and nerds have come a long way in the sixty-plus years since the term first entered popular use in the 1950’s. The term nerd was coined by none other than Dr. Seuss in his book If I Ran the Zoo. If you don’t have time to read it, check out the Matt Damon film adaptation. In 1951 Newsweek reported that the term was used in Detroit, Michigan to describe someone as “a drip” or “a square.” Being a nerd in 1951 Detroit meant wedgies, and probably not being able to go to the sock-hop on Friday nights or something like that. Fortunately, it is 2016 and the nerd population -along with its vibrancy and appeal- is soaring, while Detroit, ironically… not so much. Our point: Nerds Rule, but that wasn’t always the case.

Taking Back Nerd Day
A lot of people want to make distinctions between terms like “Geek,” “Nerd,” “Poindexter,” “Dweeb,” and “Anime Fan,” but we here at The NYRD -get it?- never really liked making those kinds of comparisons. Yes, each word can have separate meanings, but when taken together, what you get is a collection of the downtrodden, the kids who played instruments in the marching band, or who doodled math equations in gym class, or screwed up their first ever date because they couldn’t shut up about the intricacies of Star Fleet rank structure. We all have something in common: we didn’t quite fit in, like a rhombus-shaped peg trying to squeeze into a perfectly Euclidean elliptical hole. Words like “Geek,” after all, used to literally mean freak, and maybe that’s why we have always preferred using “Nerd.” According to Wikipedia:

Nerd is a derogatory slang term for a person typically described as socially-impaired, obsessive, or overly intellectual. They may spend inordinate amounts of time on unpopular or obscure activities, pursuits, or interests, which are generally either highly technical, or relating to topics of fiction or fantasy, to the exclusion of more mainstream activities. Other nerdy qualities include physical awkwardness, introversion, quirkiness, and unattractiveness.

If you continue reading the article, it goes on to say that the term “nerd” has been re-appropriated by some as a term of pride and group identity, and we would argue that has gone a long way to changing the perception of our subculture. In the 1950’s people were ashamed to be called nerds, and now nerdy interests rule at the box office, on HBO, and on video game consoles across the world. Some might say that nerds have evolved, moving from the stereotypical glasses-wearing social outcast to Nathan Fillion, but that is not the full story. We have always been intelligent, passionate, and very cool individuals. The perception of nerds, however, has evolved.

It’s all Geek to Me
Potsie Weber was a nerd, and though Happy Days was a sitcom from 1974, it was meant as a reflection of the 1950’s. Warren “Potsie” Weber was the show’s nerd character, and was often called so by others. He was socially awkward, gullible, and credited as not being too bright. He did his best to fit in with Richie and the Fonz, but it never quite worked out. He was often the butt of jokes, but in the end Potsie proved to be a talented musician and smart enough to eventually become a psychiatrist. He may not have worn glasses, but he was a nerd all the same. He was someone who found his talents and interests undervalued, yet he constantly attempted to fit in.

By 1984, and the release of Revenge of the Nerds, the stereotype had been cemented. Glasses, suspenders, pocket-protectors and the works. Nerds were no longer just socially awkward people trying to fit in; they had become full-blown space cases. They were seen as weird and hopeless outsiders living in their own pimply world. It was a formula followed by such notables as Steve Urkel on Family Matters, Milhouse Van Houten on The Simpsons, and other nerdy, awkward, and annoying characters. -Thankfully, these later incarnations left out the “rapey-ness” of Revenge of the Nerds- Other notables of the 1980’s and 1990’s were Samuel “Screech” Powers from Saved By the Bell, and Carlton Banks from The Fresh Prince of Bel-Air. Though they were not adorned with glasses and suspenders, each was a nerd, especially when compared with the popular main characters, and that was always the point. Nerds in the 80’s and 90’s no longer tried to fit in like Potsie did. Instead they lived in a world all their own, one that was strange and a subject of ridicule. They were the butt of every joke. Nerds were side-characters. They were comedic foils for much cooler leads, but then the 21st century arrived.

We do not want to mislead you. There are still plenty of socially clumsy, angry, and annoying nerd characters that arose in the 2000’s and 2010’s, such as The Office‘s Dwight Schrute, and any cast member of The Big Bang Theory -very annoying- but at least we are no longer relegated to being side-jokes. Perhaps because of the success of characters like Urkel, or because actual nerds rose to power within the industry, Hollywood started recognizing the need for more mainstream nerd characters, and more complex ones. Characters like Charles Bartowski from Chuck, or Liz Lemon from 30 Rock, give us the more modern outlook on what a “nerd” has become. In essence, the modern take on the “geek” is a combination of the two older views. The world of nerds is still separate and strange, but no longer a subject of ridicule, and no longer our sole realm of existence. Nerds are still awkward and have odd obsessions, but they are also characters with drive and ambition, as well as real problems and personalities. Hollywood -as well as the rest of the country- has finally realized that being a nerd is more than glasses and pocket protectors.

Nerdstream Media
If you need more proof of the mainstreaming of nerd culture you really need look no further than the article and the website you are currently reading -we’re a pretty big deal- but if that isn’t enough to convince you, then just think of the movies everyone is watching this summer: Captain America, X-Men, Ghostbusters, Batman and Superman. Comic book movies, Star Wars, Star Trek, and Harry Potter. All these franchises have at least one movie coming out in the next year and all are expected to make quite a multi-million dollar splash. They also all have their roots in nerd culture. Granted, you will probably not see your high school quarterback rolling a D20 anytime soon, or the captain of the cheerleading squad shouting in Dothraki, but thanks -in part- to the internet, nerds have never had it better.

The acceptance of nerd culture may have something to do with the general malaise and cynicism of modern times, a reaction to trends that started with grunge and emo music in the 90’s and early 2000’s. Yes, it can be “cool” to sometimes pretend to be disinterested and aloof, but it is also boring. Nerds, on the other hand, are the last people on the planet who are legitimately allowed to get spazzed-out excited over things. First of all, because we -by definition- are not cool, so we do not have to worry about appearing as such. Secondly, we just have so much cooler stuff to get excited over. When was the last time anyone dressed in costume to go see a George Clooney indie film debut? Unless you count hipsters, but they are always in costume. Turn, instead, to the openings of Star Wars, or any Marvel movie, where you get a bevy of irrationally excited movie-goers dressed as everything from Jedi to wizards. Even attendees at Comic Con are no longer only people who would usually claim themselves as “nerds.” Who would have ever thought that things like Game of Thrones, The Walking Dead, or even Star Trek would draw non-nerd fans -we call you normies- but it makes sense. After all, when faced with a choice between a world filled with Kardashians and a world filled with wizards, we would like to think that wizards would win every time.

People just want to get excited over things. Fantasy and imagination are not traits limited to a sub-culture of the population. Everyone has dreams, and hopes, and the desire to see laser guns blow up spaceships, awesome sword fights, or even dragons. Being a nerd is really nothing more than remembering that at one point you were a child. Everyone threw a towel over their shoulders and called it a cape or swung a stick around and called it a lightsaber, but the only difference between a nerd and everyone else is that for us that time was last week. More to the point, being a nerd means being smart, passionate, and/or talented in ways that are not always recognized by the mainstream, and those are all good things. Being quirky makes life interesting, and having a like-minded group of friends -even if others think they are weirdos- is what makes life important. It’s about having fun, being yourself, and most importantly not taking life too seriously.

If you think Donald Trump is a megalomaniac with grandiose ambitions for Presidential power, then you would be right. However, he is not the first man in American history to start a political frenzy over the Presidency. You may only know the name Aaron Burr from an old “Got Milk?” commercial, or simply as the guy who shot Alexander Hamilton, but there was so much more to this complicated, brilliant, and ambitious Founding Father. Unlike Donald Trump, who is usually content to write his name across whatever building he happens to own, Burr proved that he would not be satisfied until his name was written across the face of an entire country.

An Origin Story
Aaron Burr was born in the great metropolis of Newark, New Jersey in 1756. His father, Aaron Burr, Sr., was the second president of the College of New Jersey, or as you might know it these days, Princeton University. His mother was the daughter of Jonathan Edwards -no, not the guy who talks with ghosts- the famous theologian who was a key player in the First Great Awakening. Like most comic book protagonists, Burr found himself orphaned at the age of 2 after both his parents passed away. However, that did not stop him from getting admitted to the College of New Jersey at the age of 13 and graduating with a Bachelor of Arts at 17. He moved to Connecticut to study law, but put that aside when fighting broke out at Lexington and Concord.

Aaron Burr tried to receive an officer’s commission in Washington’s Army, but in a trend that would continue for the rest of his life, George Washington turned him down. So instead, the 19-year-old Burr enlisted with General Benedict Arnold and his Canadian Campaign. He distinguished himself during the Battle of Quebec, and General Richard Montgomery promoted Burr to the rank of Captain. Eventually, he made his way to Manhattan where he earned a place on Washington’s staff. During the retreat from Lower Manhattan to Harlem, it was Burr’s vigilance that saved an entire brigade of troops, including an officer named Alexander Hamilton. Despite everything, though, Washington notably never put in a commendation for his bravery. By some accounts, Washington never trusted Burr. Maybe he saw the budding villainy of the man or maybe he just wasn’t very fond of Aaron Burr’s ferret-like face.

Despite the public slight by Washington, Burr did eventually make Lieutenant Colonel and served with distinction until 1779, when declining health forced him to retire from the Continental Army. He returned to his studies of the law and was admitted to the New York bar in 1782. From there he married Theodosia Bartow Prevost, the widow of a British officer and 10 years his senior, and moved to New York City after the British evacuated it. He had one daughter who survived into adulthood, also named Theodosia. Burr’s wife died in 1794 from stomach cancer. In his private practice the war hero was an accomplished lawyer who commanded substantial fees for his services. By all accounts he was very generous with spending that money on lavish clothing, fine furniture, and other symbols of status and wealth. So naturally, he entered politics.

An Arch Nemesis is Born
Alexander Hamilton was shot and killed in a duel with Aaron Burr on July 11, 1804, presumably after a long-winded monologue about how, “With Hamilton out of the way, the world will finally kneel before Burr.” [citation needed] Yet, as famous as the duel itself has become, it is only the end of the story. Burr first served in the New York Assembly before unseating General Philip Schuyler as the Senator from New York. Incidentally, Schuyler was Alexander Hamilton’s father-in-law, and by some accounts it was that election which drove the first wedge between Hamilton and his would-be assassin.

Yes, because like any good villain, Burr and his arch rival were first friends, or at least acquaintances. They were both from the New York area, and even though they were in opposing political parties they still had a lot in common. So it was only natural that Burr and Hamilton would have been friends, at least until Burr started making some shady deals. In 1799, Burr went to Hamilton and other New York Federalists to get their support for a badly needed water company for Manhattan, but after it was approved Burr changed the charter for the water company to a bank. The more astute of you may notice that a bank has nothing to do with supplying water to a city. Burr founded the Bank of the Manhattan Company, which was later merged into Chase, which is now part of JPMorgan Chase… you know, career super-villains. Worst of all, the false water company scheme delayed the construction of an actual water company for Manhattan, which was suffering from a malaria epidemic at the time… New York problems, right?

Aaron Burr ran for President twice, first in 1796 and then again in 1800. Back then the Electoral College -the group of men that vote for the President- was hand-picked by the State Assemblies. After he lost in 1796, Burr quit the Senate and went back to the New York Assembly. While back in Albany, he began to make himself a key player in New York politics, even converting the infamous Tammany Society from a social club into a political machine. So when it came time for the 1800 elections, Burr had already positioned himself as a political power-broker, not only having a hand in selecting New York’s electoral delegates, but also controlling the political aspirations of New York politicians. As one of the largest of the northern states, New York was key to anyone’s Presidential candidacy. It was basically what Florida or Texas are today, except with less malaria.

The Plot Thickens
Because of his political influence and his successful opposition to Hamilton and the Federalists, Thomas Jefferson knew that he needed the support of Aaron Burr to win the 1800 election. So the two men struck a deal that they would run together on the same “ticket.” The idea was that their new political party, the Democratic-Republicans, would make Jefferson President and Aaron Burr Vice-President, at least that was what Jefferson believed.

In 1800, the electoral delegates were tasked with casting two votes -instead of one as they do today- because the candidate with the most votes became President and the runner-up became Vice President. However, that left a lot of room for confusion. You see, there was no separate vote for President and Vice President. All the votes were for the Presidency, and though it cannot be substantiated by historians, it seems pretty obvious that Burr tried double-crossing old Tommy boy. With the power of the New York electorate and with political influence in many Northern states, Aaron Burr drummed up enough support that the election became a tie between himself and Jefferson. Each man got 73 votes. Even though most people understood that Jefferson was meant to be President and Burr Vice President, the tie still had to be decided by the Federalist-controlled House of Representatives. Most Federalists hated Jefferson. So the assumption was that the House would swing the vote toward Burr, and that is exactly what almost happened.

You need to understand that Hamilton and Jefferson were famously bitter rivals dating back to the Articles of Confederation. It was like Tom and Jerry, but Hamilton still threw his political influence behind Jefferson over Burr, convincing others to vote for Jefferson. Meanwhile, Burr and William Van Ness tried vehemently to turn the election in their favor. It took 36 ballots, but finally the tie was broken: Jefferson was elected President and Burr was made Vice President. After that fiasco, Jefferson -understandably- never fully trusted Burr again and kept him as far away from the Presidential office as possible, presumably because he feared Burr might one day tie him to the railroad tracks -which didn’t exist yet. It was painfully clear that Jefferson would drop Burr as his Vice President during the 1804 election, so instead Aaron Burr tried running for the Governorship of New York. There he was embarrassingly defeated, again because of Hamilton. This was what led to the duel.

‘Kneel Before Burr’
There are varying accounts of the duel, and much like Han and Greedo, no one can seem to agree who shot first, or if Hamilton missed on purpose or was just a lousy shot. What is clear is that after Burr became the only sitting Vice President of the United States to kill a man -that we know of- he became wanted in New York and New Jersey. The duel was fought in Weehawken, NJ because laws were less stringent about shooting people in the Garden State. Once accused, Burr fled to South Carolina, because back then murder charges did not follow you across state lines, but this was not the end of Burr’s villainy.

The accounts differ, but it seems clear that Burr went full super-villain by that time and tried to carve out a little empire for himself in the American midwest/Mexico. He enlisted the help of several prominent conspirators, including General James Wilkinson, the Commander-in-Chief of the US Army, and Andrew “freaking” Jackson. Jackson even allegedly congratulated Burr on “removing Hamilton from the political arena.” The future President and $20 bill mascot, along with the Army’s Commander-in-Chief, pledged support and troops for a “military expedition” that Aaron Burr was planning. The particulars get a little fuzzy, but he basically believed that war with Spain was inevitable and that the US Federal Government could not enforce its jurisdiction past the Appalachian Mountains. Thus, from all accounts it seemed as if the former Vice President had every intention of marching an army into Spanish America and carving out a slice for himself.

Most notably, he expressed a belief that the Mexican people were not suited for democracy, and that it would be best if they were ruled by a king. After saying that he probably winked while pointing toward himself vigorously. Emperor Burr then sent envoys into Mexico to get a feeling for the people’s acceptance of Spanish rule and to whisper “Hail Hydra” to one another as they did it. Basically, Aaron Burr was trying to do exactly what Texas did thirty years later, except with more overtones of “King Aaron” thrown into the mix. There was even talk about taking Baton Rouge and New Orleans away from the United States.

‘Curse You, Jefferson’
Eventually, word of this got back to Thomas Jefferson, who understandably issued a warrant for Burr’s arrest. James Wilkinson then got cold feet and wound up turning on Burr. Jackson was similarly nowhere to be seen when the tides started turning, and Burr was easily arrested in March of 1807. He was brought to trial in front of Chief Justice John Marshall on charges of treason, but despite extreme pressure from the White House, Marshall ruled in favor of Burr, claiming that there was not sufficient evidence to convict him.

After being acquitted and flat broke, Burr fled to Europe where he continued to try and drum up international support and backing for his American Empire idea. He even tried to get a meeting with Napoleon, but the French Emperor would not see him. His only legitimate daughter, Theodosia, then died in the winter of 1812-1813 aboard the schooner Patriot. She was either shipwrecked or killed by pirates, which admittedly are pretty bad-ass ways to go, but Burr was devastated by the loss. He returned to the United States -having been acquitted of that pesky murder charge- and resumed practicing law. He married a rich widow, she divorced him for blowing her money on land speculation, and he died due to complications of a stroke in 1836.

And, that is the story of one of America’s most notorious Founding Fathers. So, just remember, whatever you think of this year’s election, at least Donald Trump hasn’t tried invading Mexico… yet…

It is a story for the ages: You roll a 3, and instead of collecting $200 you land on Boardwalk, with two houses on it. Your friend smiles fiendishly as he tabulates your rent. You’re broke, so you flip the board in anger, just as your father and your grandfather did before you. Monopoly has been a mainstay on the shelves of Americans for generations, but the history of the game is filled with as much intrigue and infuriating rage as the game itself. So before your next foray into land ownership on that colorful board of Parker Brothers, take a ride in a small silver car down the Baltic Avenue of history.

Taking a Chance
Did you know that in World War II the British used Monopoly boards to smuggle maps and escape kits to their POWs trapped behind German lines? The Germans never questioned it or caught on, because even by the start of the war Monopoly had become known worldwide as the iconic mainstay of board games. According to Hasbro more than 250 million copies of the game have been sold across the globe, with games in every major language. The gaming giant also estimated that nearly 500 million people have played Monopoly. However, the origins of the famous game are not as ubiquitous as its distribution, or as ubiquitous as the feeling 500 million people have felt when failing to avoid Park Place for the third time in a row.

The established legend of Monopoly holds that Charles Darrow was the unemployed and nearly broke man who sold the game to the Parker Brothers in 1933 -also known as the Great Depression- for more money than it takes to buy hotels on Boardwalk. Like JK Rowling or JK Simmons it was the kind of rags-to-riches story that helped sell the product and give everyone that warm feeling, which was good because in the 1930s most people couldn’t afford actual blankets. The story goes that Darrow would play the game with friends and one day had one of those friends write down the rules. Then, within months he found himself as rich as Mr. Monopoly himself, who was originally named Uncle Pennybags. However, Darrow’s story -though true- is also as deceptive as that time Todd tried to convince us that he rolled a 13.

The real inventor of the game was Elizabeth Magie, a progressive and brilliant woman who invented the game in 1903. Magie was unlike any other woman of her time. She did not marry until she was 44. She worked as a stenographer and a secretary in the dead letter office in DC. On the side she wrote poetry, short stories, and performed comedic routines onstage. She created Monopoly -originally called the Landlord’s Game- at the turn of the century as a way to educate people about the dangers of monopolies like those held by Rockefeller and Carnegie. She received $500 for the game that she patented, but was largely forgotten in the history and legend of the Darrow story.

Passing Down and Passing Go
The Landlord’s Game was also not exactly like Monopoly. Magie created the game to show the evils of monopolies and excessive greed, because that was how she rolled. Originally, players could buy property before the game began. They did not have to land on the property first to purchase it. Also, there was a second set of rules called the anti-monopolist rules, where players paid their rent to a communal pot. Essentially, the Landlord’s Game was created to promote a very socialist message. It was meant to show that monopolies are terrible. In fact, the “Go” space used to be labeled the “Labor Upon Mother Earth Produces Wages” space, which granted seems a little heavy-handed for a board game that 8-year-olds play. So what happened?

Viral marketing happened, well as much as it could happen in a time before the Internet. The Landlord’s Game circulated around the country and with each new person or group of friends the rules often changed, just a slight bit. You know how when you play Monopoly with friends who did not grow up in the same household as you, they have a strange rule for how to handle “Free Parking,” or you get into that argument of whether you get $500 or $200 if you end your turn on “Go”? That is still a remnant of the original idea that the Landlord’s Game was meant to be changed by each new player in each new city. It even mentions this idea in Magie’s original patent. The rules were not set in stone, which is part of the reason why Darrow and the Parker Brothers were later able to make Monopoly their personal “Community Chest.”

Some of the most notable changes came from the Quaker communities of New Jersey and Pennsylvania, particularly Atlantic City. For instance, the original game had spaces that were named after streets in New York City, with the most expensive being Broadway, Fifth Avenue, and Wall Street. The Quakers took this idea and changed the names to places around Atlantic City. They also put the prices for the property on the board so “Good Quaker Children” would not have to yell or haggle over prices. Originally spaces were auctioned, and we suppose that got a little raucous for mommy and daddy Quakers. They also changed the game pieces. Instead of using standardized colored markers they used little trinkets that they had around the house: hair pins, tie clips, thimbles, and presumably the household dog. This Quaker version of the game is the one that eventually made its way to Charles Darrow and was the forerunner of the game we know today.

Luxury Tax
Perhaps the greatest irony of the game was not that time when Todd had to mortgage his hotels because he trusted the strategy of “how often am I going to land on those railroads anyway.” No, the greatest irony is that the game of Monopoly -or the Landlord’s Game- was created to educate the 99% on the evils of capitalism, whereas the modern game seems to be doing just the opposite. It became more Donald Trump than Bernie Sanders. Darrow and the Parker Brothers made the game more fit for a world that believed they could have it all, every piece, every house, and hotel. In a way Monopoly became more about the American belief that we were all just one “Go” space away from hitting it rich.

These days Elizabeth Magie would probably be appalled at the state of her game. Not only does it no longer teach an anti-monopolist message, but it has become one of the biggest and most recognizable icons of capitalism, associated in almost every way with the thing she was trying to educate people against. Like Coca Cola or McDonalds it has become this quintessentially American brand, the monopoly of board games. In fact, you can even play Monopoly at McDonalds, once a year, by scratching off tickets to win a free piece of processed meat stuffed between two vaguely bread-like objects, all smothered in questionable sauces. You can play it on your computer or even your iPhone. Greed has become the name of the game, almost literally.

The last mention of Magie was on the 1940 Census, where her occupation was listed as “Maker of Games,” and her income was listed as $0.00. Charles Darrow died in 1967. Atlantic City placed a commemorative plaque on The Boardwalk in his honor. As for the game, it has come a long way from its humble beginnings. Now it is more of a brand than ever. Do you like Star Wars, Lord of the Rings, or the NBA? How about Bass Fishing, Sun Maid Raisins, QVC, Blackberry Phones, or even a small British town named Swindon? Those are all Monopoly editions that exist. But hey, isn’t that the lesson that Monopoly teaches us all? You wheel and deal until you make it rich or you flip the board and storm out of the room.

It is that time of the year again; actually, it is that time of the “every four years” again, because this year is a leap year, a year when everybody wakes up on March 1st only to remember that it is actually February 29th. So why is this a thing? Well, the story of our calendar is one of intrigue and murder… Okay, maybe not murder, but there were Romans involved so we’re thinking at least one killing and probably a few orgies, but that’s not what we want to talk about today. Instead, let’s take a closer look at this thing we call a leap year. So step with us into the Quantum Leap Accelerator and vanish… Oh boy.

Leap Back
For almost the entirety of human history, we humans have been obsessed with keeping track of the year. To understand this obsession we need to leap back, and unlike Dr. Samuel Beckett, we need to go a little farther back than any one of our single lifetimes, or even that of our parents or great grandparents. Our leap takes us back to the dawn of human history when our ancestors needed to keep track of the seasons in order to survive. Winter meant cold, Summer meant hot, and Spring meant it was time for… well you know. As early as 9000 BCE, humans were using notches on wood and bones to keep track of lunar phases in order to correctly count out the year. It became even more important to keep track of these things when we moved from a hunter-gatherer species to an agrarian one. We needed a way to know when to plant and when to harvest, and that is when things got trickier.

Keeping track of the seasons by how many full moons you see is fine, but imprecise. Even the concept of a “day” is hard to measure, as the amounts of sunlight and darkness vary from place to place and day to day. It took ancient humans a while to figure out that you need to calculate the length of a day from high noon to high noon. That is why ancient calendars often varied from region to region. As you might imagine winters are a lot longer in Siberia than they are in Greece. So what does all this have to do with a leap year? Well, according to Ziggy, the ancient Egyptians were among the first to calculate the 365-day length of a year, and among the first to realize that we needed a leap year in order to keep us on track.

You see, the solar year or tropical year is actually 365.2422 days long. As you can imagine it can be hard to account for that extra -almost- quarter of a day. Some ancient societies like ancient Rome and China originally adopted lunar calendars, which meant that each month was 29.5 days long, but that meant the full year came up 11 days short. Other civilizations, like the Sumerians, just divided their calendar into 12 months of 30 days and were done with it. That was problematic too, because if you are good at math you might notice that only amounts to 360 days, and even Al can tell you that is about a week short. Now, our ancestors were aware of this, and some civilizations often declared week-long holiday festivals or other extra-calendar activities to try and keep the year on track, but it was sometimes messy, and we’re not talking about the feasts themselves.

Leaping on a String
By the time Julius Caesar came to power Rome’s calendar was off by about 4 months. September was in summer, February was in the fall, and so the Romans found themselves leaping from year to year, striving to put right what once went wrong, and hoping each time that their next year would be the leap home… or something like that. Thankfully, Caesar was walking like an Egyptian with Cleopatra and observed their 365-day calendar, and realized its potential to fix Rome’s own quantum-related problems. First, he had to fix the time lag, so in 46 BCE he decreed the Year of Confusion, a 445-day long year meant to get the Roman calendar back on track. He then changed the calendar to a 365-day calendar -conveniently giving himself a month in the process- and every four years he declared it to be a leap year to account for that discrepancy of almost a quarter of a day.

Now we say “almost,” because if you remember it is only .2422 of a full day. So adding a full day every 4 years actually adds too much time onto the calendar. That’s why 128 years later, the Romans and everyone else who were living by the Julian Calendar found themselves off the solar calendar by an extra day’s worth of time. Leaping forward to the 16th century, this discrepancy had caused important Christian holidays to slide forward by ten days or so, and Pope Gregory XIII decided he wasn’t going to be having any more of that. So in 1582 he unveiled his Gregorian Calendar. First he cut the month of October short by 10 days, to fix the immediate problem, because screw October. Then the Pope decreed that every 100 years would not be a leap year. So there was no leap year in 1700, 1800, or 1900, but we did have one in 2000. Here is where it gets complicated, because century years that are divisible by 400 -we are not kidding- do not skip their leap year. If you think that fixed the problem completely, then hold onto your Pope hat because we need to leap again.
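All of that papal arithmetic boils down to a few lines of code. Here is a minimal Python sketch (ours, not anything official) of the Gregorian leap-year rule and of the Julian drift math described above:

```python
def is_leap_year(year: int) -> bool:
    """Gregorian rule: every 4th year leaps, except century years,
    unless the century year is divisible by 400."""
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

# The years mentioned above: no leap day in 1700, 1800, or 1900, but 2000 had one.
print([is_leap_year(y) for y in (1700, 1800, 1900, 2000)])
# [False, False, False, True]

# Julian drift: adding 0.25 days a year against a 365.2422-day solar year
# over-counts by about 0.0078 days annually, or one full day every ~128 years.
print(round(1 / (0.25 - 0.2422)))  # 128
```

Averaged over the full 400-year cycle, this rule gives 365 + 97/400 = 365.2425 days per year, still a hair longer than the solar year, which is why a small discrepancy remains.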

This time we are leaping to the future. The Gregorian Calendar isn’t perfect. Factor in that the Earth’s rotation is actually slowing down, which is part of the reason why we arbitrarily add leap seconds to the clock, and you get a system of telling time that is only ever going to be “good enough.” In the future, our descendants may choose to change the calendar and the leap year tradition, because in about 10,000 years the remaining discrepancies will start to show through again, but who knows. Maybe by that time we will need to find a new solar calendar that accounts for the orbits and rotations of many worlds and moon colonies, or we may all be dead. For now the Gregorian Calendar is the best we have. After all, the calendar year is merely a human construct meant to try and keep track of something that does not work by our clocks or calendars, and yet that fact has not stopped people from letting their imaginations run wild with the possibilities of the leap year.

The Leap Home
There is a lot of superstition and frustration surrounding a leap year. For instance, being born on February 29th in a leap year is confusing. You either celebrate your birthday every four years or you have to do it on a calendar date that isn’t your actual date of birth, and other traditions have taken the day further. February 29th is sometimes associated with the day that women propose marriage to men, because for most of history women taking charge was crazy talk. Other people believed that the leap year ruins the natural cycle of things, and such superstitions arose as the Scottish saying, “Leap year was ne’er a good sheep year.” Greeks thought that making contracts or getting married on a leap year doomed the union to failure, which may explain some of their current relationship problems with the rest of Europe. Some notable things about February 29th are that Superman was fictionally born on that day, and Hattie McDaniel famously accepted the first Oscar awarded to an African American. Also, in 1504 Christopher Columbus used a lunar eclipse on February 29th to scare a population of local natives into giving his men supplies and food. So, you know, it’s a mixed bag sort of day.

However, there is possibly no other day that lives in so much infamy and awe in our collective imaginations. A leap year is not something we see every day, but regardless of our superstitions or superhero birthdays, we need February 29th. Without a leap year we would all still be trapped in the past, facing mirror images of seasons that are not our own. We have come a long way in our calendar and who knows what the future holds. We may never be fully rid of our own little quantum leap.

With the return of The X-Files we at The NYRD have had some strange notions lately. Maybe the world is run by a secret Illuminati bent on controlling our lives and dictating the fashion trends of our tinfoil hats, or then again maybe we are just succumbing to the very human need to see conspiracies where none exist. Either way we decided that it was time to delve into one of the biggest tinfoil hat theories in American history. It famously involves the assassination of a well-known and beloved President of the United States who served during a time of crisis… You’ve guessed it, we’re talking about Zachary Taylor.

The First Presidential Assassination?
Zachary Taylor was only seventeen months into his first term as President when he died of what doctors concluded was a stomach-related illness, but because we are humans and conspiracy theories are not a modern invention, there were all sorts of rumors that Taylor had been poisoned with arsenic, as his symptoms were very similar to arsenic poisoning. So who would kill the President? Why might he have died? Wait, are you talking about that kid on Home Improvement? We know you are asking yourself these questions and more. So in order to tackle your Tim Allen and non-Tim Allen related inquiries we believe it is best to start at the beginning.

Taylor was an army man. He fought in the War of 1812, but won his fame during the Mexican-American War. He became a national hero, known for being a great leader and an inspiring man. The Whig Party eventually persuaded Taylor to capitalize on his popularity and run for President in 1848. By all appearances Zachary Taylor had no interest in politics or being President. He won because of his popularity as a war hero, but he spent most of his time ignoring Congress and avoiding his cabinet, like Todd when he refuses to put his dishes away after lunch. In a time when the United States was becoming increasingly polarized over the question of slavery, the lack of vocal and political support for slavery from the President -who was himself a slave-owning Southerner- became increasingly frustrating for the South, maybe even enough to commit… dum dum dum… Murder.

Who Done it?
To further compound the problem for Southern slave owners, when Taylor did find the motivation to be Presidential his policies seemed to favor abolitionists and anti-slavery proponents more and more. You see, Zachary Taylor did not support the idea of allowing the right to own slaves to expand into the western territories, a move which would eventually begin to erode the power of the pro-slavery voting bloc in the House and the Senate. He also died a few weeks before he was set to veto several pro-slavery bills that had been presented to him by Congress. To add further credence to the conspiracy theory, Taylor’s death opened the door for Vice-President Millard Fillmore, a pro-slavery figure, who very promptly passed the Compromise of 1850 as one of his first acts in office. So is this all coincidence, or something more sinister?

One of the problems is that pinpointing who committed such a crime has its own difficulties. Remember, back in those days there was no such thing as the Secret Service or even the FBI. The White House was a government building, and Mrs. Taylor often reported that she would find strange people wandering the halls and even in her own bedroom. The President of the United States was very accessible, as was the food he ate. Taylor contracted cholera morbus, a 19th century term that commonly meant, “We have no idea what killed him.” His sickness coincided with a long day of celebration and public meetings on the 4th of July in 1850, and the list of people he had contact with that day was a long one. Yet, that did not stop people from speculating, loudly and publicly.

At the time of Zachary Taylor’s death, the Baltimore Sun really got into the conspiracy vibe of things. They were like the CNN of their day. In all fairness, the newspaper was not the only one putting forth the poisoning theory, but they did go so far as to name names. Prominently, The Sun suggested that the death could have been the work of two men, Robert Toombs and Alexander Hamilton Stephens. Both men were Georgia Congressmen and were called “Southern Ultrists” at the time. It was reported that they threatened to vote for Taylor’s “censure” if he did not support the South. To be fair to the Sun -they didn’t know it at the time- Toombs would eventually become the Confederacy’s Secretary of State and Stephens its Vice-President. So in retrospect, they may have been pretty “ultrist,” but the real question is, were they extreme enough to censure Taylor… dum dum dum… permanently?

The Verdict
Enter Clara Rising, a retired University of Florida humanities professor and author who became interested in Zachary Taylor and his death in the mid-1980s. In 1991 she even went so far as to convince Taylor’s descendants and the US Government to exhume the dead body of the twelfth President so it could be subjected to modern -well, 1990s- laboratory testing. Various dental, bone, and hair samples were sent to three different facilities for that purpose. The final result came back that Zachary Taylor died of… drum roll… a “myriad of natural diseases which would have produced the symptoms of gastroenteritis.” It was ruled a death by natural causes. The remains did yield trace amounts of arsenic, but the medical examiner felt assured that the levels were too low to have caused his death. Even more reassuringly, he explained that apparently all human beings have between 0.2 and 0.6 ppm of arsenic in their systems at any given time… so, yay?

Many people in Taylor’s day were convinced that the popular, reluctant, and possibly anti-slavery President was murdered in order to pave the way for a Southern power grab. Republicans, especially, subscribed to the idea, as they were the party of most Northern abolitionists, which also shows you how much things have really changed since the days of Zachary Taylor. Many believed that there were other targets, like Andrew Jackson, James Buchanan, and William Henry Harrison. Yet, in perhaps the greatest irony of this whole story, when Abraham Lincoln first took office he received hundreds of letters warning him that he was going to be assassinated, just like Taylor. Of course those letters warned against eating suspicious food and nothing about celebrities wielding guns.

So the next time a friend espouses a rumor to you about 9/11 or tells you how Kennedy was killed by hitmen hired by his own dog, Pushinka -she was Russian after all- just smile and nod and know that conspiracy theories have existed as long as humans. Maybe, in a way, the paranoia of people in the mid-1800s was justified. The Civil War was only years away and tensions were growing in all parts of the country. To many, assassination may not have seemed like a big leap, and in some ways it is more comforting to believe that coincidence and bad luck are the work of secretive and powerful forces -whether they be big business, the government, or the Southern aristocracy- rather than just random chance. Conspiracies are a way we humans try to claim some agency over our chaotic world. The truth is often a lot scarier; sometimes bad things just happen. That is true whether it be car accidents, a deranged man with a gun, or just a bad bowl of cherries on a hot 4th of July in 1850.

Second Amendment

There is a little document that a lot of Americans really enjoy fighting over, and for once we’re not talking about the draft sheet for your fantasy football league. The United States Bill of Rights comprises the first ten amendments added to the US Constitution after its ratification, and much like the Bible or a Quentin Tarantino movie people try to use it as justification for doing almost anything. Now, like all good Americans we have all 10 amendments memorized -okay maybe only like 4 of them- but we all have our favorites. For example, we know that Todd particularly enjoys the Third Amendment, because every year during the Memorial Day parade, when members of the military band ask if they can use our bathroom, he screams “stop violating my civil rights,” and slams the door. Others out there may enjoy the First Amendment or the Sixth Amendment; however, most people these days are doing a lot of talking about the Second Amendment. So we thought it might be good to get a little historical context on what the Second Amendment was and how it has shaped the national debate currently going on around us.

Our Forefathers Can Beat-Up Your Forefathers
The full text of the Second Amendment reads, “A well regulated Militia, being necessary to the security of a free State, the right of the people to keep and bear Arms, shall not be infringed.” Odd capitalization aside, we often find ourselves discussing the second part of that sentence but forgetting the first part. The ambiguity of the sentence has led to more than a few arguments. It is just another thing we can blame on our Founding Fathers, because the argument we are having today still echoes the argument they had more than 200 years ago.

Before the Constitution was ever ratified the men who created our nation found themselves divided into two camps, Federalists and Anti-Federalists. Federalists essentially favored the Constitution and a stronger central government while Anti-Federalists favored stronger rights for the States. Suffice it to say, the Federalists won in the end, but not without a few compromises, and the biggest contention was over the right of the new US Government to raise a standing army. Federalists argued that a standing regular army was needed to protect the interests of the nation. The Anti-Federalists believed that a standing army, loyal to the government, was the first step toward tyranny. They resolved the debate by granting Congress the ability to raise an army, but allowing it to fund that army only two years at a time. However, the second and more crucial safeguard against the oppression of a Federal army was the idea of militias.

Local militias were something the colonists were very familiar with. Colonial militias had existed for years and had fought with mixed success in the American Revolution, but State and local militias in colonial times were a lot more than what the National Guard is today. They also served as the nation’s first paid police force. Aside from elected Sheriffs, militia men were tasked with bringing dangerous criminals to justice. So when the Bill of Rights was finally written in 1789, one of the first amendments passed by the House and Senate was for the establishment of State militias as a check against the existence of the Federal army and as a lawful body to help keep local peace. That makes sense, because at the time our Founding Fathers were more preoccupied with States’ rights versus Federal rights than with whether people could own guns.

The NaRrAtion of the Law
Even the original wording of James Madison, who wrote the Bill of Rights, seems to be more focused on the military aspect rather than a private citizen’s “right to bear arms.” Before it was altered by the Senate the amendment originally read, “A well regulated Militia, composed of the body of the people, being the best security of a free state, the right of the people to keep and bear arms shall not be infringed; but no person religiously scrupulous shall be compelled to bear arms.” Madison even included a clause for conscientious objectors, which again points to the fact that the amendment was more focused on the military aspect rather than gun ownership. However, we feel compelled to point out that the Founding Fathers may not have specifically pointed to the “right to bear arms,” but there is also evidence that in their day it was considered a natural and normal right, and they may not have felt the need to codify it more than they already did. Thus, the “right to bear arms” part cannot be completely negated.

Going forward, two narratives emerged concerning the Second Amendment. Up until about the 1970s and 1980s, the narrative of “collective rights,” or “states’ rights,” dominated the political and judicial thinking on the Second Amendment. This narrative argued that the amendment only protects gun ownership by the States, and not by individual private citizens. Basically, it argues that the Second Amendment is meant to be interpreted as the Constitutional right of each State to establish National Guards that are controlled and armed by State officials. Fifty years ago, no one was having a debate about the right to gun ownership. Then in the late 20th century the narrative suddenly shifted to favor what is known as the “standard model,” which argued that the amendment was meant to be dominated by the second part of the sentence, in that it really grants individual citizens absolute rights to own and keep arms. This narrative became popular around 1977, when a little known organization called the National Rifle Association went from being a group that promoted gun safety to a group that suddenly began to lobby for gun ownership.

It is worth noting that even when the NRA started proclaiming that the Second Amendment was about the “right to bear arms,” the conservative Supreme Court Chief Justice Warren E. Burger openly mocked the idea as “one of the greatest pieces of fraud on the American people.” He thought it was a laughable interpretation. Yet, the NRA kept pushing, and their new narrative was bolstered by the election of Ronald Reagan, a pro-gun rights President, and by the gun manufacturers themselves, who gave large sums to make sure that the people in Congress got behind it too. Still, it was not until 2001, in the Fifth Circuit Court of Appeals case of United States v. Emerson, that any judge even voiced acceptance of the idea that the Second Amendment protected the rights of individual gun owners. Even then, the opinion was not legally backed until 2008 in the case of District of Columbia v. Heller, when Justice Antonin Scalia, writing for the majority, ruled that the government did not have a right to infringe on the ownership of handguns.

An Infallible Right
In 2011, gun manufacturers made 4.3 billion dollars, thanks in no small part to the new interpretation that the American public had come to accept about the Second Amendment. Suddenly, it was American to own a gun and un-American to want to regulate guns, and the manufacturers had a vested interest in keeping it that way. Yet, even during the entire period when the majority of Americans accepted the idea that the Second Amendment was about regulating militias, gun ownership was not illegal. By changing the dialogue and making gun ownership a right -on par with free speech and religion- gun ownership went from “not being illegal” to “protected by the law,” and those are two very different things. Gun ownership suddenly became so sacred as to be untouchable, but we feel compelled to point out that no other right granted by the Constitution enjoys such unfettered legality.

George Washington famously said, “Individuals entering into society, must give up a share of liberty to preserve the rest.” In other words, yes, we have rights as citizens, but we have to understand that when those “rights” interfere with the lives and rights of other citizens then we have to recognize the need for moderation. Thus, it is illegal to yell “fire” in a crowded theater, or to say “bomb” on a plane when there isn’t one, because those are not examples of free speech. They only serve to put others at risk. We have laws limiting or mitigating the effects of almost every amendment in the Bill of Rights, so why is it suddenly so unfathomable to have laws limiting gun ownership, regardless of whether the Second Amendment was meant to refer to that specific right or not?

A lot of the problem goes back to the way the amendment was worded. People who favor the individual gun ownership model argue that the first part of the sentence, "A well regulated Militia, being necessary to the security of a free State," is meant as a justifying preamble to the second half, "the right of the people to keep and bear Arms, shall not be infringed." That could be true, but it is worth mentioning that no other amendment in the Bill of Rights starts with a justifying preamble. Everything else just states what it means to say without beating around the Constitutional bush. Still, it is worth remembering that the words are in there, and we know that the Founding Fathers did see gun ownership as a natural part of life. Yet, to them guns were single-shot muskets that required a full minute of reload time, and that is also worth remembering.

The Right to Bear History
Times change, opinions change, and laws have to change with them. It is ironic how worried our Founding Fathers were about the existence of a standing United States Army, and yet there is not a single person today who still argues over whether the USA should have a standing army. Even more ironic, Federalists like Washington, Adams, and Hamilton did not want a Bill of Rights at all, believing that the Constitution was enough to guarantee the freedoms of the people. They believed that codifying what they saw as the natural rights of man would ultimately make those rights "colorable" and open to being misinterpreted and used for demagoguery, kind of like exactly what is happening today with the Second Amendment.

Lastly, our Founding Fathers were men who fought, argued, and bickered, same as we do today. They compromised and struggled. They were not divine beings who granted us a document from the almighty. They were not perfect, and you do not need to look any further for proof than their Three-Fifths Compromise. They also could not predict a future of drones, tanks, or assault weapons, and that is why they made the Constitution a living document, one that could change with the times and be amended. They knew that future generations would face new problems and need to find new solutions. So, regardless of how they, or the NRA, or you, or this website chooses to interpret the Second Amendment, it is also worth remembering that all those famous historical founders who stare at us from the fronts of money entrusted us to make laws and interpret them to fit today's world, and not the world of the single-shot musket.

Donald Trump is a cartoon character, but with all the Animaniacs-esque craziness that goes on in the United States' primaries it is sometimes hard to remember that we didn't always nominate our leaders this way. The modern primary system did not fully form until 1972, which means that Bugs Bunny is older than our current primary election cycle. In fact, our modern election system is only about two years older than Disco. That is not to say that primaries and the National Convention system did not exist before Porky's speech impediment -they did- but like with most things in history the story of our nomination system is neither straightforward nor any less looney than a cross-dressing rabbit.

A Party to History
As with most things in American history we can start by laying the blame on George Washington, mostly because he won election to the highest office faster than Speedy Gonzales running toward -what we can only assume is- some sort of highly racist taco stand. It was only after Washington selfishly refused to be our dictator in chief by turning down a third term that our election process became a big game of "Duck Season/Rabbit Season." John Adams and Thomas Jefferson -history's greatest frenemies- were next up for the position, and that was when things got interesting.

That election basically kicked off the two-party system that we all know and love today. Jefferson was a Democratic-Republican, and Adams was a Federalist. Their respective parties nominated them through a Congressional Caucus, which basically meant everyone in Congress picked who they liked best to go up for election. It is kind of like how American high schools today pick their prom king and queen. Jefferson and Adams were basically selected by their respective parties because they were Jefferson and Adams. -Being a Founding Father goes a long way on a resume- The election was won by Adams, with Jefferson as runner-up, which back in those days meant he got to be the Vice-President. It is also worth mentioning that the campaign got fairly heated, with the Federalists at one point trying to link Jefferson to the violence of the French Revolution. So, if you think that the hyperbole and outright lies of today's elections are a modern addition to our electoral process, you can happily disabuse yourself of that notion.

Presidential nominations were conducted in Congress until 1832. After that, increasing social pressure created the beginnings of the National Convention system that we have today. More and more, the common people wanted a hand in picking their party's candidate for President. Don't be fooled though, because the National Convention system was neither fair nor binding. Holding nominations at a National Convention gave tremendous power to state party bosses. Basically, each state controlled its convention delegates, and if those delegates did not vote the way the party boss wanted them to, they could lose their jobs. Thus, all the convention system did was move the power of nomination from Congress to a select few powerful state-level figures. That basically meant nominations for party candidates were literally made in cartoonishly smoke-filled backrooms.

Roosevelt Gets Bully
This is the way the system would probably still work today if it hadn't eventually run directly opposite to one of America's biggest, brashest, and widest-grinningest Presidents ever to shoot down a Kodiak bear from the Oval Office's windows, Theodore "Iron Gut" Roosevelt. In 1912, Old Teddy decided to launch a comeback against his successor, William Howard Taft. Back in 1901, Florida -because of course- had become one of the first states to pass a law calling for a Presidential nomination preference. Florida and the states that followed basically said that whichever candidate won the majority of a state party's primary votes was the candidate that the state's delegates had to support at the National Convention. Unfortunately, by 1912 most states had not yet started holding primaries, and though Roosevelt won more primaries and delegates than Taft, the nomination still went to the incumbent President. However, this did highlight the importance of presidential primary laws.

In fact, Woodrow Wilson -who beat Taft- called for a national primary law in 1913. Unfortunately, much like a Wile E. Coyote plan, this looked better on paper than in practice. Despite the fact that most states eventually adopted primary election laws, not many actually held primaries, mostly due to the cost associated with holding them. Also, many of the laws were barely binding, and state-level and national-level political bosses continued to ignore results and nominate whomever they wished regardless of primary election votes. It was common for "serious" candidates to enter just one or two primaries in the country to "gauge" their popularity, not because it helped them get nominated. In fact, there was even an odd loophole where state-level political figures like Governors could enter their own state's primary and get elected as that state's nominee, just so they could go to the National Convention and become one of the power players who got to decide who the party's nomination actually went to.

In 1952, Democratic Senator Estes Kefauver won 64% of the votes cast in the -then- 16 states that held primaries and still lost the Democratic nomination to Adlai Stevenson. Stevenson, on the other hand, won less than 2% of all primary votes. In all fairness though, whoever won the nomination was destined to be torn to shreds in the Tasmanian Devil-like whirlwind that was the "I Like Ike" tornado. As a side note, that was also the first presidential election to use TV advertisements, though they were a little different from the ones we know today.

The System Gets Humphrey-ed
Everything came to a head at the 1968 Democratic National Convention in Chicago. To say that the convention turned into a war-zone may be a bit of an overstatement, but only slightly, and only because making such a statement would mean drawing a direct comparison with the actual war-zone on the other side of the world in Vietnam. Basically, tensions were running high. Chicago at the time was probably not the best place to hold the convention. The weather was sweltering, the cab drivers' union was on strike, the entire city was on edge, and the front entrances had to be bullet-proofed for fear of violence. Police, Secret Service, and the National Guard were all on standby, and the convention center was ringed with barbed wire fencing. It did not help that Martin Luther King Jr. and Robert Kennedy had both been assassinated within the previous four months, and more than 100 cities were suffering from race-related and anti-war rioting.

It was in this atmosphere that Hubert H. Humphrey, Lyndon Johnson's Vice-President, beat out Senator Eugene McCarthy for the nomination. The problem with this nomination was two-fold. Humphrey had chosen to sit out the primaries and thus had received almost no primary votes, and secondly, he was a proponent of LBJ's Vietnam War agenda. He was literally the pro-war candidate. McCarthy, on the other hand, was a strong populist candidate who did well in the primaries and was actively anti-war. When Humphrey won, it seemed like a betrayal of the democratic process and the trust of the people.

Meanwhile, the convention center was surrounded by protesters, everyone from hippies to civil rights activists to middle-income Americans. They were all looking for change in a world that must have felt like it was falling apart around them. 10,000 demonstrators were met by 23,000 police and National Guardsmen. Security was on such high alert that at one point Dan Rather got roughed up by police while trying to interview a Georgia delegate. Violence was inevitable, and on August 28 hundreds were seriously injured in a massive riot; and not just protesters and police, but news reporters, political volunteers, legislative aides, and countless bystanders who got caught up in the mayhem. After the convention there was a massive outcry for a change to the primary electoral system.

Here He Comes, Mr. America
Starting in the 1972 election cycle, the states and both parties enacted the reforms of the McGovern-Fraser Commission on primary election reform. The rules made primaries easier to participate in and did away with rules like "winner-takes-all" delegates. The change essentially made primary elections the established way to pick nominees for President, and it is the system we still use today. Unfortunately, it has also led to the two-year-long beauty pageant that is the modern primary election.

The way the current system functions means that candidates have to announce their intentions of running years in advance and start securing delegates in places like Iowa and New Hampshire. In the past, candidates had the luxury of waiting to announce their candidacy, sometimes until less than a year before the actual Presidential Election. In fact, in the days of Lincoln it was considered immodest to campaign for a nomination at all, or even for the Presidency. William McKinley literally stayed home during his Presidential race and gave speeches from his front porch. The new system also gives greater weight to certain states over others, which is why you constantly have states trying to move their primary elections ahead of others to gain greater attention from candidates.

Regardless of whether you agree with the old system or the new, you have to admit that our nominating process has never been perfect. In fact, looking back, it has always seemed more like the plot of a Warner Bros. cartoon than any actual serious political discourse, but it is the best we have so far. The fun thing about the primary election cycle is that it is not in the Constitution. The original framers never envisioned it, and thus it keeps changing to match the times. So in fifty years there is no telling what sort of new provisions may come about. Who knows, maybe one day we'll be seeing the Presidential Physical Challenge.

Th-th-th-that’s all folks…

Once upon a midday dream, while we pondered Halloween,
Over many a quaint and curious website of digital lore,
While we searched, nearly napping, on the keyboard, always tapping,
And then some one gently rapping, dropping beats and rhymes galore.
“Stop this poetic bore,” twas muttered, “and write your journalist’s chore-
Only prose, and nothing more!”

Edgar Allan Poe’s classic, The Raven, is the right kind of story to set the mood for the coming holiday, especially if read by Homer Simpson. -It is unfortunate that our boss threw a flaming jack-o-lantern through the idea of doing an entire article in that style- However, old Edgar was neither the first nor the last to give us the creeps, because being scared is part of being human. It gets our adrenaline pumping and helps us feel alive. Heck, we have a whole holiday dedicated to it, but the origins of Halloween are not as straightforward or even as scary as you might think.

A Nightmare Before Christ
Most historians seem to agree that the origins of modern day Halloween can be traced back to the festival of Samhain, pronounced “sah-win.” This pre-Christian pagan ritual took place on November 1st among Celtic tribes and communities. Literally translated, the Gaelic word means “summer’s end.” The full traditions and practices of the festival are not fully detailed in any written historic records, but we do know a few things about the ancient autumnal holiday. It was communal, and it was a time when the Celtic people gathered to commemorate the end of summer and -like Ned Stark- prepare for the coming winter. The ancient Irish and Scottish literally celebrated it like summer’s funeral.

To them winter was a time when the land was dead, and Samhain was the beginning of that death. So to the ancient Celts the night before, October 31st, was a time when the veil between life and death was at its thinnest, as the world transitioned from one state to the next. It was believed that during the night ghosts and spirits would walk the world. The people left out offerings for those spirits on their doorsteps. If anyone stepped outside their door they had to go masked, disguised as a ghost, so that none of the real ghosts would recognize them or make fun of them, presumably. Samhain was also celebrated with bonfires and other activities.

The Catholic church at the time was always in the game of supplanting pagan holidays with its own -which is why Christmas takes place in December and not the spring. So under the direction of Pope Gregory III the church declared that November 1st was All Saints Day, or All Hallows Day. That meant October 31st became All Saints Eve, or as it was more popularly known, All Hallows Eve. Because the human tongue is lazier than Garfield on a Monday, over time we shortened the name -like OMG, what do you mean? WTF- So the festival became known as Hallowe’en. The holiday was a hit throughout England and Ireland, but it would take a while to make its way to the New World.

Frankenstein’s Holiday
The original colonies were founded by the stoically overworked Puritans, who weren’t really into all this pagan nonsense about ghosts and spirits. Yet, as more and more people came over to the colonies the holiday became sporadically celebrated, but only through plays, dancing, or fortune telling. It wasn’t until the mid-19th century, when the potato famine drove thousands of Irish immigrants to the shores of the United States, that the holiday really began to take hold in American culture.

The Irish, longing for the traditions of home, celebrated Halloween as a way of reconnecting with their Celtic roots. Traditions became modified in the melting pot of America and changed for practicality’s sake. For example, a lot of Halloween symbols we know today, such as spiders, black cats, and bats, came from American ideas about witches and pagans. The Celtic bonfires of old became contained to single candles within pumpkins. In fact, the carving of jack-o-lanterns also changed. In Ireland people carved potatoes or turnips. Pumpkins aren’t native to the British Isles, but thanks to the Pilgrims they are the squash of choice for the American fall season -just ask any barista at Starbucks.

The figure of Jack O’Lantern himself also entered into American lore and became a big part of the holiday, mostly through retold tales, superstitions, and Tim Burton movies. As the story goes, a figure named Stingy Jack tricked the devil several times and made him promise not to claim his soul for hell after he died. However, old Lucifer got the last laugh, because Jack wasn’t allowed into heaven and the devil wouldn’t take him to hell, so he was banished to wander the Earth. The Irish began referring to the figure as Jack of the Lanterns, and -again because the human tongue is an orange lasagna loving cat- it became Jack O’Lantern. The Irish and Scottish created turnip jack-o-lanterns to put in their windows on All Hallows Eve in order to scare Stingy Jack away from entering their houses, and rifling through their silverware drawers.

Trick or treating became a combination of pagan and Catholic traditions. “Guising,” or “souling,” was when people would go from house to house on All Souls Day, November 2nd, offering to pray for the residents’ deceased loved ones. In exchange the homeowners would offer food or bread. However, for the Irish immigrants of the 19th century trick or treating was a lot more about the “tricking” than the “treating.” Quite frankly, we can understand that; we’ve seen Gangs of New York. If you were treated the way many Irish immigrants were treated, you would probably want to egg a few houses too. Still, Irish hi-jinks can only last for so long before the 1% wants in on the show, and that’s exactly what happened.

The Great Gatsby Pumpkin
It all started back in the roaring 20’s, when Halloween parties became all the rage in high society. People with names like Rockefeller and Vanderbilt would dress up for a night of debauchery -which we can only assume included fast cars, loads of booze, and the secretly tortured soul of a young millionaire just longing to be loved by a single woman. Unfortunately, on the lower-rent side of the cities, Halloween vandalism and property damage became a real problem. Cities like Los Angeles had to hire thousands of extra cops just to try and catch holiday pranksters. The situation only got more dire during World War II, when Halloween tricks were no longer seen as kids being kids. Because of the scarcity of wartime resources, the property damage came to be seen as an un-American affront to the war effort.

Towns did almost everything they could to downplay and discourage Halloween. Truman even tried declaring October 31st to be “Youth Honor Day,” but it didn’t fly with Congress -because even back then Congress was still Congress. Towns literally abolished the holiday, and national pleas were made to keep kids home on Halloween. Cities handed out free movie tickets, donuts, popcorn, and anything else they could think of to keep kids from engaging in pranks, but it didn’t work. Kids still soaped windows, let air out of tires, rang doorbells at all hours of the night, and engaged in pretty much any classic prank you can think of and more. Even after the war ended and America was ready to return to festivities, Halloween still took a while to move from nuisance to celebration.

In the late 1940’s the media and local governments decided to try and change Halloween by putting more emphasis on the “treats” instead of the “tricks.” However, many residents were still appalled at the fact that kids now came begging for candy or money. There were even reports of hostile residents, with one woman in Miami in 1950 handing out red-hot coins to children -because even back then Florida was still Florida. Police in North Carolina tried handing out 5,000 packages of cookies to kids to dissuade them from knocking on homeowners’ doors. However, those early attitudes would soon change thanks -in no small part- to a massive advertising campaign by the Mars Candy Company and other corporate outlets, including television and cartoons. By the late 1950’s Halloween was no longer seen as kids begging, but as a fun holiday that every child deserved to take part in.

The Treehouse of Hornswoggling
By 1958 Halloween was a booming industry, quite literally. The baby boomers were growing up, and the new middle class -with their new disposable income- embraced the holiday. Parents started spending big bucks on candy, costumes, and parties. Food companies did not fail to notice the growing popularity of trick or treating and the potential it had for profits. Companies like Borden, the National Biscuit Company, and even Philip Morris -smoke- began capitalizing on the newly popular holiday. Companies made an estimated $300 million on Halloween in 1965. It is a trend that has only been growing since, and it is showing no signs of stopping.

Currently, Halloween is the second-most commercially profitable holiday, behind only Christmas. Americans spend an estimated $6 billion each year on decorations, costumes, and candy. In fact, the candy industry rakes in an average of $2 billion during October alone. That is roughly 90 million pounds of chocolate. Somewhere along the line Halloween made the transition from a quaint Irish tradition to a corporate money-printing powerhouse, but that may not be all bad. The continual growth and investment has assured that the holiday remains alive and vibrant in American culture. In fact, thanks to popular media and the exportation of American culture, the celebration of All Hallows Eve around the world -including in Ireland and Scotland- has become much more in line with American commercialized traditions than the older Celtic ones. In essence, it has become a uniquely American holiday.

The Millennial generation is taking Halloween even further, and in recent years has raised it back to a holiday that can be enjoyed by adults as well as children. We nerds have never been shy about dressing up and acting like Jedi or Mutant Turtles or kids in general, so maybe it is no surprise that Halloween enjoys even more popularity among adults now than it did back in the 50’s or even the 20’s. After all, we grew up enjoying this scary, spooky, and fiscally profitable holiday. It is only natural that we would want to keep celebrating it regardless of age. Maybe that is why it looks like Halloween may only be stopped, nevermore.

Last week, ABC and Disney premiered their newest show, The Muppets, an adult-oriented, behind-the-scenes look at the lives of the Muppets. It seemed to hit all the right notes and was met with generally positive reviews, but of course it did. Hey, it’s the Muppets. They may just be inanimate objects operated by hands and string, but they are people too, as real as you and Chuck Norris. However, it wasn’t always like that. In fact, the Muppets started from very humble beginnings and from what Jim Henson once called, “ridiculous optimism.”

Jim and Friends
In 1954 Jim Henson started working with a partner at the University of Maryland creating puppets for children’s programming airing in the DC area. While working with Jane Nebel -who would later become Jane Henson- Jim created the Muppets, starting with the unforgettable frog himself, Kermit. It is said that Jim Henson coined the term “Muppet” as a combination of marionette and puppet. Starting in 1955, Kermit and Rowlf became regulars on the show Sam and Friends. Initially Rowlf was the more popular of the duo, going on to appear as a sidekick to Jimmy Dean on several episodes of The Jimmy Dean Show starting in 1963. This was mostly due to Rowlf’s mastery of the piano.

It wasn’t until 1969, with the premiere of Sesame Street, that Kermit really found his groove. Kermit was one of the original Muppets to appear on the classic children’s show, and a few years later, when Jim Henson decided to create a Muppet television series that could be enjoyed by adults and children alike, it was Kermit the Frog who emerged as the heart and leader of the Muppet troupe. The Muppet Show first aired on September 5, 1976, a sketch comedy that featured parodies, musical performances, and a flock of big name celebrities. The show was a hit and introduced the world at large to the Muppets, including such new characters as Miss Piggy, Fozzie Bear, The Great Gonzo, and Animal.

After the success of The Muppet Show, the Muppets went on to make three movies: The Muppet Movie, The Great Muppet Caper, and The Muppets Take Manhattan. At nearly the same time, the Walt Disney Company entered into talks to buy the Muppets from Henson -because they have some sort of need to own the world- but the deal fell through with the death of Jim Henson in 1990. The company passed to his son and daughter, Brian and Lisa Henson, who in association with Disney produced The Muppet Christmas Carol and Muppet Treasure Island. Disney finally acquired the full rights to the Muppets in 2004, except for the Sesame Street characters, which were sold to Sesame Workshop, and the Fraggle Rock characters, which were retained by the Henson Company. The mouse-run organization has since produced The Muppets and Muppets Most Wanted, along with the new show simply titled The Muppets. They have also produced a slew of award-winning YouTube videos -bet you didn’t know YouTube videos could be award-winning.

As a result of the Disney purchase the name Muppet became trademarked, which means The NYRD is currently in serious violation of trademark law, because we unabashedly use that term like forty times throughout this article. It also means that any other creatures created before or after the Disney acquisition cannot be called Muppets. Thus creatures like Falkor from The NeverEnding Story or Pilot from Farscape are not Muppets. They are just puppets created by Jim Henson’s Creature Shop. Sesame and Fraggle characters have a special exclusive licensing agreement with Disney, so they can still be called Muppets, but that is also why you don’t see Kermit the Frog appear anymore on Sesame Street. Also, please note that Yoda is not and has never been a Muppet. He’s a Jedi, and there is a difference.

The Wonderful Land of Oznowicz
However, we hardly need Disney’s corporate branding to tell us what is and isn’t a Muppet. There is something special about the lovable group of felt-covered creations that just makes them different from other puppets. For example, take Miss Piggy… please… Everyone’s favorite pig was originally meant to be a background character, a generic female pig puppet, but a few months before the start of The Muppet Show, Jim Henson received a request to perform on a TV special with a “young starlet” character. So Piggy was done up, her eyelashes were extended, her hair was changed, and the puppet was given to a young man named Richard Frank Oznowicz, known mostly by his stage name, Frank Oz. Much like Oz, Miss Piggy was originally born under a different name, Miss Piggy Lee, but she dropped the last name. In a sketch where she was scripted to fight with Kermit, Miss Piggy did an impromptu karate chop, and the character was born, along with her long-standing relationship with Kermit.

That seems to be how the Muppets are really created. They aren’t just made with foam and glue; they evolve. New Muppet characters are often passed around among Muppet performers until one human seems to click with the new creation, and then the character’s personality, voice, mannerisms, and more develop from there. By most accounts the personality of Kermit the Frog was very much based on Henson himself, as he was the original puppeteer. Maybe that is why each Muppet feels as if they have a unique personality, as if they are really alive. Functionally, some Muppets are simple, requiring only one person to operate, but there are others that require an army of humans and technology. Yet, each of them feels like a person. This is partly because Jim Henson was the first to pioneer the idea that the Muppets were not just puppets controlled by people, but actual creatures.

The Muppets were the first puppet characters to use the TV camera as a framing device. Before that, puppeteers were either hidden behind a visible stage on screen, or the puppets sat next to them, like a ventriloquist’s dummy. With no human operator on screen and no indication of a human presence, the Muppets became people unto themselves. It also helps that, unlike other puppets, the Muppets are highly articulated. In other words, it is not just the movement of the mouth, but the hands, feet, and other appendages that help create the illusion of reality. Human operators work below the Muppets, using their right hand to operate the mouth and their free hand to operate the Muppet’s arms. As a result, many of the Muppet characters tend to be left-handed. This illusion of reality is so strong that we don’t even like to think about humans playing a part in making the Muppets who they are. In fact, even talking about their operation in this past paragraph has made us feel wrong inside, and there is a reason for that.

Getting Inside the Muppets
We want to believe that the Muppets are alive. They are our friends and people we grew up with. Did you know there is a concept that something can be real, even if it is imaginary? Dr. Jennifer Barnes has a great TED talk on this very subject. She talks about how we form relationships with fictional characters. We come to empathize with them. They are real to us even as they are imaginary. It’s why we cry when George R. R. Martin works his sadistic magic, or why we cheer when Rocky wins the Cold War. In fact, it has been observed on Sesame Street and in other similar programs that children who interact with Muppets treat them as living creatures, even if they can see the person who is operating and talking for that Muppet. It is a special kind of suspension of disbelief that our minds can entertain, especially when it comes to Kermit and his friends. It doesn’t matter who has their hand in it, we still identify them as individuals distinct from their controllers, and so does all of Hollywood.

The Muppets have appeared in a lot of things. The Muppets have a star on the Hollywood Walk of Fame, in addition to Kermit’s own individual star. The Muppets have presented at both the Oscars and the Emmys. They’ve made cameos in various non-Muppet movies, including Rocky III. They have had guest appearances on shows like The Cosby Show and The West Wing. They have been interviewed on late night and daytime TV. Kermit the Frog was one of the first guests Jon Stewart ever had during his early days on The Daily Show. They have guest hosted several TV shows, including The Tonight Show, Extreme Makeover, and even Larry King Live. They have made numerous public appearances during the Rockefeller Tree Lighting and New Year’s Eve in Times Square, and Kermit even gave a TED talk. Kermit appeared on Hollywood Squares and as a commentator on VH1’s I Love documentary series. All of this contributes to how we see and think of the Muppets. They aren’t just creatures; they are working actors and genuine celebrities.

When you watch The Muppet Christmas Carol, and you see Kermit the Frog acting as Bob Cratchit, you don’t think, “They made Bob Cratchit a green frog?” No, you think, “Oh, Kermit is playing Bob Cratchit,” in the same way you think of Michael Caine as playing the part of Ebenezer Scrooge. We instinctively see the Muppets as people, even as a part of our brain acknowledges that they aren’t. Maybe it helps that we were introduced to the Muppets as a character troupe on a variety show, but there is something more to it.

What is the difference between a real person and a puppet or a cartoon on TV? They both have personalities. You feel emotion for both of them. You enjoy their company. Maybe a better question is, what is the difference between Joan of Arc and Miss Piggy? You have never met either of them, unless of course you have met Miss Piggy. You probably know more about Miss Piggy than Joan of Arc. You probably feel more attachment to the Pig of the Muppets than to the Maid of Orléans. Yet, of those two, it is Joan of Arc who was a real flesh-and-blood human. So maybe in some sense, Piggy, Fozzie, Gonzo, Rizzo, and Kermit are real in some way. They are obviously more real to us than people we acknowledge as having been actual famous humans. The Muppets are loved, they are respected, and they have a positive impact on the world. If only that could be said of every real person out there.

There is something about the self-expression of a tattoo that helps to show our modern individuality, beliefs, loves, and more; and if you are anything like the majority of the staff here at The NYRD -except for Todd- then you too, dear reader, may be sporting some ink. Your mark of independence, creativity, or passion may be in some hidden or not-so-hidden place on your body, but have you ever wondered why so many people nowadays seem driven to tarnish their perfect skin with poorly drawn tigers or mistranslated Chinese writing? All we know is that we are not alone, because this modern trend is a surprisingly timeless human trait.

Tit for Tats
According to a 2012 Harris poll, roughly 1 in 5 Americans (21% of adults in the United States) have at least one tattoo on their body. According to a Pew Research poll, that number jumps to 40% when looking at young adults between the ages of 18 and 29. Tattooing is becoming more popular than Robert Downey Jr. holding ice cream, and for a few reasons. First, tattoos are becoming more and more acceptable as celebrities, athletes, and other notable figures proudly display their ink on the big and small screens. Secondly, the Millennial Generation -which is consistently rated as more confident, connected, and more willing toward self-expression- has made tattooing a part of youth culture. Lastly, and due to all these factors, the stigma of tattoos has lessened over the years. It is not gone completely, but it no longer seems that ink is just for sailors and Sith warriors.

One tattoo artist we talked to explained how he got his start marking gang members and bikers in a dingy ink den on the wrong side of Brooklyn. Originally those were his only customers, but now his clientele are mostly young adults who wear cardigans instead of leather jackets, and get tats of Kermit the Frog instead of skulls and daggers. Of course, this also reflects a bigger trend going on in NYC and around the country. The landscape of Brooklyn has changed, with many neighborhoods going from hard-luck to hipster paradise. Gangs still exist, but they are no longer the only ones who brand themselves to affiliate with a group or ideal. Geeks, jocks, families, chess clubs, and more use tattoos to proudly display who and what they are.

We could argue a correlation between the rise of social media and the popularity of tattoos, and certainly our new culture of unabashed social sharing and connectivity has added to the popularity of body art. We are more willing to share who we are with friends, family, and strangers, but there is more to it than that. When you look at history, this urge to carve out a visible personification of ourselves with tattoos proves to be quite a universal human tendency.

Faded Ink
Tattoos have been a part of human culture for more than 5,000 years. There is even evidence of tattooing dating back to 6,000 BCE, with the discovery of a man sporting a thin pencil mustache tattooed on his upper lip, thus also proving that both tattoos and hipsters are apparently timeless. Ötzi the Iceman, discovered in the early 1990s preserved in ice, lived around 3,300 BCE and had more than fifty tattoos on his body, mostly small vertical lines that may have been used for therapeutic reasons. Every human culture has had a history of decorating bodies for various reasons: spiritual, religious, love, war, etc. There is evidence that Moses might have had a tattoo, despite what the Bible says on the subject. Even Christian Crusaders often got the Jerusalem Cross marked on their bodies so that if they died in battle they could be identified and given a Christian burial. Today, cultures such as the Maori in New Zealand still use tattoos to commemorate their heritage and to bridge the gap between their ancestral roots and the modern world.

There is no one true origin for the tradition of tattooing, but we do know where the English word for the practice comes from. Tattoo, or tattow, is an Anglicized version of tatau, a Polynesian word from Tahiti. It was brought back to the West by English explorer Captain James Cook, who is mostly remembered for his misunderstanding of the climate of Australia and his misunderstanding of the patience of Hawaiians. In Tahiti, Cook encountered heavily tattooed men and women, and because of his stories and the ink that his crew returned with from their Polynesian vacation, we got the modern word tattoo.

Also thanks to Cook’s discovery and the stories he and others like him brought back from their voyages, tattoos became all the rage in Victorian society. Most people tend to think of Jane Austen and her ilk as a tame, repressed group, but the truth is that many Victorians had at least one tat. Even Queen Victoria was believed to have a tattoo of a Bengal tiger fighting a python. Of course, most Victorians’ ink was hidden by frilly dresses, petticoats, and pantaloons, and there were always some who looked down on the fashion trend, but it was common practice in upper society at the time. It was also said that Winston Churchill’s mother had a rebelliously visible tattoo of a serpent… Hail Hydra.

Cover Ups
Unfortunately, we cannot ignore the darker side of tattoos. Many people were inked or branded unwillingly throughout human history. Greeks and Romans tattooed slaves and mercenaries to discourage them from deserting or fleeing their masters. Convicts in Japan were tattooed to mark their lowered status in society, even as far back as the 7th century. Curiously, this could also be why the Japanese today still look down on tattoos as something worn only by gang members and criminals. Most tragically, of course, the Nazis tattooed Jewish and other prisoners in concentration camps with numbers so they could easily identify stripped and destroyed corpses.

Of course, over the years some people have re-appropriated those symbols. Japanese convicts turned their shameful marks into the elaborate body art that marked their strength and loyalty to the Yakuza, the Japanese mafia. Many descendants of concentration camp survivors have tattooed their ancestors’ numbers on their bodies as a way of remembering and honoring the past. These marks of shame find new meaning and acceptance in the modern world, but really that is what tattoos are. They are a way to memorialize what has come before and celebrate who we are as individuals.

Tattoos are permanent adornments, remnants from a different time in our lives that stay with us. Like us they may grow and wear out, and their meaning may change over time. We may look on them with fondness or embarrassment, but there is no denying that they become a part of us. Many people use them to commemorate a milestone, honor a loved one, or even just because it makes them feel good. Regardless of the reasons, the marks help define us, not just to the outside world but to ourselves. Tattoos are not something new, and they are certainly not something to be looked down on. In our modern world of Twitter and Facebook, tattoos help us to literally wear our hearts on our sleeves, or our dragons, our crossbones, our Superman symbols, or even that unicorn we got by mistake that one time, which we don’t ever talk about.

Did you ever wonder who built the pyramids; who helped evolve humanity; who taught us to use fire; who left that surprise for you on your front lawn this morning? Well, look no further, because all the answers can be found in the stars… or at least that is what we here at The NYRD have been led to believe. Lately, our staff has been watching a lot of Ancient Aliens on the History Channel. “Why?” you might ask. The short answer is boredom, but the longer answer is a combination of the fact that we lost the office remote and we couldn’t find even one episode of Law and Order on any single channel. Now that is a paranormal occurrence. Also, since the show is beginning its 8th season tonight, we thought it was a good time to examine what is passing for history on cable TV these days.

Godwin’s Law of the History Channel
At the very least, watching Ancient Aliens can be an entertaining and sometimes even thought-provoking experience. We’re not saying we believe even a tenth of what is claimed by the show and its so-called “experts,” but there are enough times when the evidence presented seems almost impressive. Even if it does not always prove aliens, it is usually enough to prove that maybe our science and our understanding of history are not as complete as we would sometimes like to believe. The show is very good at presenting questions and challenging our standard views on the past. After all, approaching situations and even history from different angles of critical thinking can be a useful tool for learning about ourselves and our world.

Unfortunately, there is always that inevitable moment in every episode where the claims just seem to go to eleven. That is usually when the show jumps the shark harder than the Fonz on a rocket-powered pogo stick, and they start talking about how NASA is keeping secrets about extraterrestrials, or that the Mayans and Egyptians were Skyping with creatures from another world, and then something about time-traveling Nazis. It always comes back to the Nazis. A member of our staff has described the show as trash reality TV for history nerds, and it’s hard to argue with that assessment, especially when you look at the cast of characters the show trots out as its “experts.”

The History Channel, bless its little heart, does its best to find actual history and science experts to appear on the program, but usually when you see respectable names with titles like “doctor” and “professor” in front of them, they only appear on screen for a few moments to describe something technical or historical that is then spun off in a clearly unrelated direction. In fact, it is almost a certainty that nobody even told them what program they were appearing on. Hey, when the History Channel calls, who among us would turn down the chance to be a talking head in a documentary?

The crazy of the show usually does not kick in until you hear from people like Erich von Däniken, C. Scott Littleton, George Noory, and the king alien of them all, Giorgio A. Tsoukalos, who is better known from his meme and as that alien guy with the hair. Tsoukalos’ dialogue in particular usually follows a pretty predictable pattern: “When I hear of [Blank] experienced by our ancestors, I have to wonder, now, is this just a story made up by prehistoric men, or were they in fact describing something they had seen? The answer has to be aliens.”

History Channel, the Drinking Game
Another entertaining bit of dialogue can be found by paying attention to the narrator of the program. You could make a drinking game out of every time he is forced to utter the phrase “In the opinion of ancient astronaut theorists… etc.” Apparently, the History Channel believes that this oft-uttered disclaimer from a deep-voiced and faceless man is enough to separate it from the show’s sometimes wild theories, which are expounded upon in detail and often stated like cold, hard facts. It is almost as if they are still trying to maintain a reputation as a respectable channel of learning.

In actuality, shows like Ancient Aliens are coming to define the History Channel as part of a well-established and drunken trend in its programming schedule. Thus, the History Channel has tried to treat shows like Ancient Aliens the same way that communist China tried to treat the more capitalistic Hong Kong. Yes, allowing it to exist has proved profitable, but in both cases the thinking and reputation of the smaller entity has influenced the attitudes and trends of the larger one. The more you look at the 8-season run of this show, the more you realize how far the History Channel has been allowed to stray from the beaten path in all that time. The channel once so dedicated to historical accuracy and documentaries seems to have left what trail it had been following many years ago and is now deep in the woods digging for sasquatch bones with hillbilly prospectors.

Even the show itself has gone progressively further off the deep end. Whereas earlier episodes tackled the sort of standard subject matter you might expect (the pyramids, the Bermuda Triangle, mythology, etc.), later episodes seem to take pleasure in making correlations between aliens and almost anything you can think of, including -and we are not kidding- the old west, zombies, the Revolutionary War, dinosaurs, bigfoot, superheroes, and Da Vinci. In fact, it is sometimes hard not to wonder if the writers just have a dart board with random words on it that they use to decide episodes. “Let’s see, today we will make it about aliens and… *thunk*… the Colonel’s secret recipe.”

All of this really only serves to drag the name of the History Channel further through the mud, which, in the opinion of ancient astronaut theorists, was probably created when water from Mars mixed with meteor fragments left behind on the boots of visitors from Orion’s Belt. Hey, as long as it turns a profit. Yet, our biggest qualm with the show is not really the defacing of the History Channel name, as they have been doing that to themselves for years with shows like Doomsday Prophecies, Nostradamus Decoded, and Nostradamus Decodes the Doomsday Prophecies.

We’re Not Saying It’s Bad…
But what really irritates us here at The NYRD is that the show basically gives all the credit for almost any historical human accomplishment to visitors from outer space. By claiming that all of our history’s marvels, mythology, and technological innovations are a direct result of alien contact, the show devalues the human race as a whole. In order to take it at face value, we are forced to approach it with the understanding that humanity has accomplished nothing on its own except maybe the invention of Facebook, though for all we know there may be an episode on aliens and social media coming out this season. We might, and that’s a big might, be able to entertain the idea that humans had some kind of help -whether alien, divine, or otherwise- along the way. However, we need to start giving ourselves a little more credit. Humanity is capable of so much beauty, courage, wonder, and innovation. We have unlimited potential, and that should never be undervalued, especially by a TV channel that is supposed to be teaching us about our past and the tragedies and triumphs that lie within it.

Yet, with that said, there are a lot of times when we find ourselves taking at least a little inspiration from some of the things presented in the show, if only for its fantastical take on events. As writers and artists, we can use the program as a jumping-off point. It is a way to think about how history could have unfolded differently. There are a lot of unsolved mysteries out there, and it is always good to think outside the box and look at the world from different viewpoints. There is nothing wrong with thinking critically about established dogma. It was supposedly Aristotle who said, “It is the mark of an educated mind to be able to entertain a thought without accepting it.”

Not to be outdone, it was Giorgio A. Tsoukalos who probably said, “When I think of Aristotle as this really smart guy, I have to wonder if that’s because he learned a lot or because maybe he was enhanced by beings from another world. I’m not saying it was aliens… but…”