15 Things About The Past That Used to Totally Suck
Around the turn of the century, there was an unwritten but arbitrary rule that said you couldn't wear a straw hat in New York City past September 15th. Newspapers would warn of the approaching date, and if you dared wear a hat past it, you'd probably have it snatched and stomped right in front of you.
In 1922, a group of youths decided to start the stomping tradition early, on the 13th. They targeted some dock workers, who fought back instead of letting their hats be stomped. A brawl broke out, even stopping traffic on the Manhattan Bridge before police broke it up.
Despite this, the brawls continued and got worse. Gangs of teens roamed the streets with large sticks - sometimes topped with a nail - looking for hat-wearers to beat up. Many people were hospitalized, and mobs as large as 1,000 people formed just to stomp on hats!
It’s one of those mysteries of history for which no cause is known. According to written records, for a year to a year and a half in AD 536, a thick, persistent dust veil darkened the skies between Europe and Asia Minor. It’s believed to have covered numerous other parts of the world as well.
The cloud is known to have stretched as far as China, and even Mongolia, Siberia, Argentina, and Chile. The mysterious dust veil lowered temperatures and brought massive drought and food shortages. As this came just before the Plague of Justinian swept Europe, as much as a third of Europeans were left dead.
In China it was even worse: the famine caused by the dust veil killed up to 80% of the population. In Scandinavia, the toll may have been as high as 90%. A Syrian chronicler of the time wrote, “the sun became dark and its darkness lasted for one and a half years... Each day it shone for about four hours and still this light was only a feeble shadow...the fruits did not ripen and the wine tasted like sour grapes."
There are many other accounts confirming this mysterious cloud, but few answers. One account claimed it lasted from March 24th, 536 to June 24th, 537. Today its effects can still be seen in the growth rings of trees that were alive at the time.
The guillotine was an instrument of execution developed during the French Revolution. It was first presented as a more efficient and humane replacement for the gallows, which were extremely unpredictable. Hanging always killed in the end, but if the rope was too long, the drop could tear the person’s head from their body, and a rope too short left them writhing for up to 30 minutes in a slow suffocation.
Joseph-Ignace Guillotin was a French physician who proposed and promoted the guillotine, hoping for a more civilized form of execution. Unfortunately, it was so efficient that thousands were executed by it. The first person executed by the machine was Nicolas Jacques Pelletier on April 25, 1792.
He was a common criminal and highwayman who, after attacking and killing a civilian, was arrested and sentenced to death. The National Assembly had recently declared decapitation the only “enlightened” form of execution permissible in France, so Pelletier had to wait three months while the first guillotine was constructed!
When his day came, he was decapitated in front of a rather large crowd who were eager to see the new machine in use. However, almost all of them were disappointed and began to boo. They had come to see a show, not a simple decapitation.
They demanded the return of hanging as a form of capital punishment, yet even with their appeals, the guillotine caught on and made quite the mark in history.
During World War II, tensions between whites and blacks escalated in Detroit. Housing was scarce, as over 300,000 people (including 50,000 African-Americans) had moved to the city because the defense industry was booming from the war. Recruiters persuaded black and white Southerners alike to come north for jobs in Detroit.
Upon arrival, African Americans realized how strong the bigotry was in the North. In June 1943, Packard Motor Car Company promoted three black workers to work alongside whites on the assembly line. The promotion caused 25,000 workers to walk off the job, effectively slowing critical war production.
White workers didn’t mind black people working at the plant, but they didn’t want them working side-by-side with them. Soon after, a riot broke out on Belle Isle that lasted for three days, and federal troops had to come in and restore order.
Maybe this’ll cure your post-Thanksgiving blues. Has anyone ever asked whether you’d like “dark meat or white meat” after carving the turkey? The terms originated not as descriptions of color but as euphemisms for the leg and breast of turkey and other fowl.
In Victorian times, the words “leg” and “breast” were considered fowl ;]. So they awkwardly decided to call the breast “white meat” and the leg “dark meat.” Gobble up some other turkey facts:
• Benjamin Franklin wanted the turkey to be the national bird of the US
• Abraham Lincoln issued a 'Thanksgiving Proclamation' on October 3, 1863, officially setting aside the last Thursday of November as a national day of Thanksgiving.
• Sarah Josepha Hale, an American magazine editor, persuaded Abraham Lincoln to declare Thanksgiving a national holiday. She is also the author of the popular nursery rhyme "Mary Had a Little Lamb."
You may recoil in horror when learning this, but it's true. Archaeologists have found that ancient Romans committed one of the worst (modern) fashion crimes: wearing socks with sandals. Researchers recently uncovered a Roman industrial area, and in their digs they found a 2,000-year-old sandal bearing a clear impression of fabric fibers. Socks!
This archaeological site is significant for more than just critiquing Roman fashion sense. It's one of the only Roman industrial sites ever uncovered. It featured a water-powered mill used to grind grain into flour to feed soldiers, along with areas for making clothes and pottery.
This is going to get creepy fast. In the 1880s and 1890s, the family of George and Mary Brown of Exeter, Rhode Island suffered a series of tuberculosis infections, a disease called consumption at the time. Mary, the mother, was the first to die, and then their eldest daughter, Mary Olive, died in 1888. Their son, Edwin, caught the disease in 1890. Sadly, in 1891, another daughter, Mercy, became infected, and she died in January of 1892.
She was buried in the Baptist Church cemetery in Exeter. People began whispering that one of the family members was a vampire; folklore at the time held that if multiple family members died of a disease, one of them must have been involved in undead activities. George Brown was persuaded to exhume the bodies of his family members in March 1892. While his wife and daughter Mary Olive were considerably decomposed, Mercy was still quite well preserved and even had some blood in her heart, as most people weren’t embalmed back then.
So, the villagers took that as a sign that Mercy was a vampire and the reason Edwin was sick. Mercy’s heart was removed from her chest, burned, and the ashes mixed with tea and given to Edwin to drink to cure his ailment. He died two months later.
In the late 1800s in Chicago, Dr. H. H. Holmes built a 60-room hotel. It contained anything but the amenities of luxury living: doors that led nowhere, rooms without windows, and hidden passageways. For four years, Holmes held various guests prisoner, torturing and eventually killing them. Some rooms were sealed shut and used as gas asphyxiation chambers; others were lined with iron plates and had blowtorches built into the walls to burn his victims.
The prison rooms had rudimentary alarm buzzers to alert him if anyone tried to escape. What eventually led to Holmes’s arrest was not the blood-splattered surgical table, his jars of poisons and boxes of bones, or even his very own crematorium; it was an insurance fraud scheme. The murders came to light only when police searched the hotel.
He was hanged for his crimes but never showed remorse; he claimed he was possessed by the devil. In the 1930s, the building was burned to the ground and later replaced with a post office.
Only 20% of the population agreed with this landmark Supreme Court case!
While today the question of whether interracial marriage should be legal sounds absurd, the America of 1967, when the Supreme Court ruled that bans on interracial marriage were unconstitutional, was very different from the America of today.
When the ruling came down, racial tensions ran high, much higher than today, as thousands gathered all over the country to protest the blatant inequality of the time.
And while those in law and government were open to a new, more equal America, most Americans were not.
Just four percent of the population approved of interracial marriage in 1958, and while the number grew to 20% by 1968, four-fifths of the country still opposed the idea. In fact, a majority of Americans didn't approve until 1997!
What is even more amazing is the progress made since then. Today nearly everyone agrees with the change. The most opposition is found among seniors, who were raised during a time of heightened racial tension and segregation; among 18-29 year-olds, 97% approve.
Today 84% of America approves of the change, and while it may still baffle some that 16% of the country doesn't, the holdouts are mostly seniors, leading statisticians to believe that within the next several years acceptance will be just about unanimous.
Many expect the same thing to happen with another hot-topic marriage issue today: marriage between two men or two women.
So, Utah women gained the right to vote…twice. In 1870, the territorial legislature gave women the right to vote for the first time. Women in Utah didn’t even have to put forth any effort to get it; it was simply given to them. A group of men who’d left the Mormon church was advocating for it, and at the same time a group of anti-polygamists in the East was pushing for it too, believing that women in Utah would vote to end plural marriage, or at least gain a voice.
After the women of Utah had the right to vote, Congress snatched it away through the Edmunds-Tucker anti-polygamy act of 1887. The women of Utah had done exactly the opposite of what the East Coast anti-polygamy group expected: they had actually voted in ways that promoted and favored polygamy. They finally regained the right when women’s suffrage got into full swing and Utah, preparing for statehood, wrote it into its constitution in 1895.
If you know a bit about the history of photography, you might know that it got off to a rocky start. While today a photograph can be taken in a fraction of a second, the early cameras of the 1820s needed exposures of several hours to capture anything at all.
Throughout the 19th century, the technology developed into something much more useful and efficient. Exposure times were eventually cut down to just a few minutes, then a few seconds, before reaching where they are today. Photographing people, understandably, was a bit of an ordeal. For adults, a long exposure time was less of a problem.
Any adult can sit still for a period of time, though these photographs were typically taken without smiles simply because it’s hard to hold a convincing smile for that long. Children, however, often had to be held still by their mothers. This meant the mothers had to be hidden: disguised as chairs, or just sitting under a sheet.
Of course, this was only when the pictures were meant to be of the children alone. Another method of photographing children was considerably more eerie: sometimes children were photographed after they had died, since only then would they sit perfectly still.
They are commonly called the “ugly laws,” but some cities called them the “unsightly beggar ordinances.” The idea behind the laws was not to be inhumane but to preserve the overall quality of life of the community; in a sense, they were their time’s version of a homeowners association. San Francisco was one of the first cities to create an ugly law, in 1867.
The idea grew increasingly popular among Western and Midwestern cities, and the whole state of Pennsylvania adopted ugly laws in the 1890s. Other cities that embraced them were Chicago, Columbus, Ohio, and Omaha, Nebraska. Sadly, the laws, which barred people deemed maimed, mutilated, or otherwise “unsightly” from public view, weren’t repealed in many cities until the 1970s!
Omaha was the first to throw out its ugly laws, in 1967, followed by Columbus in 1972; Chicago was the last to follow suit, in 1974. Any city that might have accidentally left an ugly law lingering on the books had it invalidated when Congress passed the Americans with Disabilities Act in 1990.
Today peanut butter is found in almost every home in America, but it wasn’t always that way. Peanuts are native to the Americas and have been ground into paste since Aztec times. Modern peanut butter originated in the late 1800s, with the first patent dating to 1884, though that version was much runnier than today’s.
Dr. John Harvey Kellogg patented another paste in 1895 that is much more similar to what we see today, serving it to patients at the Battle Creek Sanitarium as a health supplement. It was originally so expensive that it became a luxury item in the early 1900s, commonly served in the upper-class tearooms of New York and paired with a wide array of foods such as cheese, celery, watercress, pimento, and crackers.
The first known reference to peanut butter paired with jelly is a recipe by Julia Davis Chandler in 1901, and by 1920 the sandwich had caught the attention of less wealthy members of society, spreading peanut butter around the nation.
As the price of peanut butter lowered, it became extremely popular with children and today it is one of the most widespread food items in America. In fact, the spread is so popular there is even a National Peanut Butter Day on January 24th!
When someone thinks of classical music, they most likely picture old men in wigs sitting on a stage in complete silence before playing an extremely long, boring piece. A modern classical concert wouldn’t exactly be described as upbeat: audiences sit in utter silence, a cough often being the most obnoxious act imaginable, before clapping for a minute or two and settling back into silence.
Yet classical performances used to be almost completely the opposite. The way we look at our musical idols today is the same way people regarded classical musicians in the 19th and early 20th centuries. Concerts were packed with the aristocracy as the latest, most fashionable composer revealed his newest masterpiece.
However, the aristocracy of the time rarely observed concerts as we do today. Performances were described as extremely rowdy, with people standing in the aisles carrying on full conversations while the musicians played. In fact, there are many reports of people shouting at Franz Liszt mid-performance to play a different piece!
Concerts became stuffy and rigid as the aristocracy lost power and the bourgeoisie tried to prove they could be just as cultured by intensely appreciating the music before them, leading to today’s sense that a word uttered while the musicians play results in certain death.
Doctors have always faced the problem of how best to tell a patient about a terminal diagnosis. Recently, medical professionals have been more upfront about tragic news like this, but it wasn’t always that way. Physicians used to think that by not telling a person they were dying, they would boost the patient’s morale and preserve hope.
Amazingly, in 1961 only 10% of physicians believed it was correct to tell a patient of a fatal cancer diagnosis! This changed quickly, however, after studies revealed that nearly 90% of patients said they would like to know the truth about their ailments. By 1979, physicians had completely reversed their position, and a survey revealed that 97% felt full disclosure was the correct course to take.
Notably, many came to believe that the doctor’s role was simply to treat patients with full honesty, not to play psychiatrist.