One Egyptian company is keeping the 5,000-year-old art of papyrus-making alive – here’s what it’s like inside

  • We visited an Egyptian business that’s one of the last places in the world to make papyrus paper.
  • Ancient Egyptians invented papyrus paper around 5,000 years ago, but the art is almost completely lost today.
  • One village is keeping the ancient tradition alive, though the pandemic is hurting business.
  • See more stories on Insider’s business page.
Read the original article on Business Insider

Prehistoric cavemen starved themselves of oxygen to induce hallucinations and inspire their ancient paintings, study finds

A facsimile of the Chauvet cave, which contains some of the earliest known cave paintings, in Vallon-Pont-d’Arc.

  • Prehistoric cave dwellers living in Europe starved themselves of oxygen to make art, researchers say.
  • An Israeli study found that cavemen purposefully did this to help them interact with the cosmos.
  • The study explains why so many ancient paintings are deep inside cave systems.
  • Visit Insider’s homepage for more stories.

Prehistoric cave dwellers living in Europe purposefully starved themselves of oxygen to hallucinate while creating their decorative wall paintings, a groundbreaking new study has found.

Researchers have questioned for years why so many of the world’s oldest paintings are located in often pitch-black tunnel systems, far from cave entrances.

But a recent study from Tel Aviv University suggests the locations were chosen deliberately because they induced oxygen deprivation, causing cavemen to experience a state called hypoxia.

Hypoxia can bring about symptoms including shortness of breath, headaches, confusion, and rapid heartbeat, which can lead to feelings of euphoria, near-death experiences, and out-of-body sensations. The team of researchers believes it would have been “very similar to when you are taking drugs”, the Times reported.

Read more: These 3 student influencers are earning thousands of dollars on YouTube by posting videos about exam tips and study hacks

“It appears that Upper Paleolithic people barely used the interior of deep caves for daily, domestic activities. Such activities were mostly performed at open-air sites, rock shelters, or cave entrances,” the study says, according to CNN.

“While depictions were not created solely in the deep and dark parts of the caves, images at such locations are a very impressive aspect of cave depictions and are thus the focus of this study,” it adds.

According to Ran Barkai, the co-author of the study, the cavemen used fire to light up the caves, which would simultaneously also reduce oxygen levels. Painting in these conditions was done deliberately and as a means of connecting to the cosmos, the researcher says.

“It was used to get connected with things,” Barkai told CNN, adding that cave painters often thought of the rock face as a portal connecting their world with the underworld, which was associated with prosperity and growth. The researcher also suggested that cave paintings could have been used as part of a kind of initiation rite.

The fascinating cave paintings, which date from around 40,000 to 14,000 years ago, depict animals such as mammoths, bison, and ibex.

“It was not the decoration that rendered the caves significant but the opposite: The significance of the chosen caves was the reason for their decoration,” the study reads, according to CNN.

The study focused on decorated caves in Europe, mostly in Spain and France. It was published last week in the scientific journal “Time and Mind: The Journal of Archaeology, Consciousness, and Culture.”

Read the original article on Business Insider

Amtrak Joe: A brief look at President Biden’s long history of supporting America’s railroad

Then-Vice President Joe Biden at an event announcing funding for Amtrak as part of the American Recovery and Reinvestment Act in 2009.

  • President Joe Biden’s long political history included years of advocating for Amtrak funding.
  • Biden earned the nickname “Amtrak Joe” as he commuted between Delaware and Washington for decades.
  • The nickname hit mainstream media in 2008, starting with CNN.
  • See more stories on Insider’s business page.

When President Ronald Reagan moved in 1981 to trim $884 million from Amtrak’s budget, Senator Joe Biden was the only member of the Senate Budget Committee to vote against the plan.

“You can’t come back next year or the next year and change it,” Biden said, according to a report from United Press International. “Those railroads will have gone.”

Now, four decades later, Biden’s in the seat once held by Reagan. And he’s announced a $2 trillion infrastructure plan, which would include $80 billion for Amtrak. The money would go toward expanding and fixing the country’s crumbling railway infrastructure, a cause he has fought for throughout his career in Washington.

It’s often said that Biden’s nickname is “Amtrak Joe,” although it’s difficult to pinpoint when that nickname started to solidify.

In the late 2000s, as Biden joined Barack Obama on the presidential ticket, the nickname started popping up regularly on CNN. The first record that Insider could find of a prominent news outlet using “Amtrak Joe” was from August 2008, when CNN’s Soledad O’Brien called him by the nickname on air.

“Coming up next, more on the Washington insider who is also a proud Delaware outsider. They called him the Amtrak Joe Biden. God, I have seen him on Amtrak a lot,” O’Brien said as she threw to a commercial, according to a transcript.

The following month, The New York Times published a blog post using the nickname.

We’ve combed newspaper archives dating back to Biden’s early days as a senator, pulling some of his long-ago quotes about Amtrak. Here’s a brief history of Biden’s interactions with Amtrak.

In October 1970, President Richard Nixon signed the Rail Passenger Service Act to create Amtrak, which was then called the National Railroad Passenger Corporation, according to Amtrak’s official history.

Three years later, Biden entered office.

Biden wearing aviators on a train.

During his decades in the Senate, Biden commuted between Delaware and Washington each day via Amtrak so he could be home with his sons at night. CNN estimated he took about 8,000 round trips on the same route.

Throughout the 1980s, Biden’s name popped up in budget stories about Amtrak. He often butted heads with Reagan about railroad spending. In May 1985, for example, Reagan had proposed slashing Amtrak’s budget. Biden at the time said the cuts were “a creeping regionalism,” according to The Providence Journal.

“I’m really beginning to wonder if we’re seven regions or one country,” he said. “Why should we help? I’ll tell you why we should help: We’re Americans. A simple reason.”

In 1987, when Biden launched his bid for the Democratic nomination for president, he chose a Delaware train station as his backdrop, according to UPI.

An artist’s 1999 rendering of the Acela.

Amtrak announced in the late 1990s that it was developing a high-speed rail line for the Northeast, called the Acela. Biden said it was “the single most important transportation need in America,” according to an article in The Philadelphia Inquirer.

“That would be hundreds of thousands of tons of pollution,” Biden said, according to the report. “Amtrak is important not only because it helps our quality of life. It literally impacts our health.”

A few years later, after Obama won the presidential election, the first and second families traveled together by rail to the inauguration.

The Obamas and Bidens on their way to Washington for Obama’s inauguration in 2009.

As vice president, Biden was often sent to blue-collar states to campaign for Obama’s reelection, using the political skills he’d honed riding the train all those years, as The Daily Beast reported in 2012.

“This is, after all, a guy famous for making friends with anyone and everyone – fellow travelers, train conductors, red caps – he crossed paths with on his old Amtrak commute from Delaware,” the magazine said.

Biden’s 2020 presidential campaign was also interwoven with Amtrak. He traveled by train to several states, making whistle-stop speeches along the way.

Biden speaks to supporters during a campaign stop in Ohio in 2020.

He’d planned to take a train to Washington, as he’d done with Obama 12 years earlier, but canceled the trip amid security concerns after rioters mobbed the Capitol.

In a statement, Biden’s team said: “In the week since the attack on Congress by a mob that included domestic terrorists and violent extremists, the nation has continued to learn more about the threat to our democracy and about the potential for additional violence in the coming days, both in the National Capital Region and in cities across the country. This is a challenge that the President-elect and his team take incredibly seriously.”

When Biden and UK Prime Minister Boris Johnson held their first trans-Atlantic phone call, some of their conversation reportedly focused on a mutual love of train travel.

More recently, a few days after announcing the infrastructure plan, he said: “Imagine a world where you and your family can travel coast to coast without a single tank of gas, or in a high-speed train, close to as fast as you can go across the country in a plane.”

Amtrak Connects US, the railway’s vision for train travel in the US in 2035.

Amtrak published a map of an expanded US rail network based on Biden’s funding proposal.

The new routes include cities that haven’t previously been connected to the national rail service, including western outposts like Las Vegas and Phoenix.

It would also break ground on routes throughout the southern US, including ones to Nashville, Tennessee; Montgomery, Alabama; and Macon, Georgia. Materials prepped for the announcement said the plan would bolster transportation options for diverse populations throughout the country.

“Millions of people, including large populations of people of color, do not have access to a reliable, fast, sustainable, and affordable passenger rail option. This is neither fair nor equitable,” the railway said.

Read the original article on Business Insider

Discovery Plus is the perfect streaming service for fans of nonfiction shows, but its interface needs some work

If you buy through our links, we may earn money from affiliate partners. Learn more.

  • Discovery Plus offers all your favorite Discovery, TLC, Animal Planet, Food Network, and HGTV shows.
  • The service costs $5 a month with ads, or $7 a month without commercials.
  • The interface has flaws, but the low price makes it a great option for fans of Discovery’s networks.


Discovery Plus is one of the latest services to throw its hat into the streaming wars.

The platform launched in the US on January 4 and is the streaming home for programs from Discovery Channel, TLC, Animal Planet, Investigation Discovery, Travel Channel, HGTV, and Food Network. It also features dozens of new Discovery Plus Originals and handy 24/7 streaming channels for popular shows.

Discovery Plus plans are priced at $5 a month with ads or $7 a month for commercial-free streaming. Select Verizon customers can even get a full year for free.

I signed up for the ad-supported plan and tried the service for about a week to see how it performs. Browsing the Discovery Plus library, I found it exciting to circle back to classic shows and discover brand-new series, but the interface could still use some work.

Price, plans, and deals

Discovery Plus costs $5 a month for ad-supported viewing, or $7 a month for commercial-free streaming. Each plan comes with a free seven-day trial.

Verizon is also giving select customers up to 12 months of Discovery Plus for free. The promotion is for the ad-free plan and is available to new Fios customers, new 5G internet customers, and all unlimited phone plan customers.


Unlike some other services, Discovery Plus only offers monthly plans, so there’s no discount if you pay for a full year up front. The service does offer gift cards, however, for 12 months ($83.75) or six months ($41.75) of its ad-free plan.

Content library


Discovery Plus understands its lane, offering a big library of shows designed to cater to fans of reality TV, nature programs, cooking, true crime, and education. People who like scripted dramas and comedies, however, will have to look elsewhere. Discovery Plus is purely a service for fans of Discovery networks, and in that sense, its catalog offers a lot of value.

With classic titles pulled from all of Discovery’s brands and networks, the service is jam-packed with blasts from the past, including “Dirty Jobs,” “The Crocodile Hunter,” “Mythbusters,” and “Unwrapped.” There are also dozens of new Discovery Plus Originals, like “Cocktails and Tall Tales,” “Luda Can’t Cook,” and “Monster Garage.”

The original shows, in particular, have creative concepts – like rapper Ludacris learning how to cook different dishes. As a whole, the entire lineup features a solid selection of quality shows centered around the home, food, and history.

By our count, there are 91 programs listed as “Discovery Plus Originals,” either with full episodes or with trailers indicating they’re coming soon. Not every original program is under the Discovery Plus Originals category, however: we found at least 12 more under the platform’s tab for the upcoming Magnolia Network (a rebrand of DIY Network).

Interface


The Discovery Plus interface works well enough, but it doesn’t have any standout features, and it can feel bulky and hard to navigate. It’s particularly confusing to find the platform’s 24/7 streaming channels, as they aren’t on the sidebar or top menu like you’d expect.

The platform’s sidebar contains the basics for exploring the catalog, including a tab for browsing shows, a tab for saved programs (“My List”), and a search bar. The “Browse” tab allows you to find content by its original channel and also by category such as “True Crime” and “Lifestyle.”

To get to the 24/7 channels, you need to scroll down a bit on the “For You” page. This is also where you’ll find rows for the platform’s themed collections. The 24/7 channels are a terrific inclusion for binge-watchers as they let you stream non-stop episodes of “House Hunters,” “Property Brothers,” and “90 Day Fiancé” – I just wish they were easier to find.

One of the redeeming features of the Discovery Plus interface is its helpful recommendation feature. When visiting a show’s main page there are up to 20 additional series recommendations to explore, making it easy to jump into similar programs you might like.

I was hopeful that the platform’s “My List” feature would easily compile all my favorite content in one place, but it turns out that it doesn’t let you save individual episodes. Aside from a limited “Recently Watched” section – which is only accessible on the “For You” page – there isn’t a place to queue up specific episodes I want to watch over the span of a few hours. This makes the experience frustrating, as I have to go back and sort through shows to find episodes.

Format support

Discovery Plus does include some 4K titles for subscribers who have the necessary gear, but it’s not easy to find these programs as there’s no specific 4K category in the main menu.

The platform requires you to search “Ultra HD” to find this content. After you do this, an “Ultra HD” tab will appear in the results. Once you click on a 4K title, a “UHD” icon will appear on its page to let you know that it plays in 4K. That said, the icon may not appear on certain devices.

While most of the UHD content is limited to a few nature programs – such as “Planet Earth II” and BBC’s “Dynasties” – I was surprised to find programs like “NASA Mars Landing,” “Misfit Garage,” and “American Titans” in 4K quality. On the downside, unlike Netflix and other popular 4K services, Discovery Plus does not support HDR.

Devices and features

Discovery Plus is compatible with most media players and smart TV platforms, with the exception of PlayStation devices.

While Discovery Plus is relatively new and will likely roll out new features throughout 2021, there are already obvious faults that I hope it fixes.

One of these is the lack of offline downloads, making Discovery Plus the only major streaming service that doesn’t provide this feature.

Aside from that drawback, Discovery Plus is mostly on par with other platforms thanks to its 4K content and support for up to four simultaneous streams (just like Disney Plus). Its starting plan is also one of the cheapest streaming options there is, at just $5 a month. That’s comparable to Apple TV Plus and Peacock Premium.

Should you subscribe to Discovery Plus?

Discovery Plus’ low price makes it a solid option for big fans of reality TV, cooking shows, nature docs, and other nonfiction programs. It’s also an affordable choice if you’re looking for a secondary service to supplement your primary streaming platform, whether that be Netflix, HBO Max, Prime Video, Disney Plus, or Hulu.

Personally, I’d tack an ad-supported Discovery Plus plan onto a Disney Plus bundle. With this combo, you’ll get scripted shows and movies from Hulu and Disney Plus, sports from ESPN+, and nonfiction programs from Discovery for a total of $19 a month.

Discovery Plus, even with ads, is worth it for fans of its cable channels. The exclusive programs, in addition to its extensive on-demand library, can lead to hours of entertainment, even if there are hiccups with the interface.

The bottom line


Discovery Plus is still new, which gives it time to sort out its flaws, especially its limited “My List” feature and lack of downloads. These flaws make the catalog of shows and documentaries difficult to explore to the fullest extent.

But beyond these faults, Discovery Plus knows what its networks do best: offer entertaining and informative content about our homes, lives, history, and natural world.

Fans of scripted TV dramas and comedies will have to look elsewhere, but shows like “Guy’s Grocery Games,” “Anthony Bourdain’s No Reservations,” and “Ghost Adventures: Cecil Hotel” are binge-worthy adventures for anyone bored with fictional dramas.

For full details on how we evaluate streaming services, check out our review methodology here.

Pros: Diverse content for fans of nonfiction TV, great for binge-watching, very affordable, tons of original programs

Cons: User interface is bulky, no ability to queue and save specific episodes, no offline watching, no HDR support


Read the original article on Business Insider

Vintage photos show how the role of women in the workforce has evolved in the last 100 years

A woman works at an early model desktop computer made by Servus, circa the 1970s.

  • During the early 20th century, women’s employment was affected by war and advancements in tech.
  • In the 1960s and 1970s, women were able to expand their horizons and career opportunities. 
  • Vintage photos from the past 100 years show how their roles have changed.
  • Visit Business Insider’s homepage for more stories.

Working women have come a long way in the last 100 years. 

In the 1920s, women entered the workforce in astonishing numbers as a result of the Industrial Revolution. 

Then, as men were sent off to war, more women got involved in the wartime effort in factories and other professions previously dominated by men.

Women’s equality movements throughout the 1960s and 1970s gave even more opportunities to working women, and in recent years women have at times outnumbered men in the US workforce. However, the coronavirus pandemic has caused the women’s labor force participation rate to hit a 33-year low.

Here are 28 vintage photos that show how the role of women in the workforce has evolved in the last 100 years.

In the wake of the Industrial Revolution, more women than ever began to leave the household and go out to work.

Women postal workers at a sorting office, circa 1920.

Women held jobs as postal clerks, sorting letters and packages. While it wasn’t uncommon for women to work in post offices, very few women actually delivered mail. According to USPS, in 1920, only 5% of the nation’s 943 village carriers were women. 

As village delivery was gradually phased out in favor of city delivery, a majority of the remaining women village carriers either resigned from their positions or were transferred to clerk positions.

Many women also began working in factories.

Operating room of Ladies Rayon Undergarment Factory, 1920.

In 1920, women made up about 20% of the labor force, and many of them were involved in the manufacturing of apparel, food, and tobacco products.

Women of color, on the other hand, were largely employed in agriculture and domestic service work for much of the early 20th century.

During World War I, women worked in domestic and personal service, clerical jobs, and factories.

Woman working in an office, USA, circa 1921.

Many women learned to type in order to secure higher-paying jobs as secretaries or typists in clerical offices, rather than having to work in factories. According to the Encyclopedia of Chicago, working conditions, wages, and hours in clerical work were seen as the best at the time.

Clerical work attracted young, literate, mostly white women who would work as typists until they were married, only to be replaced by another young unmarried woman.

After the Women’s Bureau was established in the US Department of Labor on June 5, 1920, women had even more opportunities in the labor force.

Gertrude Olmstead of MGM Studios checks a costume design with the costumer making the dress.

As the popularity of silent films began to rise, women also found work creating movies for the silver screen.

In 1923, “Business Woman” published a list of 29 different jobs that women held in the film industry apart from acting. The positions included typist, secretary to the stars, executive secretary, costume designer, seamstress, telephone operator, hairdresser, script girl, film retoucher, title writer, publicity writer, musician, film editor, director, and producer, among others.

Women also held jobs as blacksmiths and worked on vehicles.

A female blacksmith at work in her workshop, circa 1920.

However, most occupations were seen solely as a precursor to marriage. Among married white women of both native and immigrant backgrounds, only around 10% held jobs. Married women of color were more likely to hold jobs, out of pure financial necessity.

Unemployed women during the Great Depression could join “She-She-She” camps.

Several women attend a work camp at Bear Mountain for unemployed, homeless, and single women during the Great Depression.

Inspired by the Civilian Conservation Corps, which allowed only men to join in exchange for free room and board, Eleanor Roosevelt started “She-She-She” camps as a way for women to gain employment in environmental conservation as well.

Many families during the Great Depression were able to achieve middle-class status by adding another working member to the household – in many cases, a woman.

Women sewing clothes to be sold during the Great Depression, North Platte, Nebraska, November 1937.

Many women during the Great Depression found work as secretaries, teachers, telephone operators, and nurses. Women also made an income by sewing clothes in Works Progress Administration (WPA) sewing rooms, which manufactured men’s trousers, boys’ coveralls, baby clothes, dresses, and diapers. 

During World War II, women assisted in manufacturing wartime necessities like gas masks. By 1945, one in every four married women worked in jobs outside the home.

Women working in a gas mask factory, 1940.

According to Forbes, between 1940 and 1945, female participation in the US workforce increased from 27% to nearly 37%. 

Before the war, women were in traditionally “female” fields such as nursing and teaching. By 1943, women made up 65% of the US aircraft industry’s workforce.

Woman working on a Vengeance dive bomber, using a hand drill.

After Pearl Harbor, women entered the wartime workforce at astonishing rates. In 1943, more than 310,000 women worked in the US aircraft industry, making up 65% of the industry’s total workforce. Before the war began, women made up just 1% of the industry.

In 1935, women in government jobs were paid 25% less than men. In 1942, the War Labor Board ruled that women should be paid the same as men, but the war ended before they could receive equal pay.

A nurse lighting the pipe of US pilot Harold Ingley, lying on a field hospital bed in Italy, September 1, 1944.

In 1935, a law called the National Recovery Act required women who held jobs within the government to receive 25% less pay than men in the same jobs, according to the National Committee on Pay Equity. During wartime in 1942, the War Labor Board ruled that women should be paid the same as the male workers whose jobs they had taken while the men were away at war.

However, the war ended before the rule could be implemented. With no laws to protect female workers from pay inequality, female workers in the 1940s earned around 60% of what their male counterparts made.

Women were largely seen as “supplemental” workers in the 1950s, meaning their income was secondary to their husband’s.

A waitress serving men breakfast in the 1950s.

Even though there were technically more women in the workforce in 1952 than during the war, women were not taken seriously when it came to their careers.

Women returned to stereotypically “feminine” jobs – in some cases, jobs were advertised as for women only.

A businessman and a secretary in the 1950s.

Many women were forced to give up their wartime jobs to male soldiers returning home. The most popular jobs for women during the 1950s were secretary, bank teller or clerical worker, sales clerk, private household worker, and teacher, according to The Week.

Female secretaries in the 1950s gained a reputation for being young and attractive. In fact, a 1959 quiz from a secretarial training program in Waco, Texas, asking women whether they had what it took to be a secretary, listed “smiling readily and naturally” and being “usually cheerful” among its requirements.

The 1950s marked the beginning of the “jet age,” and many young women found work as flight attendants, then called “stewardesses.”

Regular Delta C&S Stewardess Mary Lee Shultz, of Memphis, adjusts a colleague’s cap as they both prepare for flight in the operations room, 1956.

Flight attendants during the 1950s became symbols of the golden age of flying, when traveling by air was seen as the height of sophistication and glamour. However, with this “glamorous” career also came a host of sexist protocols.

According to Conde Nast Traveler, women were not allowed to work as flight attendants after they reached the ages of 32 to 35, while male flight attendants could work well into their 60s. In 1957, Trans World Airlines dropped its no-marriage rule for female flight attendants. However, many airlines continued to only hire non-married female flight attendants.

While many women joined the workforce, they were nevertheless expected to fulfill their duties at home, in what would be coined “the second shift.”

An American housewife in 1960 demonstrates the cleaning power of Vel detergent for a TV advert.

After women returned home from their secretarial or office jobs, they had another job to do — caring for the children, doing the housekeeping, and, of course, putting a hot dinner in front of their husband. 

This became known as the “second shift.” If women didn’t hold office or other jobs during the day, they were relegated to being “housewives.”

In the 1950s and 1960s, women found creative ways to make their own incomes from their homes.

Tupperware party scene, 1960.

Many suburban women began selling Tupperware out of their own homes in what became known as “Tupperware parties.” 

“Tupperware … took those moms out of the kitchen where they were ‘supposed to be’ and let them enter the workforce, and let them have something outside the home,” Lorna Boyd, whose mother Sylvia was an at-home Tupperware seller in the 1960s, told the Smithsonian Institution.

Women were also making history in their careers.

American broadcast journalist Barbara Walters eats a sandwich as she works at her desk in New York in 1966.

In the 1960s, Barbara Walters was a broadcast journalist working in New York City. In 1976, she would become the first woman to co-anchor a network evening newscast. Many other women were also joining the journalism field as coverage of the Vietnam War became increasingly widespread.

While technology-based and other computer programming jobs may now be dominated by men, the same jobs were considered “women’s work” in the 1960s.

Women weave hair-like wires and tiny metallic cores into memory at the Ampex computer products division circa 1960.

According to Smithsonian Magazine, “computer girls” became a term for “savvy young women” pursuing careers in computer programming. Computer programming was seen as “easy work” similar to typing or filing, so many women ended up building the field that would come to be known as software development. 

Women soon made up a majority of the trained workforce in the computing industry.

A woman working on a Honeywell tape drive computer, circa 1969.

However, the work was seen as “unskilled.”

“Women were seen as an easy, tractable labor force for jobs that were critical and yet simultaneously devalued,” technology historian Marie Hicks wrote in her book “Programmed Inequality,” according to The Guardian.

In the 1960s, multiple pieces of legislation were passed to protect women in the workplace from discrimination.

An equal pay for women demonstration in Trafalgar Square, London, 1969.

Title VII was added to the Civil Rights Act of 1964, protecting workers from employment discrimination based on race, color, religion, sex, or national origin.

The Equal Pay Act of 1963 was passed in order to protect men and women who perform “substantially equal work in the same establishment” from sex-based wage discrimination.

These measures were especially beneficial to women of color. Up until the 1970s, women of color could be openly discriminated against in the hiring process and were often relegated to domestic service work for white families.

During the 1970s, computing work gained more prestige as the industry realized how valuable computers would become.

A woman works at an early model desktop computer made by Servus, circa the 1970s.

It meant women were no longer welcome in many computer programming offices. 

“They weren’t going to put women workers – seen as low-level drones – in charge of computers,” Hicks explains.

According to The Guardian, female computer workers, or “computer girls,” were gradually phased out and replaced with men, who received higher salaries and more prestigious job titles.

By the 1970s, many women were still fighting for better workplace conditions, equal pay, and more job opportunities.

A businesswoman in the 1970s.

From 1972 to 1985, the share of “professional” jobs held by women increased from 44% to 49%. The share of “management” jobs held by women nearly doubled, rising from 20% to 36%. 

However, in 1970, women still did not earn “equal” wages to men. According to the National Committee on Pay Equity, women earned 59.4% of what men earned.

In the 1970s, education became more important than ever for securing a well-paying job.

Nurses and doctors at Pelham Bay General Hospital examine a patient’s X-ray in 1975.

After measures were passed that prevented universities and institutions from discriminating against students on the basis of sex, more women were admitted into medical school than in past generations.

Other strides were made for women in the late 1970s. In 1978, the Pregnancy Discrimination Act was passed as an amendment to Title VII of the Civil Rights Act of 1964. This meant that women could start building families without fearing how it would affect their careers.

Women in the workforce continued to make strides in the 1980s, but there was still a long way to go.

Marilyn Neckes, a former TV producer who changed careers to become a stockbroker, in 1984.

According to The Atlantic, in 1985, half of all college graduates were women. However, only 41% of women between the ages of 25 and 44 held full-time year-round jobs.

Even in the mid-1980s, women themselves saw their own careers as inferior to their husbands’. According to The Atlantic, which cited a 1985 Roper survey, only 10% of women said that a husband should turn down a “very good job” in another city “so the wife can continue her job.”

However, women of the 1980s made history in their fields. Dr. Mae Jemison was among 15 new astronauts named by NASA and went on to become the first Black woman to fly on the space shuttle.

Dr. Mae Jemison, named among 15 new NASA astronauts in 1987, became the first Black woman to fly on the space shuttle.

In 1984, at the Democratic National Convention held in San Francisco’s Moscone Center, Geraldine Ferraro became the first woman nominated as vice president by a major political party.

Women were encouraged to “do it all” — meaning, hold a successful job as well as maintain a happy and healthy marriage and raise children.

 

The New York Times has referred to the 1990s as the “best era for working women.”

CDC employee working at a computer in 1995.

Computers became more and more prevalent, reducing the need for secretaries, bank tellers, and retail workers. Women overwhelmingly began to be employed in offices and earned higher salaries.

According to Time, women were also postponing marriage and children until later in life.

Two businesswomen view the latest stock prices on the Nasdaq Wall in 1997.

In most earlier decades, women typically married between the ages of 20 and 22. In 1990, the average age jumped to 24, and by 1997 the average age for women to get married was 25.

In 1995, nearly half of all women surveyed reported earning half or more of their total family income. 

In recent years, women held more jobs than men in the US workforce.

Woman working in an office building in the 1990s.

At the start of 2020, there were 109,000 more women working than men, and women made up 50.4% of the US labor force. 

Sectors that traditionally hire women, like healthcare and education, were growing, and other industries previously dominated by men were also hiring more women than ever before.

According to Forbes, 13.8% of mining and logging jobs were held by women at the start of 2020, and more women were employed in manufacturing and transportation than in years past as well.

The coronavirus pandemic caused the women’s labor force participation rate to hit a 33-year low in January 2021.

A mural urging people to stay home during the coronavirus pandemic.

According to CNBC, more than 2.3 million women in the US have left the labor force since February 2020, compared to about 1.8 million men. This places the women’s labor force participation rate at 57%, the lowest rate since 1988, according to the National Women’s Law Center.

However, the actual number of unemployed women may be much higher, because women who have left the labor force and are not actively looking for work are not counted as unemployed. Many women may instead be staying home due to mass closures of schools and daycare facilities. 

The data is undeniably dire, despite more jobs being added in recent months. In January 2021, 275,000 women left the labor force, accounting for about 80% of all workers over the age of 20 who left the labor force that month.

The situation is even worse for women of color, Insider’s Juliana Kaplan previously reported. According to the NWLC, 8.5% of Black women age 20 and over were unemployed in January 2021, compared to 8.4% in December 2020 and 4.9% in February 2020.

By comparison, the unemployment rate for white men age 20 and over was 5.5% in January 2021, compared to 5.8% in December 2020 and 2.7% in February 2020. 

Read the original article on Business Insider

San Francisco’s school renaming debacle is a timely mix of confused priorities and bad ‘facts’

A pedestrian walks by a sign outside of Abraham Lincoln High School on December 17, 2020 in San Francisco, California.

  • San Francisco’s school board used inaccurate history to rename schools that haven’t even been open in a year.
  • Paul Revere was deemed irredeemably problematic based on a misreading of a History.com post.
  • Meanwhile, there’s still no date to reopen schools. 
  • This is an opinion column. The thoughts expressed are those of the author.
  • Visit the Business section of Insider for more stories.

San Francisco’s public schools were among the first in the US to shut down at the onset of the coronavirus pandemic in February 2020. 

They’re still closed. And the outrage over the endless foot-dragging is well-deserved, especially considering what the city’s school board has spent precious time on rather than laser-focusing on reopening.

Children – especially those in low-income households – continue to suffer mental anguish, the loss of irreplaceable months of youth and social discovery, and a permanent stunting of their education as long as in-person schooling remains unavailable.

And yet for some reason, San Francisco’s Board of Education recently devoted a disproportionate amount of time and energy to an effort to review every single public school in the district, with the goal of swiftly renaming any building bearing the name of a person who contributed to the abuse or subjugation of women, minorities, queer people, or the environment.

There’s still no set date to reopen San Francisco’s schools. 

Misplaced priorities and moving goalposts 

In an extraordinary move, City Attorney Dennis Herrera filed suit against the San Francisco Unified School District earlier this month, with the support of liberal Mayor London Breed, in an attempt to reopen schools. 

But the same excuses offered by the school board and teachers unions for why schools can’t reopen remain unchanged:

“Teachers’ lives would be at grave risk” is a common argument – even though the CDC has repeatedly stated that schools are among the lowest-risk public places for spreading COVID.

“Schools need revamped ventilation systems” is another – even though the CDC has recommended reopening schools with basic social distancing and ventilation measures (like a fan and an open window) as soon as possible.  

“Teachers need to be vaccinated” is yet another – even though teachers are already among the prioritized professions for vaccination in California.

And while California is slowly ramping up its vaccine rollout, the school district and unions could use their resources to help teachers and school employees coordinate COVID vaccination appointments. Thus far, there has been no demonstrable urgency in taking such initiatives. 

But no one can argue the school board hasn’t treated the effort to rename schools with the utmost urgency. 

Originally conceived in 2018 in the wake of the neo-Nazi violence in Charlottesville, the school renaming project was kicked into high gear this past summer following the police killing of George Floyd and the protests against racism and police brutality.

A school renaming committee was convened, and as public video of its deliberations shows, adherence to historical fact was a secondary concern, and the scope of the committee’s mission seemed to change on the whims of a few of its members. 

Committee members were expected to come to the meeting having already conducted their research, and yet during the meeting members can be seen Googling for evidence of reputation-destroying racism or contributions to colonialism. 

And even with such flimsy source material, members sometimes misread the information before them, as demonstrated when a committee member said Paul Revere participated in a conquest of Native American land.

Not only did that not happen, it isn’t even asserted in the History.com “10 Things You May Not Know About Paul Revere” post cited as evidence to justify removing Revere’s name.

Other names deemed worthy of removal included Abraham Lincoln – because, despite signing the Emancipation Proclamation, his policies were “detrimental” to Native Americans – and Sen. Dianne Feinstein, for her support of an urban renewal project that displaced members of a Filipino community while she served as the city’s mayor. 

One committee member noted that former Mayor George Moscone also supported neighborhood-disrupting urban renewal projects, but the school named for the martyred Moscone (who in 1978 was assassinated along with the legendary gay rights activist and city supervisor Harvey Milk) was spared by the committee. 

The mythical city of “El Dorado” – in which a king sprinkled his subjects with gold dust – was deemed removable because the Gold Rush led to the deaths of Native Americans. As one committee member put it, “I don’t think the concept of greed and lust for gold is a concept we want our children to be given.” 

Another committee member pushed back, arguing that not only is El Dorado not real, it’s not a person, and therefore out of the scope of the group’s stated guidelines. His point of view was rejected out of hand. 

There were several more egregious mistakes, but the San Francisco school board voted 6 to 1 to accept the committee’s recommendations and to begin the process of swiftly renaming 44 schools – including those named for Revere, Lincoln, and Feinstein.

The response was tough but fair. 

A historical embarrassment

An exasperated Mayor Breed said the school board should “bring the same urgency and focus on getting our kids back in the classroom” and only when that’s accomplished should we “have that longer conversation about the future of school names.”

The liberal-leaning San Francisco Chronicle’s editorial board lamented that the board “largely quit the education business and rebranded themselves as amateur historians.”

And in an interview with The New Yorker’s Isaac Chotiner, school board president Gabriela Lopez appeared to defend the committee’s decision not to consult historians who could have easily helped it avoid its embarrassing mistakes. Lopez said she didn’t want to “get into a process where we then discredit the work that this group has done.”

Following widespread outcry over both the historical misstatements and the misprioritization of the issue – particularly from liberals and Democrats who felt the whole thing made them look like “parodies of ourselves” – the school board this week halted the school renaming process until after schools are reopened.

Historians, previously deemed inessential to the process of re-examining historical figures, will be invited into future discussions. 

There is still no anticipated date for San Francisco public schools to reopen, despite private schools and public schools in neighboring counties having been open for months. 

We need schools, and we need facts

It’s tempting to view the San Francisco school renaming debacle through a one-way culture war lens: with woke lefties beclowning themselves and a liberal city’s government unable to provide a basic public function. But that’s reductive. 

If the San Francisco community believes school renaming should be a priority for the district, the board should by all means push forward on those efforts. But it’s tragically comical to focus on renaming schools that have been closed for a year and for the foreseeable future. 

It is a story of misplaced priorities, but it is also indicative of a greater societal problem – which is the conscious choice by many to adopt a Manichean point of view that defines everyone as simply good or simply evil, with facts deemed secondary nuisances.

That’s why the San Francisco debacle matters. Because for citizens of this country to be able to share a reality-based existence, partisans on both sides need to accept that facts matter, political narratives be damned.

Read the original article on Business Insider

I regret to inform you all that history will not save America from itself

  • Pundits keep saying that history will repudiate Donald Trump. But that can’t be guaranteed. 
  • American history often leaves out ugly truths and sanitizes the powerful.
  • If we want history to say something, we need to fight for it in the present.
  • This is an opinion column. The thoughts expressed are those of the author.
  • Visit the Business section of Insider for more stories.

I know you’ve been hearing this proclamation on network news and reading it in columns for years.

“History will judge us.” “History will repudiate Donald Trump and the January 6 rioters.” “History will see people like GOP Sen. Mitt Romney as heroes for bucking their own party.” “History will show that the Democrats were people who took a stand for our democracy and our values.”

This sounds good, but there is a danger in the notion that history will reveal the truth of our moment and sort the good from the bad. Past events don’t change, but the telling of history is a conversation that goes on for as long as we exist on this planet. In our own lifetimes Americans have discovered things they’ve forgotten, and rehabilitated individuals in our history who were once maligned.

If we want history to tell the true story of Donald Trump’s violent presidency long after we’re dead, we have to actively, vigilantly reinforce that truth while we’re alive. We cannot guarantee that Americans will get the story right after we’re gone.

A history of holes

The past does not change, but our telling of it does. Americans are famous for concealment by omission. It is only in the last year or two that there has been widespread awareness of the Tulsa Massacre of 1921, for example, when racists destroyed “Black Wall Street” and murdered the people who lived there in a fit of organized rage.

That was only one of our country’s multiple genocides against Black Americans, but we don’t talk about a lot of them. They aren’t pleasant, and they do not fit the narrative that America is the longest-standing multi-racial democracy in the world. 

Just as it was easier for Americans in the past to forget the importance of the Tulsa Massacre, it could be easier for Americans in the future to forget about the ugliness that led to the January 6 attack on the Capitol. 

It’s also possible that future Americans could manipulate the events around January 6. We already saw that happen immediately after the attack: some right-wing media tried to pin the blame on antifa, and polling indicates that’s now what half of Republicans believe. It’s quite possible that future generations could believe it as well. 

We already know that history changes when different people have the power to do the telling. Almost every president worth thinking about has been imagined and reimagined. President Ulysses S. Grant was maligned as a corrupt drunk for decades, in part by Americans who wanted to repudiate Reconstruction and his support for civil rights for Black Americans.

It is only in the 21st century that historians have attempted to recover his heroism, not only as a general but also as a president. That’s not because he changed (obviously), but because we did. As our society embraced racial equality, it became clear to historians that our telling of Grant’s presidency was colored by white supremacy. It turns out he may not even have been an alcoholic; he just liked to drink (who among us?). 

All of this is to say that we assume history will get things right, when history has actually shown us that it often gets things wrong. It is highly dependent on the people who write it, their power, and how they want us to see ourselves in a great American story. 

See it, be it

The ability of history to be influenced and written in real time is why you can’t have a racist, a demagogue, or an authoritarian in the White House – especially not one who knows the power of story as well as Trump does. Given the chance to rewrite history, these sorts of leaders will take it and distort it with lies.

The Trump administration attempted to do that in ways large and small. It tried to delay Harriet Tubman’s appearance on the $20 bill. That was both a way to conceal the importance of Tubman’s work rescuing slaves and serving in the military as a spy, and a way to preserve the glorification of President Andrew Jackson, a racist.

And then of course there was the “1776 Report” – a shining example of what happens when a young man who spends too much time in racist chatrooms tries to write a history thesis without ever having gone to class or done the reading. The report, published on the White House website on January 18, was the Trump administration’s attempt at “patriotic education,” a retelling of our history that minimized the importance and brutality of slavery and demonized the American left.

Upon taking office, the Biden administration promptly removed it. That’s the kind of vigilance we need to maintain over the telling of what happened on January 6 and during the four years that preceded it. There are powerful, relentless forces in this country that will wish to conceal it, or distort it into glorification. It is up to us, right now and in the future, to make sure that they do not have their say.

Read the original article on Business Insider

A harrowing photo shows a Trump supporter carrying a Confederate flag inside the US Capitol, flanked by portraits of Civil War figures

A supporter of President Donald Trump storms the US Capitol Rotunda on January 6, 2021.

  • Photographer Saul Loeb captured an image of a pro-Trump rioter storming the US Capitol on Wednesday.
  • Though the Confederate flag originated during the US Civil War, it never entered the Capitol building during that time.
  • Behind the rioter in the photo, two portraits reflect the fractured nature of the country during the 1860s.
  • To the man’s right is a portrait of Charles Sumner, an abolitionist. To his left is a portrait of John C. Calhoun, a defender of slavery.
  • Visit Business Insider’s homepage for more stories.

As rioters stormed the US Capitol on Wednesday, photographer Saul Loeb managed to encapsulate the siege’s dark historical context in a single image. His photo shows a Trump supporter waving a Confederate flag in front of two portraits of Civil War figures in the Capitol Rotunda.

To the man’s right is a portrait of Charles Sumner, a former Massachusetts senator who protested slavery. To his left is a portrait of John C. Calhoun, the seventh vice president of the United States, who was a staunch defender of slavery and the chief architect of secession in the run-up to the Civil War.

The proximity of the two portraits calls to mind the fractured nature of US civil society in the 1860s – and the recent cleft that has widened in the lead-up and response to the 2020 election.

“What I find fascinating about that juxtaposition is its connections to violence, because of course [Sumner] was a victim of violence in the Capitol when he was attacked for having had made a speech critical of slavery,” Judith Giesberg, a Civil War historian at Villanova University, told Business Insider. “What that image should remind us of is that there’s a history of having violent political confrontations in Congress.”

Congress met on Wednesday for a joint session to oversee the counting of electoral votes. Around the same time, thousands of Trump supporters gathered in downtown Washington, DC to protest the certification of President-elect Joe Biden’s victory.

Trump urged his supporters to head to the Capitol building, and with Congress in session, rioters stormed the Capitol, forcing the House and Senate to abruptly go into recess. Lawmakers, Hill staffers, and reporters took shelter in their offices before being evacuated. Protesters sat in Vice President Mike Pence’s chair in the Senate chamber, vandalized congressional offices, and looted items like podiums from the building.

Multiple police officers were injured in the violence and evacuated from the Capitol area. A woman was shot in an altercation with law enforcement and later died, MSNBC’s Pete Williams reported. Finally, shortly after 5:30 p.m. local time, the House of Representatives’ Sergeant at Arms announced the Capitol building had been secured.

The photo’s historical significance 

The Confederate flag originated during the Civil War as a battle flag for the pro-slavery Confederacy, but historians say its significance as a political symbol emerged in the 20th century as a sign of resistance to racial integration. During the entire Civil War from 1861 to 1865, the Confederate flag never entered the US Capitol building.

“The flag’s significance during the Civil War has been grossly overstated,” Giesberg said. “We have projected our experiences backward.”

For a plurality of Americans today, the Confederate flag has come to represent racism in general, according to a January 2020 YouGov poll. It’s also a common sight at rallies for President Donald Trump, who has defended people’s decisions to fly the flag in public. 

Giesberg said there’s a deep irony behind the rioter carrying the Confederate flag in front of Sumner’s portrait.

“It’s striking to see [Sumner] juxtaposed with this person who represents what he most was offended by and what he stood against,” she said.

All the more ironic, she added, is the fact that Calhoun’s portrait hangs to his left.

“Calhoun is perfect in this way, in so many ways, because this is a man who was no stranger to treason,” Giesberg said. “He had done more probably than anybody else in the country to rehearse the events that would lead to secession, starting in November of 1860.”

A Trump supporter sits on the second floor of the US Capitol near the entrance to the Senate, beneath a portrait of Charles Sumner.

In July, the House voted to approve legislation to remove statues of Confederate figures, including Calhoun, from the Capitol building.

The decision was in part a continuation of Sumner’s legacy. In 1865, the abolitionist proposed that paintings hanging in the Capitol shouldn’t portray scenes from the Civil War. In particular, he objected to a bust of Chief Justice Roger B. Taney, who ruled in 1857 that African-Americans could not be considered citizens.

Sumner “was certainly a vocal and resolute abolitionist,” Giesberg said. “He was uncompromising in his critique of slavery and for that he paid, ultimately, a very heavy price.”

Read the original article on Business Insider

Here’s a look back on the historic traditions that inspired modern Christmas celebrations, from Ancient Rome to Scandinavia

Traditional foods and customs are influenced by European countries.

  • While Christmas is closely associated with the Christian faith and the birth of Jesus, modern-day traditions can also be traced back to rituals and customs from other cultures. 
  • Ancient Rome’s Saturnalia celebrations placed an emphasis on sharing food and drinks, spending time with loved ones, and exchanging little gifts during the winter season. 
  • The Germanic-Scandinavian tradition, Yule, was a winter festival where the Norse god Odin left small gifts for each household on his eight-legged white horse.
  • The evolution of Santa Claus carries echoes of Odin as well as other historical figures, including Saint Nicholas of Myra and the Dutch figure Sinterklaas. 
  • Visit Business Insider’s homepage for more stories.

Not long to go now before many of us get to spread some good tidings and joy as we celebrate Christmas.

The main ways we understand and mark the occasion seem to be rather similar across the world. It’s about time with community, family, food-sharing, gift-giving, and overall merry festivities.

But while Christmas is ostensibly a Christian celebration of the birth of Jesus, many of the rituals and customs come from other traditions, both spiritual and secular.

The first Christmas

The journey of Christmas into the celebration we know and recognize today is not a straight line.

The first Christmas celebrations were recorded in Ancient Rome in the fourth century. Christmas was placed in December, around the time of the northern winter solstice.

It is not difficult to spot the similarities between our now long-standing Christmas traditions and the Roman festival of Saturnalia, which was also celebrated in December and coexisted with Christian belief for a period of time.

Saturnalia placed an emphasis on the sharing of food and drink, and spending time with loved ones as the colder winter period arrived. There is even evidence that the Romans exchanged little gifts of food to mark the occasion.

As Christianity took greater hold in the Roman world and the old polytheistic religion was left behind, the cultural imprint of Saturnalia remained visible in the ways our well-known Christmas celebrations established themselves.

Read more: Researchers are on a quest to create the perfect Christmas tree that lasts through the holidays – and one of these ‘frankenfirs’ will be available by 2021

A Yule celebration

Turning an eye to the Germanic-Scandinavian context also provides intriguing connections. In the Norse religion, Yule was a winter festival celebrated during the period we now roughly associate with December.

The beginning of Yule was marked by the arrival of the Wild Hunt, a spiritual occurrence when the Norse god Odin would ride across the sky on his eight-legged white horse.

While the hunt was a frightening sight to behold, it also brought excitement for families, and especially children, as Odin was known to leave little gifts at each household as he rode past.

Like the Roman Saturnalia, Yule was a time of drawing in for the winter months, during which copious amounts of food and drink would be consumed.

The Yule festivities included bringing tree branches inside the home and decorating them with food and trinkets, likely opening the way for the Christmas tree as we know it today.

The influence of Yule on the festive season of Northern European countries is still evident in linguistic expression too, with “Jul” being the word for Christmas in Danish and Norwegian. The English language also maintains this connection, by referring to the Christmas period as “Yuletide”.

Here comes Santa

Through the idea of gift-giving, we see the obvious connections between Odin and Santa Claus, even though the latter is largely a popular-culture invention. Much of that invention stems from the famous poem A Visit from St Nicholas (also known as The Night Before Christmas), attributed in 1837 to the American poet Clement Clarke Moore, though debate continues over who actually wrote it.

The poem was very well received, and its popularity quickly spread beyond the American context to global fame. It gave us much of the staple imagery we associate with Santa today, including the first-ever mention of his reindeer.

But even the figure of Santa Claus is evidence of the constant mixture and mingling of traditions, customs, and representations.

Santa’s evolution carries echoes not only of Odin, but also of historical figures such as Saint Nicholas of Myra – a fourth-century bishop known for his charitable work – and the legendary Dutch figure of Sinterklaas, which derived from him.

Read more: Coaches, founders, and executives share how they’re setting goals for 2021

Christmas down under in the summer

The idea of connecting Christmas to winter festivals and drawing in customs makes the most sense in the colder months of the Northern hemisphere.

In the Southern hemisphere, in countries such as New Zealand and Australia, the traditional Christmas celebrations have evolved into their own specific brand, which is much more suited to the warmer summer months.

Christmas is an imported event in these areas and acts as a constant reminder of the spread of European colonialism in the 18th and 19th centuries.

Celebrating Christmas still carries the influence of European contexts, being a time for merriment, gift-giving, and community spirit.

Even some of the traditional foods of the season here are still indebted to Euro-British traditions, with turkey and ham taking center stage.

All the same, as Christmas falls in the summer down under, there are also different ways to celebrate it in New Zealand and other regions that clearly have nothing to do with winter festivals.

Barbecues and beach days are prominent new traditions, as borrowed practices co-exist with novel ways of adapting the event to a different context.

The wintry Christmas puddings are often exchanged for more summery pavlovas, whose fresh fruit toppings and meringue base certainly befit the warmer season to a greater extent.

The transition to outdoor Christmas celebrations in the Southern hemisphere is obviously grounded in common sense, given the warmer weather.

Nonetheless, it also shows how both cultural and geographical drivers can influence the evolution of celebrating important festivals. And if you really want to experience a cold Christmas down under, there is always a mid-year Christmas in July to look forward to.

Lorna Piatti-Farnell, professor of popular culture, Auckland University of Technology

This article is republished from The Conversation under a Creative Commons license. Read the original article.

The Conversation
Read the original article on Business Insider

How the Christmas tree tradition began

  • Christmas trees can be found all over the world today, but medieval Germans were the first to fully embrace the holiday tradition.
  • Even before that, many cultures, including ancient Egyptians, worshipped evergreen trees and branches as a symbol of eternal life.
  • Watch the video above to see how pagans “decked the halls” with evergreens before Queen Victoria made it a popular Christmas decoration. 

Following is a transcript of the video.

Trees have long been used to decorate homes. Ancient Chinese, Hebrews, and Egyptians viewed evergreens as symbols of eternal life. European pagans “decked the halls” with evergreen branches to bring in life during the dark days of winter. But medieval Germans are credited with starting the Christmas tree tradition. They brought fir trees inside on December 24 and decorated them with wafers, candles, and red apples. These “paradise trees” symbolized the Garden of Eden.

17th-century German settlers brought the tradition to North America, but it didn’t catch on with most Americans until Queen Victoria popularized it. In 1846, Victoria and her German husband, Prince Albert, put up a Christmas tree and decorated it with toys, candy, popcorn, and cakes.

Once word spread, the popularity of Christmas trees took off. It became a tradition throughout England and North America. Now, they can be found all over the world.

EDITOR’S NOTE: This video was originally published in November 2018.

Read the original article on Business Insider