If you’re lucky enough to have a job right now, you’re probably doing everything possible to hold onto it. If the boss asks you to work 50 hours, you work 55. If she asks for 60, you give up weeknights and Saturdays, and work 65.
Odds are that you’ve been doing this for months, if not years, probably at the expense of your family life, your exercise routine, your diet, your stress levels, and your sanity. You’re burned out, tired, achy, and utterly forgotten by your spouse, kids and dog. But you push on anyway, because everybody knows that working crazy hours is what it takes to prove that you’re “passionate” and “productive” and “a team player” — the kind of person who might just have a chance to survive the next round of layoffs.
This is what work looks like now. It’s been this way for so long that most American workers don’t realize that for most of the 20th century, the broad consensus among American business leaders was that working people more than 40 hours a week was stupid, wasteful, dangerous, and expensive — and the most telling sign of dangerously incompetent management to boot.
It’s a heresy now (good luck convincing your boss of what I’m about to say), but every hour you work over 40 hours a week is making you less effective and productive over both the short and the long haul. And it may sound weird, but it’s true: the single easiest, fastest thing your company can do to boost its output and profits — starting right now, today — is to get everybody off the 55-hour-a-week treadmill, and back onto a 40-hour footing.
Yes, this flies in the face of everything modern management thinks it knows about work. So we need to understand more. How did we get to the 40-hour week in the first place? How did we lose it? And are there compelling bottom-line business reasons that we should bring it back?
The Making of the 40-Hour Week
The most essential thing to know about the 40-hour work-week is that, while it was the unions that pushed it, business leaders ultimately went along with it because their own data convinced them this was a solid, hard-nosed business decision.
Unions started fighting for the short week in both the UK and US in the early 19th century. By the latter part of the century, it was becoming the norm in an increasing number of industries. And a weird thing happened: over and over — across many business sectors in many countries — business owners discovered that when they gave in to the union and cut the hours, their businesses became significantly more productive and profitable. As Tom Walker of the Work Less Institute puts it in his Prosperity Covenant:
That output does not rise or fall in direct proportion to the number of hours worked is a lesson that seemingly has to be relearned each generation. In 1848, the English parliament passed the ten-hours law and total output per-worker, per-day increased. In the 1890s employers experimented widely with the eight hour day and repeatedly found that total output per-worker increased. In the first decades of the 20th century, Frederick W. Taylor, the originator of “scientific management” prescribed reduced work times and attained remarkable increases in per-worker output.
By 1914, emboldened by a dozen years of in-house research, Henry Ford famously took the radical step of doubling his workers’ pay, and cut shifts in Ford plants from nine hours to eight. The National Association of Manufacturers criticized him bitterly for this — though many of his competitors climbed on board in the next few years when they saw how Ford’s business boomed as a result. In 1938, the 40-hour week was enshrined nationwide in the Fair Labor Standards Act, part of the New Deal. By that point, there were a solid five decades of industrial research that proved, beyond a doubt, that if you wanted to keep your workers bright, healthy, productive, safe, and efficient over a sustained stretch of time, you kept them to no more than 40 hours a week and eight hours a day.
Evan Robinson, a software engineer with a long interest in programmer productivity (full disclosure: our shared last name is not a coincidence) summarized this history in a white paper he wrote for the International Game Developers’ Association in 2005. The original paper contains a wealth of links to studies conducted by businesses, universities, industry associations, and the military that supported early-20th-century leaders as they embraced the short week. “Throughout the ’30s, ’40s, and ’50s, these studies were apparently conducted by the hundreds,” writes Robinson; “and by the 1960s, the benefits of the 40-hour week were accepted almost beyond question in corporate America. In 1962, the Chamber of Commerce even published a pamphlet extolling the productivity gains of reduced hours.”
What these studies showed, over and over, was that industrial workers have eight good, reliable hours a day in them. On average, you get no more widgets out of a 10-hour day than you do out of an eight-hour day. Likewise, the overall output for the work week will be exactly the same at the end of six days as it would be after five days. So paying hourly workers to stick around once they’ve put in their weekly 40 is basically nothing more than a stupid and abusive way to burn up profits. Let ‘em go home, rest up and come back on Monday. It’s better for everybody.
As time went on and the unions made disability compensation and workplace safety into bigger and bigger issues, another set of concerns further buttressed the wisdom of the short week. A growing mountain of data was showing that catastrophic accidents — the kind that disable workers, damage capital equipment, shut down the lines, open the company to lawsuits, and upset shareholders — were far more likely to occur when workers were working overtime and overtired.
That sealed the deal: for most businesses, the potential human, capital, legal, and financial risks of going over 40 hours a week simply weren’t worth taking. By World War II, the consensus was clear and widespread: even (or especially!) under the extreme demands of wartime, overworking employees is counterproductive and dangerous, and no competent workplace should ever attempt to push its people beyond that limit.
The Overtime Exception
There was one exception to this rule. Research by the Business Roundtable in the 1980s found that you could get short-term gains by going to 60- or 70-hour weeks very briefly — for example, pushing extra hard for a few weeks to meet a critical production deadline. However, there were a few serious caveats attached to this which used to be well-known, but have mostly been forgotten.
One is that increasing a team’s hours in the office by 50 percent (from 40 to 60 hours) does not result in 50 percent more output (as Henry Ford could have told them). Most modern-day managers assume there will be a direct one-to-one correlation between extra hours and extra output, but they’re almost always wrong about this. In fact, the real number is typically closer to 25-30 percent more work in 50 percent more time.
Here’s why. By the eighth hour of the day, people’s best work is usually already behind them (typically turned in between hours 2 and 6). In Hour 9, as fatigue sets in, they’re only going to deliver a fraction of their usual capacity. And with every extra hour beyond that, the workers’ productivity level continues to drop, until at around 10 or 12 hours they hit full exhaustion.
Another is that overtime is only effective over very short sprints. This is because (as Sidney Chapman showed in 1909) daily productivity starts falling off in the second week, and declines rapidly with every successive week as burnout sets in. Without adequate rest, recreation, nutrition, and time off to just be, people get dull and stupid. They can’t focus. They spend more time answering e-mail and goofing off than they do working. They make mistakes that they’d never make if they were rested; and fixing those mistakes takes longer because they’re fried. Robinson writes that he’s seen overworked software teams descend into a negative-progress mode, where they are actually losing ground week over week because they’re so mentally exhausted that they’re making more errors than they can fix.
The Business Roundtable study found that after just eight 60-hour weeks, the fall-off in productivity is so marked that the average team would have actually gotten just as much done and been better off if they’d just stuck to a 40-hour week all along. And at 70- or 80-hour weeks, the fall-off happens even faster: at 80 hours, the break-even point is reached in just three weeks.
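The break-even effect described above can be sketched with a toy model. The shape of the decline curve and the specific parameters below are illustrative assumptions chosen to roughly reproduce the figures quoted in this section (a ~25 percent initial boost at 60 hours, break-even around week eight; break-even around week three at 80 hours); they are not numbers taken from the Roundtable study itself:

```python
# Toy model of crunch-mode break-even. All parameters are illustrative
# assumptions, not data from the Business Roundtable study.

def breakeven_week(first_week_output, weekly_decline, baseline=40.0, horizon=20):
    """Return the first week at which a fatigued crunch team's cumulative
    output falls behind a steady 40-hour team's (baseline units per week,
    one unit = one productive hour)."""
    crunch_total = steady_total = 0.0
    for week in range(1, horizon + 1):
        # Crunch output starts high but declines linearly as burnout sets in.
        crunch_total += max(first_week_output - weekly_decline * (week - 1), 0.0)
        steady_total += baseline
        if crunch_total < steady_total:
            return week
    return None

# 60-hour weeks: ~25% more output at first (50 units), fading ~3 units/week.
print(breakeven_week(50, 3))   # break-even in week 8
# 80-hour weeks: a bigger initial boost, but fatigue bites much faster.
print(breakeven_week(56, 17))  # break-even in week 3
```

The point of the sketch is the shape, not the exact parameters: any declining curve that starts modestly above the 40-hour baseline gets overtaken within a handful of weeks, and the steeper the decline, the sooner the crossover.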
And finally: these death marches take a longer-term productivity toll as well. Once the crisis has passed and that 60-hour-a-week team gets to go back to its regular 40, it can take several more weeks before the burnout begins to lift enough for them to resume their typical productivity level. So, for a while, you’ll get significantly less than a full 40 out of them.
Wise managers who understand this will a) avoid requiring overtime crunches, because they’re acutely aware of the serious longer-term productivity hit that inevitably follows; b) keep the crunches as short as possible when they are necessary; and c) give their teams a few days off — one to two comp days per overtime week worked is about right — at the end of a hard sprint. This downtime enables them to recuperate more quickly and completely. It’s much more productive to have them gone for the next week — and then back on the job, rested and ready to work — than have them at their workstations but too fried to get anything useful done for the next month.
So, to summarize: Adding more hours to the workday does not correlate one-to-one with higher productivity. Working overtime is unsustainable in anything but the very short term. And working a lot of overtime creates a level of burnout that sets in far sooner, is far more acute, and requires much more to fix than most bosses or workers think it does. The research proves that anything more than a very few weeks of this does more harm than good.
Enter the Knowledge Worker
After WWII, as the GI Bill sent more workers into white-collar jobs, employers at first assumed that the limits that applied to industrial workers probably didn’t apply to knowledge workers. Everybody knew that eight hours a day was pretty much the limit for a guy swinging a hammer or a shovel; but those grey-flannel guys are just sitting at desks. We’re paying them more; shouldn’t we be able to ask more of them?
The short answer is: no. In fact, research shows that knowledge workers actually have fewer good hours in a day than manual laborers do — on average, about six hours, as opposed to eight. It sounds strange, but if you’re a knowledge worker, the truth of this may become clear if you think about your own typical work day. Odds are good that you probably turn out five or six good, productive hours of hard mental work; and then spend the other two or three hours on the job in meetings, answering e-mail, making phone calls, and so on. You can stay longer if your boss asks; but after six hours, all he’s really got left is a butt in a chair. Your brain has already clocked out and gone home.
The other thing about knowledge workers is that they’re exquisitely sensitive to even minor sleep loss. Research by the US military has shown that losing just one hour of sleep per night for a week will cause a level of cognitive degradation equivalent to a .10 blood alcohol level. Worse: most people who’ve fallen into this state typically have no idea of just how impaired they are. It’s only when you look at the dramatically lower quality of their output that it shows up. Robinson writes: “If they came to work that drunk, we’d fire them — we’d rightly see them as a manifest risk to our enterprise, our data, our capital equipment, us, and themselves. But we don’t think twice about making an equivalent level of sleep deprivation a condition of continued employment.”
And the potential for catastrophic failure can be every bit as high for knowledge workers as it is for laborers. Robinson cites the follow-up investigations on the Exxon Valdez disaster and the Challenger explosion. Both sets of investigators found that severely overworked, overtired decision-makers played significant roles in bringing about these disasters. There’s also a huge body of research on life-threatening errors made by exhausted medical residents, as well as research by the US military on the catastrophic effects of fatigue on the target discrimination abilities of artillery operators. (As Robinson dryly notes: “It’s a good thing knowledge workers rarely have to worry about friendly fire.”)
“Passion,” De-Unionization, and the End of the 40-Hour Week
How did this knowledge, which was so deeply embedded in three generations of American business management that it was utterly taken for granted, come to be so lost to us now? There are probably several answers to that, but there are three factors in particular that stand out.
The first is the emergence of Silicon Valley as an economic powerhouse in the late 1970s. Since WWII, the valley had attracted a unique breed of worker — scientists and technologists who carried with them a singular passion for research and innovation. Asperger’s Syndrome wasn’t named and identified until 1994, but by the 1950s, the defense industries in California’s Santa Clara Valley were already drawing in brilliant young men and women who fit the profile: single-minded, socially awkward, emotionally detached, and blessed (or cursed) with a singular, unique, laser-like focus on some particular area of obsessive interest. For these people, work wasn’t just work; it was their life’s passion, and they devoted every waking hour to it, usually to the exclusion of non-work relationships, exercise, sleep, food, and sometimes even personal care. The popular stereotype of the geek was born in some real truths about the specific kinds of people who were drawn to tech in those early years.
The culture that grew up in the valley over the next few decades reflected and valorized the peculiarities of what Lockheed’s company psychologists were calling by the late ’50s “the sci-tech personality.” Companies broadened their working hours, so programmers who came in at noon and worked through till midnight could make their own schedules. Dress codes were loosened; personal eccentricities were celebrated. HP famously brought in breakfast every morning so its engineers would remember to eat. The local 24-hour supermarket carried microchips alongside the potato chips, so techies working in their garages could stop in at 2am for snacks and parts.
And then, in the early ‘80s, Tom Peters came along, and promoted the Silicon Valley work ethic to the rest of the country in the name of “excellence.” He extolled tech giants like HP and Apple for the “passion” of their workers, and told old-industry employers that they could move into the new age by seeking out and rewarding that kind of passion in their employees, too. Though Peters didn’t advocate this explicitly, it was implicitly understood that to “passionate” people, 40-hour weeks were old-fashioned and boring. In the new workplace, people would find their ultimate meaning and happiness in the sheer unrivaled joy of work. They wouldn’t want to be anywhere else.
There were two problems with this. The first is that this “passion” ideal didn’t recognize that the vast majority of people have legitimate physical, emotional and psychological needs — things like sleep, exercise, relaxation, and the maintenance of strong family and social support bonds — that these engineers didn’t have to nearly the same degree. The second was that most managers, lacking windows into their workers’ souls, decided to cut corners and measure passion with one easy-to-chart metric: “willingness to spend your entire life at the office.” (It was about this time, with gourmet company cafeterias and in-house fitness centers and on-site child care sprouting up on high-tech campuses all over town, that I realized that when a company works that hard to make the workplace feel like home, it’s a pretty strong hint that it doesn’t expect its employees to see their actual homes again.)
These were the early morning-in-America Reagan years. The unions — for 150 years, the guardians of the 40-hour week — were falling under a conservative onslaught; and in their place, the new cult of the entrepreneur was ascendant. All the old paternalistic contracts between employers and employees were torn up. Where companies once hoped to hire people young and nurture their careers through to a pensioned retirement — a lifelong relationship that required managers to take the long view about how to keep their workforces sustainably healthy and happy — young Gen Xers were being given a 401k and told to expect to change jobs every three to five years. Even while employers were demanding new levels of “passion” and commitment, they were also abdicating their old obligation to look after the long-term well-being of their employees.
The rapacious new corporate ethic was summarized by two phrases: “churn ‘em and burn ‘em” (a term that described Microsoft’s habit of hiring young programmers fresh out of school and working them 70 hours a week until they dropped, and then firing them and hiring more), and “working 90 hours a week and loving it!” (an actual T-shirt worn with pride by the original Macintosh team. Productivity experts estimate that we’d have probably had the Mac a year sooner if they’d worked half as many hours per week instead.) And this mentality soon spread from the technology sector to every industry in every corner of the country.
The new ideal was to unleash “internal entrepreneurs” — Randian übermenschen who would devote all their energies to the corporation’s success, in expectation of great reward — and who were willing to assume all the risks themselves. In this brave new world, the real go-getters were the ones who were willing to put in nights and weekends, who put their families on hold, who ate at their desks and slept in their cubicles. Forty-hour weeks were for losers and slackers, who began to vanish from America’s business landscape. And with their passing, we all but forgot all the very good reasons that we used to have those limits.
Within 15 years, everything America’s managers used to know about sustaining worker productivity was forgotten. Now, 30 years and a few economic meltdowns on, the cafeterias and child-care centers and gyms are mostly gone, along with the stock options and bonuses that were once held out as the potential reward for the long hours. All that remains of those heady, optimistic days is the mandatory 60-hour work-week. And, unless you’re an hourly worker — still entitled to time and a half by law — the only inducement employers currently offer in exchange for submitting yourself to this abuse is that you get to keep your job.
Can We Bring It Back?
Bringing back the 40-hour work-week is going to require a wholesale change of attitude on the part of both employees and employers.
For employees, the fundamental realization is that an employer who asks for more than eight hours a day or 40 hours a week is stealing something vital and precious from you. Every extra hour at work is going to cost you, big time, in some other critical area of your life. How will you make up the lost time? Will you ditch dinner and grab some fast food? Skip the workout? Miss the kids’ game this week? Sleep less? (Sex? What’s that?) And how many consecutive days can you keep making that trade-off before you are weakened in some permanent and substantial way? (Probably not as many as you think.) Changing this situation starts with the knowledge that an hour of overtime is a very real, material taking from our long-term well-being — and salaried workers aren’t even compensated for it.
There are now whole industries and entire branches of medicine devoted to handling workplace stress, but the bottom line is that people who have enough time to eat, sleep, play a little, exercise, and maintain their relationships don’t have much need of their help. The original short-work movement in 19th-century Britain demanded “eight for work, eight for sleep, and eight for what we will.” It’s still a formula that works.
For employers, the shift will be much harder, because it will require a wholesale change in some of the most basic assumptions of our business culture. Two generations of managers have now come of age believing that a “good manager” is one who can keep those butts in those chairs for as many hours as possible. This assumption is implicit in how important words like “productivity” and “motivation” are defined in today’s workplaces. A manager who can get the same amount of work out of people in fewer hours isn’t rewarded for her manifest skill at bringing out the best in people. Rather, she’s assumed to be underworking her team, who could clearly do even more if she’d simply demand more hours from them. If the crew is working 40 hours a week, she’ll be told to up it to 50. If they’re already at 50, management will want to get them in on nights and weekends, and turn it into 60. And if she balks — knowing that actual productivity will suffer if she complies — she won’t get promoted.
Of course, hiring new people is out of the question — again, especially when the workers are salaried. Squeezing extra time out of an employee when you’re not going to have to pay extra for it is seen as a total freebie by managers who cling to the delusion that they’re getting 50 percent more work in 50 percent more time. This belief also drives the fallacy that you can fire one person and divide their job between two other people, who will work an extra 20 hours per week for free — and that there is no possible downside to the company for doing this.
And, of course, that’s wrong.
And it hurts the country, too. For every four Americans working a 50-hour week, every week, there’s one American who should have a full-time job, but doesn’t. Our rampant unemployment problem would vanish overnight if we simply worked the way we’re supposed to by law.
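The arithmetic behind that four-to-one claim is simple enough to check. This is back-of-the-envelope only — the 50-hour week and the 40-hour full-time job are the figures from the text, and real labor markets are obviously messier than a ratio:

```python
# Back-of-the-envelope check of the claim above: four people each working
# ten hours of weekly overtime absorb exactly one 40-hour full-time job.
overtime_per_worker = 50 - 40       # extra hours each overworked person puts in
workers = 4
full_time_job = 40                  # hours in one standard full-time week

jobs_absorbed = workers * overtime_per_worker / full_time_job
print(jobs_absorbed)  # -> 1.0
```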
We will not turn this situation around until we do what our 19th-century ancestors did: confront our bosses, present them with the data, and make them understand that what they are doing amounts to employee abuse — and that abuse is based on assumptions that are directly costing them untold potential profits. We may have to appeal to the shareholders, whose investments are at serious risk when employees are overworked. (At least one shareholder suit has already been filed against a computer game company that was notorious for working its people 80 hours a week for years on end. It was settled out of court on terms favorable to the plaintiffs.) We may have to get harder-nosed in negotiating with our bosses when we first take the jobs, get our hours in writing up front, and then demand that they stick to the contract down the line. And we also need to lean on our legislators to start enforcing the labor laws on the books.
But the bottom line is: For the good of our bodies, our families, our communities, the profitability of American companies, and the future of the country, this insanity has to stop. Working long days and weeks has been incontrovertibly proven to be the stupidest, most expensive way there is to get work done. Our bosses are depleting resources from the human capital pool without replenishing them. They are taking time, energy, and resources that rightfully belong to us, and are part of our national common wealth.
If we’re going to talk about creating a more sustainable world, let’s start by talking about how to live low-stress, balanced work lives that leave us refreshed, strong and able to carry on as economic contributors for a full four or five decades, instead of burned out and broken by a too-early middle age. A full, productive 40-year career starts with full, productive 40-hour weeks. And nobody should be able to take that away from us, not even for the sake of a paycheck.
Sara Robinson is a Seattle-based cultural theorist and futurist with expertise in change resistance and authoritarian movements. She is on the board of the NARAL Pro-Choice America Foundation, and works with a variety of progressive organizations as a speaker, facilitator, and consultant.