Musings From a Coffee Shop

Derrick Rossi was born in 1966 in Toronto. He is the youngest child of Maltese immigrants who did not have a college education but worked hard to raise their five children. Rossi’s father worked in an auto body shop for fifty years and his mother was part owner of a Maltese bakery. Their son went on to develop an interest in molecular biology and made early discoveries in the field of messenger RNA. He founded ModeRNA Therapeutics in 2010 to commercialize his research.

Rossi never envisioned that his work would lead to vaccine development and he is no longer involved in the company. However, there’s no doubt that his early discoveries played a key role in making Moderna’s COVID-19 vaccine possible. Clinical trials indicate that the Moderna vaccine is 94.1% effective at preventing COVID-19 illness in people who received two doses and had no evidence of being previously infected. The FDA issued an emergency use authorization for the Moderna vaccine on December 18, 2020. The Pfizer-BioNTech vaccine, also utilizing mRNA technology, received an emergency use authorization on December 11, 2020.

Vaccine development was always a process measured in years, not months or weeks, and the most optimistic estimates early in the pandemic were that a vaccine might require a year to eighteen months to develop. But in a matter of weeks, Moderna had developed a vaccine candidate, and it rapidly entered clinical trials. The timeline is simply remarkable.

The federal government anticipates that all adults in the United States will be able to make appointments for vaccination by April 19, 2021, just two weeks from now. Both the Moderna and Pfizer vaccines require two doses and not everyone will be able to get immediate appointments. However, it is likely that everyone who wants a vaccine will have an opportunity to be fully vaccinated by the end of June.

One week after receiving the second Moderna vaccine shot, I am able to sit inside a coffee shop and write this article. That might not seem like a big deal, but it has been over a year since I last did something this mundane. And the mundane now seems like a milestone.

Slowly, but surely, the economy is reawakening from a long slumber. Over the weekend, outdoor dining was packed and more restaurants are opening up inside seating as well. Bolstered by stimulus payments, pent-up demand, and beautiful spring weather, people want to get back to their lives, and that’s a great thing to see.

Exactly a year ago, I observed that the world has no pause button, and it’s a useful personal exercise to reflect on how I thought the pandemic would evolve compared to what actually happened:

The speed and shape of the recovery is on everyone’s minds at this point, and much will depend on Keynes’s animal spirits. Will the boarded up businesses scattered through countless American cities open up again when governments give the all-clear to do so? Will the customers of these businesses be willing to again go out and spend money in person after weeks or months of self-isolation?

Much will depend on whether the coronavirus pandemic is viewed as a one-time event or as a potentially recurring feature of our lives going forward. If the pandemic is viewed as a horrible, but temporary, interlude in an era of prosperity, then the government’s efforts to induce a “medical coma” of the hardest hit sectors of the economy could well succeed. Sound businesses will emerge with muscle atrophy. Those that were weak even before the crisis may never reopen at all. But the overall system will rebound and regroup.

In early April 2020, few people believed that the lockdowns and related restrictions would last a year rather than weeks or months. I recall watching small businesses close, following the talk on Facebook and elsewhere, and the sentiment seemed to be that things would be back to normal by the summer, at the latest. The first stimulus bill had passed and the hope was that this temporary palliative would be able to staunch the bleeding for the time required to “stop the spread”. But the situation did not abate and the country endured month after month of constrained economic activity. The federal government responded with two additional stimulus bills funded by issuing trillions of dollars of new debt.

At the start of the pandemic, financial markets clearly did not anticipate that the federal government would step in as vigorously as it did. Yet here we are in early April 2021 with stock markets at record highs. Who would have believed that in March 2020 when I was writing about how to cope with the massive market meltdown that was underway?

I was wrong about many aspects of the pandemic, but I was correct to not panic when stocks crashed. I don’t know how other investors handled that crash mentally, but I anchored on the actual businesses that are represented by the ticker symbols I own. For example, I took the time to examine Berkshire Hathaway as a business and tried to understand how the shutdowns would affect the company’s subsidiaries. I did the same for other companies that I own.

Make no mistake about it, I understood that, with few exceptions, all owners of American businesses were poorer due to the pandemic. It would be delusional to think that my portfolio had not declined in intrinsic value terms. The question was whether the market was appraising the situation accurately or acting with emotion. As usual, stock market participants reacted more emotionally than an owner of a privately held business that has no market quote. In panics, quotes are an emotional burden for those who do not understand the intrinsic value of what they own.

Conviction has to be built up during ordinary times if you want to fortify yourself mentally for the tough times. A market crash is not the time to begin to study the intrinsic value of your holdings. I made one major portfolio change during the pandemic, not because the price of the stock in question had declined but because my conviction in the business declined. Of course, I had studied the company in depth prior to owning it. But in retrospect, I did not have enough conviction to own that business at all. This lack of conviction did not manifest until tested by a crisis.

So where do we go from here? If we assume that all adults in the United States who want a vaccine can be fully vaccinated by the summer, will things go back to normal at that time?

I suspect that we will have several more months of restrictions before we are truly back to normal, but the reality is that no one can really predict the trajectory of the rest of the year. I’m not going to post any polls on vaccine acceptance because they seem to be changing all the time, but it is quite clear that a significant percentage of Americans view the vaccines with suspicion and may not be willing to get the shots. In May 2020, I wrote about the politicization of masks, and the discourse regarding vaccines seems to be developing in the same way. Talk of “vaccine passports” and coercive measures to obtain compliance is likely to backfire.

Society has an interest in maximizing the percentage of the population accepting the vaccines because we want to reach “herd immunity” — the point at which the COVID virus will not find enough susceptible people to remain a threat to the population at large. Convincing as many people as possible to accept vaccinations will reduce the time required to reach herd immunity. However, in a free society, government should not compel people to accept vaccination through coercive methods.

A key question remains whether unvaccinated people represent a direct threat to vaccinated people. The CDC’s latest recommendations indicate that vaccinated people should continue taking precautions but can gather in small groups with other vaccinated people as well as with unvaccinated people who are not at risk of severe illness from COVID:

You can gather indoors with fully vaccinated people without wearing a mask or staying 6 feet apart. You can gather indoors with unvaccinated people of any age from one other household (for example, visiting with relatives who all live together) without masks or staying 6 feet apart, unless any of those people or anyone they live with has an increased risk for severe illness from COVID-19.

When you’ve been fully vaccinated – retrieved on April 6, 2021

Will unvaccinated people pose any risk to vaccinated people? If not, why should vaccine passports be required? Should masks be required after everyone who wants to be vaccinated has been able to get the vaccine?

These are the key questions that should guide policy in the months ahead. If the vaccine is available to everyone who wants it, we should be in a position to return to normal. It is reasonable for people to expect society to take precautions to protect them if there is no vaccine available, but not reasonable to expect continued restrictions if a vaccine is an option.

As I look back over the past year, I am amazed by the advances in science that led to vaccine development in record time. But I am dismayed by the politicization of the pandemic and the great divide between Americans. When political party affiliation is so tightly correlated with issues such as masks and vaccines, something has gone terribly wrong in our national discourse.

Politicians need to focus on getting society back to normal as soon as possible and resist the temptation to make sure a crisis never “goes to waste”. The pandemic has cast a spotlight on many longstanding problems in America, but longstanding problems deserve reasoned debate and deliberation outside the context of an emergency.

It’s hard to be optimistic about politics, but right now it is hard to be a total pessimist on a nice spring day as I sit inside a coffee shop for the first time in thirteen months.

Goldman’s Infamous Boot Camp

Working from home has traditionally been viewed as a benefit for employees who are looking for a better work-life balance. However, this has not been the case for junior employees of Goldman Sachs who are pleading for the luxury of working only eighty-hour weeks. The pandemic triggered an exodus from office buildings but it brought no relief from the relentless hours. In fact, the lack of definition between home and work life made the situation even worse!

Goldman employees are hardly a sympathetic group. After all, the carrot at the end of the very long stick is a lucrative career that promises financial freedom at a relatively young age and everyone knows that investment banking is notorious for long hours.

They signed up for this.

This is the traditional view of long hours, especially on Wall Street. Why do the bosses expect and demand it? Mostly because they went through the same thing as junior employees, and by God, so will the new generation of underlings! Traditionally, this has meant long hours at the office with plenty of face time with the boss, but in a pandemic era of working from home, the same expectations exist for junior employees to be available on a 24/7 basis remotely.

So much for a work-life balance. You can sleep when you’re dead.

Given the obvious dysfunction of expecting people to be productive for sixteen to twenty hours a day, it is tempting to dismiss this practice as self-evidently stupid, but any tradition that has persisted for decades deserves more careful examination. Are there any benefits?

The military is famous for putting new recruits through hell in the form of boot camps. There is no private time, no individualism, and the idea of a work-life balance is completely foreign. The goal is to indoctrinate recruits into the military way of life and, more importantly, to establish a set of shared experiences that will build unit cohesion in the long run. The United States Marine Corps has a motto of “Earned. Never Given.” Those who finish boot camp have earned their place in the Marines and know that their fellow Marines have also earned it.

But Goldman Sachs isn’t the United States Marine Corps, and the ultimate goal is making money, not serving and protecting the country.

This is not to say that investment banking does not serve society — it does, not only in theory but in practice, by helping to facilitate the allocation of capital that is vital in a market economy. There is obviously a sense that “recruits” must prove themselves to be totally dedicated to the firm and must pay their dues in order to earn their place at higher levels. And you can bet that these new recruits in 2021 will remember their experiences and impose the same thing on their underlings in the 2030s and 2040s.

But is any of this useful or productive?

Other than a brief internship in a brokerage firm in college, I have no professional exposure to the finance industry. But I did have extensive experience in the startup culture of Silicon Valley during the 1990s and 2000s. The culture of Silicon Valley during that timeframe had much in common with Wall Street, at least in terms of working hours.

In technology startup culture, working insane hours was the norm, not the exception, and eighty-hour weeks were common. This typically took the form of twelve- to fourteen-hour days, six days per week, with plenty of cases where even more was necessary. Or at least we thought it was necessary.

What happens when you put a young software engineer in front of a screen for twenty hours a day? At first, you get a lot of productivity, or at least what you think is productivity. You get many lines of code written in a frenzy and most of it even seems to work, at least most of the time. But 10 pm comes along and that additional feature still isn’t working quite right, so the pizza arrives. Soon it is 2 am, then 4 am, and you’ve pulled an all-nighter, but now the feature you were working on finally works. At least it seems to.

In a start-up culture, there is pressure to write code in order to quickly show results and ship products, but you quickly discover that there is a point where additional hours stop producing incremental benefits and start to create incremental harm.

I cannot count the number of cases where bugs in source code that were introduced during all-night coding sessions ended up costing enormous amounts of time and money to rectify later. Lack of sleep and the desire to just get … it … done … is not conducive to quality work, at least not for most people.

There are exceptions, but most human beings cannot function well for very long in a culture like this. You might be able to bootstrap a startup working insane hours for a period of time but this is not a good way to build a team that will stay together for many years. From what I have read recently, there are some signs that the larger technology firms have begun to recognize the self-defeating nature of a culture that requires crazy hours, but I am sure that startup culture has not changed very much.

Looking back at the years of sixty- and eighty-hour weeks, it is clear that they were hardly worthwhile, not only because they made a decent work-life balance completely impossible but because they were often counterproductive and useless. Having been through this culture myself, I expected it from employees who reported to me and I thought that continuing to work long hours myself would set the right example. Instead, it probably sent the message that long hours would never relent and that if you want a better work-life balance, you should look elsewhere.

“I did it, so you have to do it” is not exactly an intelligent way to structure a team, but it seems to be the way things are still done in many industries today.

The Price of Misery

Twitter allows users to run polls. Anyone can respond. In practice, most of the responses will come from your followers, and the type of followers you have will obviously impact the results. It is quite clearly not a random sample of the general population, but instead a sample of your followers at a moment in time. A respondent’s answer is not publicly visible, which fosters honesty. It takes a second to vote in a poll, and what you’re capturing is a gut reaction rather than considered thought. And that gut reaction depends on the life experiences of the person responding.

Over the weekend, I ran a Twitter poll that drew 1,210 responses.

There is not much room for nuance in a 280 character tweet. The wording probably encouraged people to answer based on their own personal situation and the cost of their lifestyle. Given that most of my followers are in the “fintwit” community, we are talking about a relatively affluent group.

The point of the survey was to explore the trade-offs between money and happiness. If you are spending two thousand hours per year doing your job, that is nearly 23 percent of the total number of hours in a year, and an even greater percentage of your waking hours.

What is the price you’re willing to pay to be happy during those working hours?

Ignoring tax considerations and looking just at gross income, the incremental cost of happiness over boredom is $37.50 per hour. The cost of boredom over misery is $75 per hour.
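The arithmetic behind those per-hour figures is easy to sketch. Here is a minimal calculation, assuming the poll offered a $75,000 job you love, a $150,000 job that bores you, and a $300,000 job you hate; the $150,000 middle option is inferred from the per-hour differences rather than stated in the poll itself:

```python
# Back-of-the-envelope math behind the poll trade-offs.
# The $150,000 "bored" salary is an inferred middle option.

HOURS_PER_YEAR = 2000       # roughly 40 hours/week for 50 weeks
TOTAL_HOURS = 24 * 365      # 8,760 hours in a year

salaries = {"love": 75_000, "bored": 150_000, "miserable": 300_000}

# Work's share of all hours in a year (the "nearly 23 percent" figure).
share_of_year = HOURS_PER_YEAR / TOTAL_HOURS
print(f"Work consumes {share_of_year:.1%} of all hours in a year")

# Incremental gross pay per working hour given up for a happier job.
happiness_premium = (salaries["bored"] - salaries["love"]) / HOURS_PER_YEAR
boredom_premium = (salaries["miserable"] - salaries["bored"]) / HOURS_PER_YEAR

print(f"Happiness over boredom costs ${happiness_premium:.2f} per hour")
print(f"Boredom over misery costs ${boredom_premium:.2f} per hour")
```

Dividing each salary gap by two thousand working hours reproduces the $37.50 and $75 hourly figures.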

I chose $75,000 per year as the baseline job because it exceeds the median family income in the United States. Although it might seem like a small sum to many people, how many Americans would jump at a chance to work at a job that they love that pays more than the median family income?

It depends on your stage in life and the cost of your current lifestyle. If you are 22 years old and just starting out, you will answer differently than if you are 50 years old with a mortgage and three children who are about to enroll in college. As we get older and accumulate more fixed expenses, our lifestyle decisions ossify.

Some younger people might prefer the $300,000 job because they figure that they can work for a few years in a job they hate and save money before switching to the $75,000 job that they will love.

There are at least two problems with this idea: First, you’re accepting misery early in your career in exchange for money which will breed all kinds of negative attitudes toward work and you will not gain any traction in the field that you eventually want to enter. Second, your lifestyle is likely to ratchet up to consume most of the $300,000 income. Why? Because your friends and coworkers are likely to also be affluent and you will rationalize that you should “treat yourself” because, after all, you deserve it. You hate your job so you want to enjoy your free time.

Soon enough, you’re trapped because your lifestyle will no longer permit switching to the career you love.

Financial security is extremely important but it has more to do with your lifestyle decisions early in life than maximizing income. Trading misery for money seems like a terrible way to get ahead because the temporary has a way of becoming permanent.

Of course, the ultimate solution is to work in a career you love that also pays extremely well — but the best of both worlds is not always possible.

As Seneca said, “Life is long if you know how to use it.”

What We Have Lost

“Let’s zoom really soon!”

So said the older man loudly into his cell phone at the conclusion of a call that everyone in the waiting room could overhear. He was using zoom as a verb, just as we might talk about googling something on the internet or xeroxing a document. Before the pandemic, videoconferencing was mostly restricted to dreary business meetings in which people hundreds or thousands of miles apart could speak over each other in what everyone realized was a poor substitute for being in the same room. But the pandemic made zoom into a verb and people started using it to keep in touch with family as well as colleagues.

Better than nothing, right?

It certainly is better than nothing, for now.

We have Zoom, FaceTime, and many other ways to interact with family and friends we haven’t been able to see in person over the past year, and the fact that many businesses could operate remotely softened the economic blow of the pandemic.

Like everyone else, I have used Zoom and FaceTime more over the past year than ever before. I used them to communicate with family, attend virtual conferences, and even take music lessons. The technology has advanced greatly over the past decade and increased bandwidth allows for lower latency in video and audio. But it is still hardly the same — try talking to smaller children over FaceTime or dealing with the latency issues when playing musical instruments. It is still far inferior to being in the same room.

The benefits of being in the same physical location used to be taken for granted. This is obviously true for friends and family but also in professional settings. The camaraderie of working alongside colleagues with shared objectives and goals is an important part of a cohesive workplace. So are the serendipitous encounters you might have with people in an office setting that just aren’t going to happen over Zoom.

There’s also a huge difference between maintaining existing relationships over videoconferencing and establishing new ones. You can take a team that has functioned well for many years and continue for a period of time working remotely. But what happens when you bring new people on board? You lose the shared set of experiences and sense of purpose that can only come from real life interaction. It is by no means impossible to onboard new people remotely, but it is certainly harder.

Understandably, many workers who had terrible commutes or were living in tiny apartments just to be close to work might relish the thought of permanently working from home. Many people have even abandoned cities for the suburbs or smaller cities during the pandemic thinking that they could continue working remotely for their current employer permanently. There is no doubt that some people will find that a better work-life balance can be struck away from the office environment, but plenty of people are going to find that they will actually work harder than ever before because there is no longer a clear physical separation between their work and home lives. And too much togetherness can be a problem in some families.

In the late 1980s, Ray Oldenburg coined the term “third place”, which refers to social environments separate from one’s home and work life. A third place might be a setting such as a bar, coffee house, library, church, bowling alley, music venue, or a bookstore. Oldenburg believed that having third places is an important element for human beings to fully engage in civic society. Think of your pre-pandemic life and chances are that you had one or more third places that you visited regularly. I know that I did.

For many people, the pandemic robbed us not only of those third places but also of the second place, our workplace, leaving us with the home as the only environment.

I believe that this is very dysfunctional and problematic in the long run.

But I admit to being biased.

I like urban areas, the more urban the better, and have always viewed the bustle of cities as a representation of economic activity and human achievement. I won’t go so far as to say that I enjoyed packed subway cars, but I find the deserted subways of the pandemic even more disturbing.

The talk of the decline of cities after the pandemic strikes me as a terrible development for humanity and I hope that people will again return to offices. But this might be a minority opinion. The conventional wisdom certainly seems to favor a permanent shift away from cities and offices — a secular trend that might have many unintended consequences.

The Use of Letters

Great teachers are very rare. The best in the profession have an infectious enthusiasm for their subject and view their work as a higher calling rather than just a way to earn a paycheck. I was lucky enough to have great teachers in business, which sparked my interest in finance and investing. But I was not so lucky in subjects such as history and literature. I did what was needed to scrape by in high school and in college I became an expert at acing tests, but going through the motions doesn’t result in any lasting benefit.

I still remember my first Berkshire Hathaway annual meeting in 2000. I kept wondering about “the other guy” up there on the stage with Warren Buffett. His wisdom was obvious from the brief comments he made. Buffett and Munger are both great teachers, but Munger’s interests are broader and extend way beyond the business world. I am not sure where I first read the following Munger quote, but it made quite an impression:

“In my whole life, I have known no wise people (over a broad subject matter area) who didn’t read all the time — none, zero. You’d be amazed at how much Warren reads–and at how much I read. My children laugh at me. They think I’m a book with a couple of legs sticking out.”

Charlie Munger

The point is pretty simple. If you want to be wise, you must read. And you must read widely. And you must read constantly.

No excuses.

Why is this the case?

It is not complicated: Human beings live a finite life and we are products of the times in which we live. If our entire view of the world is based only on what we see and observe during our life, we have failed to understand and benefit from all of the life experiences of the people who came before us.

What could be more breathtakingly stupid than to insist on relearning all of the hard lessons prior generations had to figure out for themselves?

During my formal education, I was in a hurry and I only cared about really learning subjects that were in some way related to business and investing. History, literature, and the humanities flew by in a blur as I did what was needed to ace tests and promptly erased what I learned from memory after finals.

This experience was the polar opposite of the St. John’s College great books program. But it is never too late to begin and in recent years I have spent an increasing amount of time focusing on reading.

The Decline and Fall of the Roman Empire by Edward Gibbon is, of course, one of the great books, and I have been working my way through this six-volume classic. When I came across the following passage, I immediately thought of Charlie Munger:

The Germans, in the age of Tacitus, were unacquainted with the use of letters; and the use of letters is the principal circumstance that distinguishes a civilised people from a herd of savages incapable of knowledge or reflection. 

Without that artificial help, the human memory soon dissipates or corrupts the ideas intrusted to her charge; and the nobler faculties of the mind, no longer supplied with models or with materials, gradually forget their powers; the judgment becomes feeble and lethargic, the imagination languid or irregular. 

Fully to apprehend this important truth, let us attempt, in an improved society, to calculate the immense distance between the man of learning and the illiterate peasant. The former, by reading and reflection, multiplies his own experience, and lives in distant ages and remote countries; whilst the latter, rooted to a single spot, and confined to a few years of existence, surpasses, but very little, his fellow-labourer the ox in the exercise of his mental faculties. 

The same, and even a greater, difference will be found between nations than between individuals; and we may safely pronounce that, without some species of writing, no people has ever preserved the faithful annals of their history, ever made any considerable progress in the abstract sciences, or ever possessed, in any tolerable degree of perfection, the useful and agreeable arts of life.


Individuals who do not read leave all of the lessons of human history on the table. They insist on learning everything for themselves rather than vicariously. This is what Gibbon observed when he wrote this passage in the late eighteenth century, and his words ring just as true today.

But more troubling than the effect on any individual is the effect on a society whose population is oblivious to history. Such a society will fall victim to the same pitfalls that plagued people who did not even know how to read or write!

As Mark Twain said, a person who won’t read has no advantage over one who can’t read.

And a society that fails to learn from history is doomed to repeat many of the same mistakes.

Obviously, our early twenty-first century society is mostly literate and never before has more information been available to more people than today. But the availability of information is of no value if people fail to take advantage of it. Clicking on the latest tabloid headline or clever meme tweet does absolutely nothing other than indulge the endorphin-boosting practice of context switching. We flit from one headline to the next, which fosters the illusion that we are becoming more informed even as we become more blind through an ignorance of the past.

Perhaps this is too negative. Clearly, a subset of society is very much interested in learning from the past and leveraging all that the internet has to offer. Charlie Munger jokes about the “cultists” who come to hear him speak, but the fact that the vast majority of his followers are on a real quest for wisdom is no joke at all. There are people who have seen the light when it comes to learning from the past, but I suspect the percentage of people who fall into this category is in the low to mid-single digits at best.

