From The Jetsons to Juicero: making the world a more comfortable place
For the last decade, technology has promised to ‘make the world a better place’. But is it delivering? Let's take a look at some products and services that have received substantial media attention and funding since the early 2010s to find out:
Yo allowed users to send the word "Yo" to their contacts.
Bodega, later renamed Stockwell, aimed to replace corner stores by installing vending machines in residential buildings (ideal for convenient toilet paper restocking; those machines spared me some pretty desperate convenience store runs when I lived in San Francisco).
Juicero sold a $400 juicer that was essentially a glorified bag-squeezer.
Thync developed a wearable device that claimed to induce relaxation or stimulation through electrical signals sent to the brain.
LuminAID created a solar-powered inflatable lantern.
Pavlok is a wearable device that delivers electric shocks to users when they engage in "bad habits" like biting their nails or oversleeping.
Cuddlr, later renamed Spoonr, allowed users to find people in their area to cuddle with.
Airpnp allowed people to rent out their bathrooms to strangers for a fee.
Hater is a dating app that matches people based on their shared dislikes.
Cicret promised to create a bracelet that would project a touchscreen onto your skin, allowing you to control your phone without actually touching it.
Amidst the deluge of tech discoveries and inventions promising utopia, the reality remains underwhelming. The majority of the list addresses niche needs. For many entrepreneurs, the tongue-in-cheek aim is to supply everything their college campuses (or mothers) no longer provide—an infantilized version of The Jetsons' future. When everything is dubbed "world-changing," what truly is?
Clay Tarver, writer and producer of HBO's satirical "Silicon Valley," notes some big tech companies have forbidden employees from saying "We're making the world a better place." Tarver jests, "At the very least, we're making the world a better place by making these people stop saying they're making the world a better place."
If you think the list of past flops might be biased, you may have a point. However, hindsight is always twenty-twenty. In reality, even recent tech trends are rife with startups suffering from a Napoleon Complex. Economist Robert J. Gordon contends that today's AI, which focuses on pattern recognition, isn't as groundbreaking as electricity or the internal combustion engine, despite what technologists and investors claim. Don't expect a world-shattering revolution to happen anytime soon.
We're in a comfort crisis. The brightest minds in science and technology, from founders to influential investors, seem preoccupied with hype, external validation, and conformity. Prioritizing venture capital over sustainable businesses, chasing trends heedless of unique needs, and surrounding oneself with "yes" people rather than seeking critical feedback and diverse perspectives are all misguided endeavors.
Countless talented individuals opt for safe, incremental ventures, such as SaaS companies and cookie-cutter consumer startups. Although these pursuits may yield some benefits, they ultimately neglect crucial human challenges, including existential ones. This not only stifles progress but also represents a missed opportunity for gifted individuals to leave a lasting impact on the world. In essence, for any exceptionally capable person, shirking a noble quest isn't just neutral—it's a net negative for society.
Velocity is everything
A worthy endeavor should target a problem substantial enough to rightfully claim social impact credentials. Tough challenges with a low chance of success, yet a social promise enticing enough to overlook operational complexities—think artificial general intelligence (AGI), curing Alzheimer's disease, sustainable nuclear fusion, or even building a network state.
However, just because the ‘mission’ is ‘impossible’ doesn’t mean it’s worth pursuing – like “shaping the future of web3 through storytelling, experiences, and community”, whatever all of that means. The difficulty axis is just one piece of the puzzle and a somewhat one-dimensional perspective on problems. Hard missions without an angle are an exercise in style at best, like the war on cancer, and pure fugazi at worst, like Theranos.
The need for ambitious endeavors to improve the world is driven by technological velocity, often measured by the degree of operating leverage – the ratio of fixed to variable costs, which determines how much a company’s operating income changes in response to a change in revenues. Unit economics tells the true story of broad adoption. It’s either that or knowing a special secret – a glimpse of truth at a technological level that can flip the game on its head.
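To make the leverage point concrete, here is a minimal sketch with made-up numbers (the businesses and figures are illustrative, not from the text). The degree of operating leverage is contribution margin divided by operating income – roughly, how many percent operating income moves for every 1% move in revenue:

```python
def degree_of_operating_leverage(price, variable_cost, volume, fixed_costs):
    """DOL: % change in operating income per 1% change in revenue."""
    contribution = (price - variable_cost) * volume
    operating_income = contribution - fixed_costs
    return contribution / operating_income

# Hypothetical software business: high fixed costs, near-zero variable costs.
software = degree_of_operating_leverage(price=100, variable_cost=5,
                                        volume=10_000, fixed_costs=700_000)
# Hypothetical hardware reseller: low fixed costs, high variable costs.
reseller = degree_of_operating_leverage(price=100, variable_cost=80,
                                        volume=10_000, fixed_costs=50_000)
print(round(software, 2))  # 3.8  -> each 1% of revenue growth moves income ~3.8%
print(round(reseller, 2))  # 1.33 -> income grows barely faster than revenue
```

The high-fixed-cost business compounds much faster as revenue grows, which is exactly the velocity the paragraph above is pointing at.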
Moonshot builders can be grouped into two types: those with in-depth knowledge and understanding of technology, research, and fieldwork, and those who excel in execution and operations. The first group discovers bold breakthroughs, while the second accelerates development and ensures they become a reality. Both roles are critical in meeting the needs of society, and it's important to recognize one's unique strengths within this framework. Your author is an action-taker.
As a non-scientist without deep technical expertise, I have very little to say about the process of scientific insight generation. But I am interested in both very large and very small patterns – how markets and human behavior change, as well as how the unit economics of a product, at its most atom-like level, can narrate the story of technological velocity.
When it comes to resource allocation – whether that is money, time, or energy – velocity is everything.
Three case studies
Exceptional quests are those that address significant societal issues while showcasing remarkable velocity. While assessing a quest's potential impact might be simple, understanding velocity is more complex. Velocity represents a learning factor, indicating the pace of change. It serves as a proxy for decreasing adoption costs, which are crucial for turning breakthrough technologies into widely adopted products.
The relationship between experience – measured as the cumulative installed capacity of the technology – and the price of that technology is called the learning curve of that technology. The relative price decline associated with each doubling of experience is the learning rate of a technology.
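That definition translates directly into a formula. A minimal sketch, assuming a 20% learning rate (i.e. every doubling of cumulative production cuts the price by 20%):

```python
import math

def learning_curve_price(initial_price, cumulative_units, learning_rate):
    """Price after cumulative experience grows from 1 unit to `cumulative_units`,
    with the price falling by `learning_rate` at each doubling of experience."""
    exponent = math.log2(1 - learning_rate)  # negative for a positive learning rate
    return initial_price * cumulative_units ** exponent

# Assumed 20% learning rate, starting price of 100:
print(round(learning_curve_price(100.0, 2, 0.20), 1))  # one doubling   -> 80.0
print(round(learning_curve_price(100.0, 8, 0.20), 1))  # three doublings -> 51.2
```

Note that the driver is cumulative experience, not time: the curve only bends if production keeps doubling.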
That more production leads to falling prices is not surprising – such ‘economies of scale’ are found in many corners of manufacturing. If you are already making dinner, it isn’t that much extra work to accommodate an extra guest.
For over six decades, the concept of technological learning has suggested that a technology's performance enhances as experience accumulates. I argue that the best way to anticipate all future technological advancements is to concentrate on paradigms that exhibit learning curves.
There’s no shortage of examples of innovations that benefitted from technological learning curves. LED lighting has emerged as an affordable, energy-efficient alternative to traditional bulbs. Electric vehicles now offer a price-competitive substitute for gasoline cars.
Most technologies obviously do not follow a learning curve – the prices of bicycles, fridges, or coal power plants do not decline exponentially as we produce more of them. But those that do – like computers, solar PV, and batteries – are the ones to look out for. Initially, we might only find them in very niche applications, perhaps on a high-tech satellite out in space, but a few decades later they are everywhere. The future belongs to them.
To get our expectations about the future right, I argue that today there are three important areas where upstream technological paradigms are exhibiting learning curves in their unit economics – semiconductors, cancer detection, and solar batteries.
Moore’s Law and Semiconductors
In the 1960s, Intel co-founder Gordon Moore observed that the number of transistors in an integrated circuit doubled at a clock-like pace, leading to greater processing power at the same cost. To Moore, there was no reason that progress couldn’t continue. And there hasn’t been.
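The compounding is easy to underestimate. A back-of-the-envelope sketch, assuming the commonly cited two-year doubling period and the roughly 2,300 transistors of the Intel 4004 from 1971:

```python
def transistor_count(start_count, years, doubling_period=2):
    """Exponential growth: the count doubles every `doubling_period` years."""
    return start_count * 2 ** (years / doubling_period)

# Fifty years of doubling every two years, starting from the Intel 4004:
print(f"{transistor_count(2_300, 50):,.0f}")  # 77,175,193,600
```

Twenty-five doublings turn a few thousand transistors into tens of billions – the same order of magnitude as today's flagship chips.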
As semiconductors became cheaper and more efficient, they penetrated ever more aspects of our daily lives. By and large, they continued to drive improvements in things we already use, from phones to cars to appliances. And of course, there’s ChatGPT, which you’d have to be living under a rock at this point not to have heard of. Ask it anything and it’ll very confidently tell you… well, something.
Morris Chang, known as the Godfather of Taiwan's chip industry, developed this concept into a revolutionary pricing model called 'learning curve pricing' or 'experience curve pricing,' which involved setting chip prices below initial costs to maximize production volume. As a result, he captured market share, keeping production lines running at full capacity and driving yields up faster.
While superlatives cannot adequately describe the profound effects of the semiconductor industry on human progress, the expanding use of semiconductors is a clear sign of the fundamental importance of this industry that stood the test of time.
The minuscule transistorized computer chips made from semiconductors have become the lifeblood of our 21st-century information society, just as oil was to the industrial world of the 20th century. Without the ubiquitous electronic infrastructure powered by computer chips today, virtually nothing would run. Chip manufacturing may be the most complex, process-controlled industry the world has ever seen.
Moore’s Law accurately charted the pathway in size reduction, and thus in performance, for those transistors. Investors, producers, system manufacturers, and consumers alike benefited from the Swiss watch-like regularity of computer chip product cycles. Like no other business in the world, the semiconductor industry had an accurate, proven roadmap for its future… with clear technological velocity, orchestrated to the rhythm of Moore’s Law.
Flatley’s Law and early cancer detection
Cancer continues to be a pressing public health issue, with mortality rates decreasing by merely 19% since 1990, compared to a decline of over 50% for cardiovascular diseases. The COVID-19 pandemic has further exacerbated the urgency to address cancer due to missed screenings and diagnoses affecting millions. Nonetheless, advancements in diagnostic and therapeutic innovations offer the potential to significantly reduce cancer mortality rates.
Early detection is paramount, as cancer is a progressive disease, and advanced tumors account for a disproportionate number of deaths. Molecular testing is essential for precision therapy because it identifies tumor-specific mutations that guide oncologists in selecting the most effective treatments.
Approximately two decades ago, when the human genome was first sequenced, it took an international team of researchers led by the National Institutes of Health (NIH) about 13 years and over $2.7 billion to accomplish the task. Today, a human genome can be sequenced in roughly a day for around $500, and the cost is anticipated to soon drop to $100. Flatley's Law (named after Jay Flatley, Illumina's former CEO), an analog to Moore's, demonstrates even faster advances and more significant cost reductions.
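Plugging the figures above into a quick calculation shows why this outpaces Moore's Law, whose two-year doubling cadence is equivalent to halving the cost per transistor roughly every two years:

```python
import math

# Figures from the text: ~$2.7 billion circa two decades ago, ~$500 today.
start_cost, end_cost, years = 2.7e9, 500.0, 20

halvings = math.log2(start_cost / end_cost)
print(round(halvings, 1))           # ~22.4 halvings of the cost
print(round(years / halvings, 2))   # the cost halved roughly every 0.89 years
```

A halving time of under a year, sustained for two decades, is more than twice as fast as Moore's cadence.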
Flatley's Law highlights the declining cost of cancer detection tests, such as multi-cancer early detection (MCED) tests, which can identify multiple cancer types from a single blood draw. These tests have the potential to decrease cancer mortality rates by 15% at a $500 reimbursement price – the real turning point for widespread adoption.
Reimbursable tests are the Schelling point of cancer detection. With falling prices and continuously improving accuracy, MCED tests can become part of a routine check-up for most people and avert an unprecedented number of cancer-related deaths.
Wright’s Law and solar batteries
To reduce emissions, the world needs to rapidly transition towards a low-carbon energy system. Around three-quarters of global greenhouse gas emissions come from energy and industry. One of the barriers to this energy transition has been the relative cost of different energy sources: fossil fuels were cheaper than renewables and therefore became the dominant sources of energy.
Thankfully, this is changing quickly. The cost of renewable technologies has plummeted – they’re now cost-competitive with, or cheaper than, new fossil fuel plants. In 2009, electricity from a new solar plant was more than three times as expensive as from a new coal plant. Now the script has flipped, and a new solar plant is almost three times cheaper than a new coal one. The price of electricity from solar declined by 89% between 2009 and 2019.
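An 89% decline over a decade compounds into a startling annual rate; a one-liner to check:

```python
# 89% price decline over 2009-2019 (figures from the text):
decline, years = 0.89, 10
annual_drop = 1 - (1 - decline) ** (1 / years)
print(round(annual_drop, 3))  # ~0.198, i.e. roughly a 20% price drop every year
```

Very few products in any industry get a fifth cheaper every single year for a decade.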
But the cost of electricity technologies themselves is only part of what matters for this transition. One of the challenges that renewables face is that they produce energy intermittently. The sun doesn’t always shine and the wind doesn’t always blow and so we don’t get a steady flow of generation throughout the day. An obvious solution is to store excess energy and then release it later. But to do so, we need lots of energy storage and this adds large costs to our energy system.
The price of lithium-ion battery cells declined by 97% in the last three decades. A battery with a capacity of one kilowatt-hour that cost $7,500 in 1991 was just $181 in 2018 – 41 times less. What’s promising is that prices are still falling steeply: the cost halved between 2014 and 2018. A halving in only four years.
To put this in perspective: the popular Nissan Leaf electric car – which is also one of the most affordable models – has a 40 kWh battery. At our 2018 price, the battery costs around $7,300. Imagine trying to buy the same model in 1991: the battery alone would cost $300,000.
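The battery figures above imply a remarkably steady decline. A quick sketch using the numbers from the text:

```python
import math

# Figures from the text: $7,500 per kWh in 1991, $181 per kWh in 2018.
cost_1991, cost_2018 = 7500.0, 181.0
years = 2018 - 1991  # 27 years

print(round(cost_1991 / cost_2018))  # 41 -> ~41x cheaper
annual_drop = 1 - (cost_2018 / cost_1991) ** (1 / years)
print(round(annual_drop, 3))         # ~12.9% cheaper every year, on average
halving_time = math.log(2) / -math.log(1 - annual_drop)
print(round(halving_time, 1))        # the cost halves roughly every 5 years
```

An average halving time of about five years – with the most recent halving taking only four – is consistent with a learning curve that is, if anything, steepening.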
What this means is that batteries have been getting smaller and lighter for any given electrical capacity. You might have noticed this yourself as mobile phones got lighter and slimmer. This is a crucial improvement, because one of the major drawbacks of some battery technologies is their weight, which limits their use in many applications that are still fossil fuel powered.
Making innovation irresistible
The history of technology is shaped by groundbreaking innovation reaching widespread adoption. But the vast majority of “world-changing” ventures functionally amount to economic noise. To achieve double-digit GDP growth, cure cancer, and halt environmental decay, we need bold technology with straightforward economics. Bull markets, abundant venture capital funding, and the Zuckerberg-esque mythology of the rebel founder have muddied the waters of the second half of this formula – celebrating outsized ambition as not only a necessary but also a sufficient condition. This is totally backwards.
Moreover, one of the very worst misconceptions about technological paradigms is that they happen overnight. They don’t. Societal technological revolutions are exceedingly rare and the implementation challenges are much larger than the technical considerations alone.
The chronicle of technological progress is interspersed with setbacks, attributable to diseconomies of scale and the failure to achieve the learning curves that lower prices and promote adoption. Nuclear power serves as a prime illustration of this phenomenon.
In many places, building a power plant has become more expensive over time. This is of course very unfortunate, since nuclear is both a low-carbon source of electricity and one of the safest sources of electricity. One reason for rising prices is the increased regulation of nuclear power. A second reason is that the world has not built many nuclear power plants in recent years, so supply chains are small, uncompetitive, and not benefiting from economies of scale. Learning curves, after all, mean transferring the knowledge gathered in one instance to another. No repetition, no learning.
Reducing the costs of innovative technological paradigms is crucial for achieving widespread adoption. The technical advantages of MCEDs compared to traditional biopsies or renewable energy over fossil fuels remain largely insignificant to society unless these innovations can achieve cost parity with their alternatives. The key indicators of progress lie in unit economics.
If you are actively involved in, investing in, or simply passionate about the technology industry, I encourage you to take a step back and carefully evaluate your work at a 10-minute grocery delivery startup or a tool designed for Web3 communities. Have a hard, critical look at their mission and consider whether you are building "an incredibly complicated piece of engineering" that just… squeezes giant ketchup sachets of fruit.