In recent months, there has been a lot of excitement around the idea of a new World Wide Web based on blockchains. It’s commonly referred to as “web3”, to be contrasted with “web1” (websites) and “web2” (social media platforms). The creators of Ethereum have been pushing this idea for a while, but the recent success of NFTs as an asset market has gotten lots of people excited that web3 is really happening.
There’s still the question of what web3 will actually do. This isn’t actually as important a question as you might think; when people started building websites in the 90s, no one really knew what the Web might ultimately be useful for. Sometimes humanity gets a cool new toy, and playing around and seeing what it’s useful for is more important than sitting around and theorizing about it.
Tim O’Reilly, a Silicon Valley veteran, has a good post cautioning that it’s too early to get excited about web3 — we’re still early in the Hype Cycle, and most of what’s now being done will eventually end up as “dot compost”. Of course he’s right. But if you just like playing around with new toys, then you probably don’t care about that — especially since most web3 stuff is well funded, unlike web1 which was mostly just a hobby for the early adopters. Now you can get paid to play around, so why not?
A more concerning critique comes from Moxie Marlinspike, the founder of Signal. Moxie notes that while web3 advocates tout decentralization as the main benefit, the new blockchain-based web is actually extremely centralized, because the devices that access web3 — mobile phones and web browsers — aren’t actually part of the blockchain. Thus, while web3 advocates call for “less trust, more truth”, in practice web3 requires users to trust whatever a few big platforms hand them:
One thing that has always felt strange to me about the cryptocurrency world is the lack of attention to the client/server interface. When people talk about blockchains, they talk about distributed trust, leaderless consensus, and all the mechanics of how that works, but often gloss over the reality that clients ultimately can’t participate in those mechanics. All the network diagrams are of servers, the trust model is between servers, everything is about servers. Blockchains are designed to be a network of peers, but not designed such that it’s really possible for your mobile device or your browser to be one of those peers…
So much work, energy, and time has gone into creating a trustless distributed consensus mechanism, but virtually all clients that wish to access it do so by simply trusting the outputs from [Alchemy or Infura] without any further verification. It also doesn’t seem like the best privacy situation. Imagine if every time you interacted with a website in Chrome, your request first went to Google before being routed to the destination and back. That’s the situation with ethereum today.
He goes on to describe making an NFT that appears to some people as a geometric shape, and to others as a poop emoji. The NFT was taken down by the OpenSea marketplace, without explanation and without any apparent violation of the terms of service. So much for decentralization and leaderless consensus!
But I have a third concern about web3. So far, the main difference between web3 and webs 1 & 2 seems to be that web3 allows you to pay for stuff in cryptocurrency. For example, Sal Delle Palme has the following list of applications:
New applications for crypto, such as NFT marketplaces, DAOs, DeFi and DEXs, CeFi, charities, GameFi, DeSo, etc., are being invented, funded (often by the crowd), built, and shipped with blinding speed.
This suggests that the main attraction of web3 might not be decentralization, but rather what economists call excludability — it will be stuff people pay for, rather than free stuff.
Currently, the web is monetized mostly by either ads or subscriptions (or by companies offering you free stuff and then selling your data). Over the years, there have been various attempts to switch to a system of micropayments. But despite a few limited successes (e.g. buying one song from a music service), these attempts have generally failed. And I think there’s a very deep and fundamental economic reason why they keep failing: non-monetary transaction costs.
The dystopia of transaction costs
One of the first popular posts I wrote on my old blog was called “Do property rights increase freedom? (Japan edition)”. In it, I challenged the notion that assigning property rights to everything is consistent with human liberty, and cited some examples of how Japan made people pay for a lot of stuff that was free in America.
I was thinking about re-upping this post, but I decided against it, because since I wrote this post, Japan has actually improved on many of these measures — there are a lot more park benches and trashcans, many cafes no longer insist you keep consuming product in order to stay inside, etc. (It’s almost as if people in Japan read my blog!) But I’ll quote a large part of that post, because I think I did a good job of explaining why past a certain point, property rights just make life more of a hassle:
Since the dawn of time, libertarians have equated property rights with freedom. Intuitively, this makes a lot of sense: if the government can come and confiscate your stuff, or tell you what to do with it, you don't feel very free at all. But libertarians tend to take this basic concept to its maximal extent; the more things are brought within the cash nexus, the more free we become. No limits, no exceptions. A direct implication is that the more government functions we can privatize, the more free we will be.
But is that right? What would it really feel like to live in a society where almost every single thing is privately owned and priced?…In [Japanese] cafes, each customer must order something promptly or be kicked out; outside your house or office, there is basically nowhere to sit down that will not cost you a little bit of money. Public buildings generally have no drinking fountains; you must buy or bring your own water. Free wireless? Good luck finding that!
Does all this private property make me feel free? Absolutely not! Quite the opposite - the lack of a "commons" makes me feel constrained. It forces me to expend a constant stream of mental effort, calculating whether it's worth it to spend $4 to sit and rest for 10 minutes, whether it's worth $2 to get a drink…Noticing all this has driven home the realization that the existence of a government-owned "commons" often makes people feel more free, not less. Sure, the commons is financed through taxation, and sure, that means that people generally don't receive benefits exactly equal to what they pay. But the difference can be small, and is often canceled out by the fact that you only pay once rather than a million billion times.
That's right: irreducible transaction costs are a fly in the libertarian soup. Completing an economic transaction, however quick and easy, involves some psychological cost; you have to consider whether the transaction is worth it (optimization costs), and you have to suffer the small psychological annoyance that all humans feel each time money leaves their bank account (the same phenomenon contributes to loss aversion and money illusion). Past a certain point, the gains to privatization are outweighed by the sheer weight of transaction cost externalities. (Note that transaction costs also kill the Coase Theorem, another libertarian standby; this is no coincidence.)…
[J]ust imagine taking [privatization] to its absurd extreme. Imagine if we could privatize city streets and create ownership rights for the air. Every time you walked out your door, you would have to pay some fraction of a cent for the privilege. Every time you took a breath, you would pay a far tinier fraction for the chemical changes caused by your respiration. These prices would be fairly close to your willingness-to-pay, and these prices might change from day to day, or even hour to hour! So you would probably have to check to see whether it was worth it to step outside your house. Does that sound like "freedom"?
We can do a similar reductio ad absurdum for a micropayments-based web. Imagine if everything you do online required you to decide whether to make a tiny payment. Send an email? Pay a few cents. Read one more paragraph of an article? Pay a few cents. And so on.
It would be an utter nightmare. The psychic cost of having to decide whether to pay a tiny amount for a tiny piece of product, dozens or hundreds of times a day, would be enormous. Some people would just choose not to deal with the hassle, and instead to simply use a ton of paid services and see their bill at the end of the month, like they do when using electricity in their house; but this carefree attitude would naturally lead them to buy far more than they really wanted, and when they saw a few of those monthly bills, they would reconsider.
In the end, most of these users would likely migrate back to either free ad-supported services or to subscription services that only make you think about payments once in a while.
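The arithmetic behind this argument can be sketched in a toy model. Every number below is an illustrative assumption, not a measurement: the point is only that when each purchase carries a small fixed psychic cost, many tiny purchases can cost far more in mental overhead than in money, while a subscription amortizes one decision across everything.

```python
# A toy model of non-monetary transaction costs.
# All numbers are illustrative assumptions, not measurements.

def total_cost(price_per_item, items, decisions, psychic_cost_per_decision):
    """Money spent plus the mental cost of each payment decision."""
    return price_per_item * items + psychic_cost_per_decision * decisions

# Micropayments: every one of 200 monthly items is a separate decision.
micro = total_cost(price_per_item=0.05, items=200, decisions=200,
                   psychic_cost_per_decision=0.10)

# Subscription: same 200 items, priced higher per item, but only
# one payment decision per month.
subscription = total_cost(price_per_item=0.075, items=200, decisions=1,
                          psychic_cost_per_decision=0.10)

print(micro)         # 30.0 -> $10 in payments plus $20 of mental overhead
print(subscription)  # 15.1 -> pricier per item, almost no overhead
```

Under these (made-up) parameters, the micropayment user pays twice as much in total even though each individual item is cheaper, because the decision overhead dominates. That is the fly in the micropayments soup.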
Web3 needs to be more than making free stuff unfree
This is why the people trying to build web3 should probably steer away from making it just “micropayments, but in crypto”. I know this might sound crazy, but having to pay for stuff is not a feature. I am going to go ahead and predict that the added allure of being able to pay for things in a form of money that (nominally) isn’t controlled by the Federal Reserve will not be enough to make micropayments succeed where before they have failed.
(And note that the transaction costs work on the receiving end too! Lots of ideas I see for web3 applications involve people getting paid to do stuff like post on social media. But thinking about how much I might earn before every tweet wouldn’t be worth the pittance I’d inevitably receive. So the transaction cost argument is not just saying “cheaper is better”.)
Instead, I’d advise web3 creators to focus on stuff that blockchains can do more naturally and easily than normal protocols. For example, blockchains might allow one person to own the same pseudonym across a variety of platforms, so that if you see a post from someone calling themselves “Noahpinion”, you know it’s the same person who writes that famous blog (I know of some people working on applications for this). You can probably build out smart contract systems that allow micro-startups, like in the book Rainbows End, and maybe connect these ad-hoc partnerships to verifiable workflows. File-sharing might become a more trustworthy process, and so on.
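The pseudonym idea can be sketched in a few lines. This is a deliberately simplified toy, not how a real system would work: the registry, names, and secret-based "proof" here are all invented for illustration, and a real blockchain design would use public-key signatures so that proving ownership never reveals a secret to the platform.

```python
# A toy sketch of cross-platform pseudonym ownership.
# The registry and "proof" scheme are invented for illustration;
# a real system would use public-key signatures on a blockchain.
import hashlib

# Shared registry (conceptually on-chain): pseudonym -> commitment
# to the owner's secret. Every platform reads the same registry.
registry = {"Noahpinion": hashlib.sha256(b"owner-secret").hexdigest()}

def claim(pseudonym: str, secret: bytes) -> bool:
    """A platform checks a claimed identity against the shared registry."""
    return registry.get(pseudonym) == hashlib.sha256(secret).hexdigest()

# The same proof works on any platform that consults the registry,
# so "Noahpinion" is the same person everywhere.
print(claim("Noahpinion", b"owner-secret"))  # True
print(claim("Noahpinion", b"impostor"))      # False
```

The appeal is that identity lives in one shared, neutral place rather than in each platform's private account database, so no single platform can take the name away or let someone else impersonate it.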
Those are just a few ideas from someone who’s new to the field, and people are working on all of them already. But my basic point here is not to forget transaction costs. Lots of transaction costs aren’t paid in dollars — they’re paid in seconds of effort, in the mental cost of attention-switching, and in the hassle and stress of keeping things in mind. Web3 should focus on stuff that allows people to pay fewer of those costs, rather than stuff that forces them to pay more.
Here in Vietnam you don't get free water in restaurants. You would not *believe* how irate this makes Americans. There's even an American comfort food restaurant here, Eddie's Diner, who make a big marketing point out of how everyone gets a free glass of water.
I guess that's just my roundabout way of agreeing with your point.
Micropayments are awesome for people above the threshold of caring about a small payment. If you are a blockchain software engineer, this world makes sense, as you make 500K and fifty cents to use the bathroom isn't going to rise above your "give a shit" threshold. In fact, you welcome it, explicitly because it excludes freeloaders from shared services.
Highways in CA have microtransactions to use the fast lane today - no blockchain needed. Most of this could be done with merely efficient transaction processing, which ironically, blockchains are very bad at.