Book Review - Inadequate Equilibria

Categories: Book Review   Epistemics

Written on April 25, 2021

If your idea is so good, why hasn’t it already been done before?


You come up with a new idea: a startup; a policy to reshape the medical system; a solution to science’s replication crisis; the reason why Apple stock is undervalued. But if your idea is so good, why hasn’t it already been done before? Is everyone else blind/stupid? Are you a genius? Did you just get lucky to be in the right place at the right time? Or is there a fundamental reason why others have already thought of, or even tried, your idea and decided against it?1

How can we think about the efficiency, exploitability, and adequacy of different markets/societal structures to help us judge whether the $20 bill we see lying on the ground in front of us is genuine or an illusion?

Inadequate Equilibria (available for free online!) seeks to address this important question by providing a taxonomy, and examples, of where we stand little chance of finding $20 bills and where they might be more numerous.

Taxonomy of markets/societal structures:

  • Efficiency - can you, as an average observer without proprietary information, predict what will happen next (e.g. spot a price that is too high or too low)? If not, the market is efficient.
    • Yudkowsky argues (and I agree) that the most efficiently priced thing in the world is the “Short term relative pricing of liquid financial assets like the price of S&P 500 stocks relative to other S&P 500 stocks over the next three months.”
    • This is because it ticks all the boxes for efficiency: stocks are highly liquid, anyone can participate, there is a direct profit motive, there are tight feedback loops for success, lots of historical data, a strong pipeline of high-IQ talent, etc.
    • Whether you like it or not, the world really does care whether MSFT stock will be valued at $35.70 or $35.75 tomorrow.
    • Note that efficiency conveys no value judgement; it just means the market/system has already baked in all available information. There is no “free energy” left to be consumed by any hungry organisms.
  • Exploitability - the extent to which you can participate in a market and profit from its mispricings. Efficient markets are unexploitable by definition, but the converse does not hold: a market can be unexploitable and still inefficient, precisely because nobody can profit from correcting it.
    • The prediction market PredictIt is notorious for having uncalibrated predictions that deviate from other, better markets and often present pure arbitrage opportunities. Surely this leaves free money on the table?!
    • No, because PredictIt has a betting cap of ~$800, charges a 10% fee on any profits, and another 15% fee on any withdrawals. These restrictions severely limit the upside, making the market unexploitable and leaving it inefficient (see the back-of-the-envelope sketch after this list).
    • Don’t fall into the trap of thinking that if only these restrictions were lifted you could make money: the second they are lifted is the second the market becomes efficient, leaving no opportunity for you to profit!
    • One interesting consideration is that some markets can’t be exploited because they cannot be shorted, or because regulations restrict participation (e.g. needing to be an accredited investor to invest in startups). You can only short real estate markets indirectly, not specific assets (how would you short a synthetic version of a specific house?), and this makes real estate markets more prone to bubbles.
    • Yudkowsky even highlights research that found 5% higher returns for an index of stocks that are easier to short, because they are not as overpriced.2
    • Exploitability comes in degrees and is defined relative to all possible market participants. If you are an accredited investor and can invest in startups, you may be able to “exploit” the market, but as a whole it is quite unexploitable and therefore likely to be inefficient. Moreover, even when you can participate, the extent to which you can exploit the market is capped because you cannot do things like short it.
  • Adequacy - a subjective value judgement of how well a market or system achieves a desired outcome.
    • There are cases where a market fails to deliver an outcome for a specific individual, and also cases where, due to coordination problems, the outcome is suboptimal for most or all people involved.
    • There are many examples of coordination problems; one close to my heart is academic publishing. To present it simply: academics don’t want to pay for expensive, closed-access journals like Nature and Science, but must in order to get citations and tenure. Academic tenure committees similarly don’t want to support these journals, but must use them because that is where all the best academics publish. If everyone collectively decided to stop publishing in Nature, the problem would be solved; trying to make the change as an individual actor leaves you jobless. The system remains because we lack the coordination to change it, and everyone is stuck in a suboptimal situation.
    • One hypothetical way around coordination problems is to create a platform where people can pre-commit to change their behaviour, or to do something, if and only if a sufficient threshold of others do the same (a minimal sketch of the mechanism appears after this list). Yudkowsky tongue-in-cheek makes the hilariously depressing observation that we already have a platform that does this, but we only use it to fund trinkets and video games: it’s called Kickstarter!
    • I think the best part of the book is the deep dive into coordination problems and inadequacy, with some ideas for how these issues could be fixed, bringing to light how insane our current systems are. This can be found in Chapter 3 here.
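
To make the PredictIt arithmetic concrete, here is a back-of-the-envelope sketch in Python. It uses only the figures quoted above (an ~$800 cap, a 10% fee on profits, a 15% fee on withdrawals) plus a hypothetical 5% mispricing; the function and the simplified fee mechanics are my own, not PredictIt’s actual rules.

```python
# Back-of-the-envelope: what does an apparent arbitrage on a capped,
# fee-heavy prediction market actually pay? The cap and fee figures are
# the ones quoted above; the 5% edge is hypothetical.

def net_arbitrage_profit(stake: float, price: float, payout: float = 1.00,
                         profit_fee: float = 0.10, withdrawal_fee: float = 0.15) -> float:
    """Return what lands back in your bank account, minus your original stake."""
    shares = stake / price                   # e.g. buy shares at $0.95 that pay $1.00
    gross_profit = shares * payout - stake   # the "free money" before fees
    after_profit_fee = gross_profit * (1 - profit_fee)
    balance = stake + after_profit_fee
    withdrawn = balance * (1 - withdrawal_fee)
    return withdrawn - stake

if __name__ == "__main__":
    stake = 800.0  # the betting cap
    print(f"Gross edge on ${stake:.0f} at $0.95: ${stake / 0.95 - stake:.2f}")
    print(f"Net after fees and withdrawal:   ${net_arbitrage_profit(stake, 0.95):.2f}")
```

With these numbers a guaranteed 5% edge at the cap turns into a loss of roughly $88 once the money leaves the platform, before even counting your time. That is what “unexploitable yet inefficient” looks like.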
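
And since the Kickstarter aside above is really about a mechanism, here is a minimal sketch of threshold pre-commitment, assuming a hypothetical ThresholdCommitment class of my own invention: pledges are collected, and nothing fires until enough people have signed on.

```python
# Minimal sketch of a Kickstarter-style assurance contract: individual
# pledges only activate once a threshold of participants is reached.
# The class and example are hypothetical illustration, not a real platform.

from typing import Callable, List

class ThresholdCommitment:
    def __init__(self, threshold: int):
        self.threshold = threshold
        self.pledges: List[Callable[[], None]] = []
        self.triggered = False

    def pledge(self, action: Callable[[], None]) -> None:
        """Commit to `action`, carried out only if enough others commit too."""
        self.pledges.append(action)
        if not self.triggered and len(self.pledges) >= self.threshold:
            self.triggered = True
            for act in self.pledges:
                act()  # everyone moves at once, so no individual defects alone

# Example: three labs stop submitting to Journal X only if all three commit.
boycott = ThresholdCommitment(threshold=3)
for lab in ["Lab A", "Lab B", "Lab C"]:
    boycott.pledge(lambda lab=lab: print(f"{lab} stops submitting to Journal X"))
```

Nobody’s behaviour changes until the coordination problem is already solved, which is exactly what makes committing safe.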

Unsurprisingly, given the title of the book, Yudkowsky is most concerned with inadequate equilibria. He lumps the sources of inadequacy into three categories (largely for mnemonic reasons):

  1. Decisionmakers who are not beneficiaries;
  2. Asymmetric information; and above all,
  3. Nash equilibria that aren’t even the best Nash equilibrium, let alone Pareto-optimal.

In other words:

  1. Cases where the decision lies in the hands of people who would gain little personally, or lose out personally, if they did what was necessary to help someone else;
  2. Cases where decision-makers can’t reliably learn the information they need to make decisions, even though someone else has that information; and
  3. Systems that are broken in multiple places so that no one actor can make them better, even though, in principle, some magically coordinated action could move to a new stable state.

Yudkowsky writes a lot of criticism of the medical system that is frustrating to read. While these inadequate systems can be hard to fix, it is sometimes possible to carve out a niche for yourself. An interesting personal example: his wife has seasonal depression and traditional light therapy failed her. He spent $600 on a shitload of lights, strung them up around the house, and it worked! There was nothing online or in the research literature suggesting this as a solution. He could have concluded that it couldn’t work, because surely someone would already have tried it… but given the incentives in the medical industry and in research, nobody actually had, or at least nobody had shared it…

The second part of the book is a tirade against modesty, aimed at aspiring rationalists. The Slate Star Codex review goes deeper into this than I’d like to, and like Scott Alexander, I have mixed feelings about it. My overall takeaway, when considering whether one can beat the status quo at something, is to apply the above taxonomy, think about why I might have an edge, and consider the object-level issues of the specific scenario, rather than generalizing to whether or not this says I am a modest person.

Things I highlighted/other takeaways:

Dunning-Kruger effect. I was aware of this effect but not its specifics. Those who are bad at something are overconfident; those who are better know they are better, but not by the right amount. In one study, the bottom 50% of performers thought they were better than the 60th percentile, while the top 50% only thought they were better than the 70th percentile.

Central line infections in the US killed ~60K patients per year and infected ~200K, at a cost of around $50,000 per patient. A five-item checklist that includes hand-washing has cut deaths by ~50%, yet it was only adopted recently… the Affordable Care Act tying federal payments to central line infection rates may have been a key reason why.

Interviewers are not incentivised to hire the best people; instead, they hire those they would most like to work with or share a floor with. It would be interesting to explore recruiting where an interviewer’s compensation is a function of how well the people they chose to hire later perform.

The ivory tower. This dive into education is particularly interesting, so I copy it in full here:

CECIE: I’ll now introduce the concept of a signaling equilibrium.

To paraphrase a commenter on Slate Star Codex: suppose that there’s a magical tower that only people with IQs of at least 100 and some amount of conscientiousness can enter, and this magical tower slices four years off your lifespan. The natural next thing that happens is that employers start to prefer prospective employees who have proved they can enter the tower, and employers offer these employees higher salaries, or even make entering the tower a condition of being employed at all.

VISITOR: Hold on. There must be less expensive ways of testing intelligence and conscientiousness than sacrificing four years of your lifespan to a magical tower.

CECIE: Let’s not go into that right now. For now, just take as an exogenous fact that employers can’t get all of the information they want by other channels.

VISITOR: But—

CECIE: Anyway: the natural next thing that happens is that employers start to demand that prospective employees show a certificate saying that they’ve been inside the tower. This makes everyone want to go to the tower, which enables somebody to set up a fence around the tower and charge hundreds of thousands of dollars to let people in.

VISITOR: But—

CECIE: Now, fortunately, after Tower One is established and has been running for a while, somebody tries to set up a competing magical tower, Tower Two, that also drains four years of life but charges less money to enter.

VISITOR: … You’re solving the wrong problem.

CECIE: Unfortunately, there’s a subtle way in which this competing Tower Two is hampered by the same kind of lock-in that prevents a jump from Craigslist to Danslist. Initially, all of the smartest people headed to Tower One. Since Tower One had limited room, it started discriminating further among its entrants, only taking the ones that have IQs above the minimum, or who are good at athletics or have rich parents or something. So when Tower Two comes along, the employers still prefer employees from Tower One, which has a more famous reputation. So the smartest people still prefer to apply to Tower One, even though it costs more money. This stabilizes Tower One’s reputation as being the place where the smartest people go.

In other words, the signaling equilibrium is a two-factor market in which the stable point, Tower One, is cemented in place by the individually best choices of two different parts of the system. Employers prefer Tower One because it’s where the smartest people go. Smart employees prefer Tower One because employers will pay them more for going there. If you try dissenting from the system unilaterally, without everyone switching at the same time, then as an employer you end up hiring the less-qualified people from Tower Two, or as an employee, you end up with lower salary offers after you go to Tower Two. So the system is stable as a matter of individual incentives, and stays in place. If you try to set up a cheaper alternative to the whole Tower system, the default thing that happens to you is that people who couldn’t handle the Towers try to go through your new system, and it acquires a reputation for non-prestigious weirdness and incompetence.
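
Cecie’s two-factor lock-in is easy to see in a toy simulation. The sketch below is my own, not the book’s, and every number in it is made up: employers pay each tower’s graduates according to the average ability of its previous cohort, students pick whichever tower maximizes wage minus tuition, and the expensive incumbent (Tower One) admits only the top applicants when oversubscribed.

```python
import random

# Toy model (my own sketch, not the book's) of the two-factor lock-in:
# a tower's reputation is the average ability of its previous cohort,
# employers pay for reputation, and students chase wage minus tuition.

random.seed(0)

TUITION = {"Tower One": 60, "Tower Two": 20}          # Tower Two is far cheaper
CAPACITY_ONE = 300                                     # Tower One can be selective
reputation = {"Tower One": 80.0, "Tower Two": 50.0}    # assumed starting reputations

def wage(tower: str) -> float:
    """Employers pay for the reputation of wherever you studied."""
    return 3 * reputation[tower]

for generation in range(5):
    students = sorted((random.uniform(0, 100) for _ in range(1000)), reverse=True)
    # Every student applies to whichever tower offers the higher net payoff.
    preferred = max(TUITION, key=lambda t: wage(t) - TUITION[t])
    other = "Tower Two" if preferred == "Tower One" else "Tower One"
    # The incumbent admits only the top applicants; everyone else settles.
    admitted = students[:CAPACITY_ONE] if preferred == "Tower One" else students
    rest = students[len(admitted):]
    reputation[preferred] = sum(admitted) / len(admitted)
    if rest:
        reputation[other] = sum(rest) / len(rest)
    print(f"Gen {generation}: everyone prefers {preferred}; "
          f"Tower One rep = {reputation['Tower One']:.1f}, "
          f"Tower Two rep = {reputation['Tower Two']:.1f}")
```

Even though Tower Two costs a third as much, it never attracts the top applicants, and its reputation immediately collapses to “the people Tower One rejected”, which is exactly the stable point described above.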

“Part of the function of any civilized society is to appropriately reward those who contribute to the public good.”

“Moving from bad equilibria to better equilibria is the whole point of having a civilization in the first place.”

Velcro is superior in many ways to shoelaces. But because those who needed it most, the young and the old, were its main users, it became uncool, and as a result almost nobody uses it now even though it is superior for just about everything except intense running. The same thing has happened with Crocs.

Anesthesiologists have reduced their patients’ death rates roughly 100x over the past few decades. This likely happened partly because only one anesthesiologist is assigned to a patient, so failure is easy to attribute to them, and because there is no personal connection with the patient, so they are easier to sue. Specific incentives that lead to differential changes in performance are interesting!

It is one thing for systems like medicine to fail. It is another for the systems that produce alternative approaches, and so enable trial and error, to fail too… There is a nice analysis of our First Past The Post electoral system and the problems it creates: a two-party system in which people cannot vote for their true preferences. Yudkowsky relates this to a problem in venture capital funding, a “Keynesian beauty contest” whereby angel investors only fund ideas that they believe later Series A, B, C, etc. VCs will also fund, since otherwise the startup will collapse. This leads to strong conformity around what people believe other people believe should be funded, and it is a positive feedback loop: because these are the only things that get funded, it reinforces everyone’s data on what gets funded. The same thing happens with first past the post. Prospera might be the first example of a new charter city that actually creates some inspiring innovations, ranging from medical/drug licensing to property sales and governance.
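
The funding feedback loop is easy to caricature in code. The model below is my own toy sketch, not the book’s: each year an angel funds a pitch only if that category’s share of previously funded startups is high enough, so an arbitrary early skew gets frozen in regardless of underlying quality.

```python
import random
from collections import Counter

# Toy sketch (mine, not the book's) of the "Keynesian beauty contest" in
# funding: an angel backs a pitch only if they believe later-stage investors
# will follow on, and that belief is simply the category's historical share
# of funded startups.

random.seed(0)
CATEGORIES = ["social apps", "biotech", "hard tech"]

# Historical record of funded startups; suppose social apps got a head start.
history = Counter({"social apps": 6, "biotech": 2, "hard tech": 2})

def believed_follow_on_odds(category: str) -> float:
    """An angel's estimate that Series A/B/C investors will fund this category."""
    return history[category] / sum(history.values())

for year in range(10):
    funded_this_year = []
    for _ in range(20):                               # 20 pitches a year, drawn uniformly
        pitch = random.choice(CATEGORIES)
        if believed_follow_on_odds(pitch) > 0.4:      # fund only if follow-on looks likely
            funded_this_year.append(pitch)
    history.update(funded_this_year)
    shares = {c: round(believed_follow_on_odds(c), 2) for c in CATEGORIES}
    print(f"Year {year}: funded {len(funded_this_year)} pitches; funding shares = {shares}")
```

Quality never enters the loop: whichever category happened to dominate the early record is the only one that clears the follow-on threshold, so the record only ever confirms itself, just as safe votes do under first past the post.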

The NYT determines what people believe other people believe, which is why it can have so much influence.

The Gell-Mann Amnesia effect describes noticing how unreliable a source is in an area you know well, and then forgetting that and trusting it in other areas anyway.

Bet on everything. It makes it more likely you will learn your lesson. And update from personal experience: it is rare, and a better signal than theory or the anecdotes of others.

It is far easier to pick amongst experts and existing opinions than it is to become an expert yourself and create your own original ideas.

“If you never fail, you’re only trying things that are too easy and playing far below your level.”

In the book Superforecasting, the strongest predictor of success, with 3x more influence than intelligence, was the belief that you could improve and do better than the experts.

If you liked this review, consider reading this review by Slate Star Codex, which is far superior!

Thanks to David Rein and Miles Turpin for reading a draft of this piece and providing useful edits and discussion! All errors and confusions that remain are mine and mine alone.

Footnotes

  1. For those familiar with Chesterton’s Fence, this problem can be thought of as Chesterton’s Absence of a Fence - “I shouldn’t build a fence here, because if it were a good idea to build a fence here, someone would already have built it.” 

  2. Apparently this index of stocks does not exist in reality. Someone should make it. But watch out for r/wallstreetbets! I’ve heard they don’t like shorts ;).