Scope insensitivity isn't one of the most frequently discussed cognitive biases, but I think it explains a lot of poor outcomes, especially cases where a persistent problem sees minimal progress despite apparently receiving ongoing attention.
Scope insensitivity is our inability to value something in proportion to its scale. A canonical example is that people might be willing to pay roughly the same amount to save 200 birds from an oil spill as they would to save 20,000 birds, even though the latter outcome is 100X larger. The decision is not a mathematical cost-benefit calculation; it is perhaps based on a mental image of a single bird, or the basic feeling that doing something to save the bird is consistent with one's identity. Eliezer Yudkowsky writes:
> People spend enough money to create a warm glow in themselves, a sense of having done their duty. The level of spending needed to purchase a warm glow depends on personality and financial situation, but it certainly has nothing to do with the number of birds.

— Scope Insensitivity - LessWrong 2.0
I think this model applies in some organisational decision-making processes. Some examples:
- In a media-driven political landscape, the scale of policies may be uncorrelated with actual needs, because the scope insensitivity of the media and the public means there are diminishing returns to larger policies. A policy that funds 2,000 manufacturing PhD scholarships is unlikely to get 10X the media coverage of one that funds 200; in either case the government gets to say it is doing something about manufacturing. Whether the manufacturing industry actually needs 200 or 2,000 people doesn't necessarily enter the picture. Media outcomes are not the only factor in policy development, of course, but this pressure drives towards many smaller policies that are great to announce but aren't ambitious enough to be effective in practice.
- When allocating finite resources, it is hard to do so in a way that fully accounts for scope. A project that is worth 10X more than another may not receive 10X the resourcing, partly because of structural factors such as a floor on the resources needed to deliver any project at all, and partly because of the cognitive difficulty: a product manager receiving input from multiple customers struggles to weight them correctly by relative importance, and it is hard to force yourself to discount the small project by 10X relative to the large one. One approach is to consciously break out of System 1 thinking and rationalise the process, as described here; a rough sketch of what that calculation might look like follows this list.
- A variant of scope insensitivity is the classic "man bites dog" issue in journalism: common events aren't newsworthy, so the news is unrealistically dominated by rare events. People don't correctly discount an event's relevance to their lives by its rarity, so rare events remain appealing to read about and drive coverage. In turn, this reporting can give a false picture of what society is actually like, usually a negative one (e.g. an overemphasis on crime or mismanagement). More worryingly, it can drive over-reporting of relatively minor problems (e.g. one government employee commits fraud, stealing $10,000) while under-reporting massive challenges (e.g. government programs never having their outcomes evaluated, wasting millions of dollars at a time). This probably also applies at a smaller scale inside organisations: people might over-index on an uncommon success or failure (e.g. one customer doesn't renew) rather than stepping back and calibrating their reaction to the actual scale of the issue (e.g. that customer was only 0.01% of revenue).
- This may also apply to corporate greenwashing and slacktivism, where a company posts a nicely sentimental tweet or takes out an ad about an issue rather than meaningfully tackling its deeper culpability (for example, Google posting about the dangers of extremism while doing little to stop YouTube from radicalising people). The executives might feel that “we’re doing our part” by speaking out. However, this is probably deliberate rather than unconscious much of the time, with the meaningful course of action having been considered and rejected as too costly.
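To make the resource-allocation point concrete, here is a minimal sketch (in Python, using made-up project names and numbers, and not drawn from the approach linked above) of what consciously rationalising the weighting might look like: compare a naive allocation that is exactly proportional to value against what happens once every project is first granted a minimum viable level of resourcing.

```python
# Toy illustration: why a project worth 10X more rarely gets 10X the resources.
# Project values, headcount, and the floor are made-up numbers for illustration.

projects = {"big_bet": 10.0, "small_ask": 1.0}  # estimated relative value
total_headcount = 11
floor = 3  # minimum people needed to deliver any project at all

# Naive allocation: resources exactly proportional to estimated value.
total_value = sum(projects.values())
proportional = {
    name: total_headcount * value / total_value for name, value in projects.items()
}

# Floored allocation: give every project its minimum first, then split the
# remainder in proportion to value.
remainder = total_headcount - floor * len(projects)
floored = {
    name: floor + remainder * value / total_value for name, value in projects.items()
}

for name in projects:
    print(f"{name:10s} proportional={proportional[name]:.1f}  floored={floored[name]:.1f}")

# The proportional split preserves the 10:1 value ratio; the floored split
# compresses it to roughly 2:1, even though the underlying values are unchanged.
```

The specific numbers don't matter; the point is that writing the weighting down makes the compression of the 10:1 value ratio visible, which an intuitive System 1 judgement tends to hide.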
An example of the first dot point above: the US Army reporting some progress against each of its strategic long-term technology fields, without really considering whether the effort is sufficient in scale, speed, ambition, or likelihood of success.