Scope insensitivity isn't one of the most frequently discussed cognitive biases, but I think it explains a lot of poor outcomes, especially cases where a persistent problem sees minimal progress despite apparently ongoing attention.

Scope insensitivity is our inability to value something in proper proportion to its scale. The canonical example: people report being willing to pay roughly the same amount to save 2,000 birds from an oil spill as to save 200,000 birds, even though the latter outcome is 100X larger. The decision is not a mathematical cost-benefit calculation; it's perhaps based on a mental image of a single bird, or on the basic feeling that doing something to save birds is consistent with one's identity. Eliezer Yudkowsky writes:

People spend enough money to create a warm glow in themselves, a sense of having done their duty. The level of spending needed to purchase a warm glow depends on personality and financial situation, but it certainly has nothing to do with the number of birds.

Scope Insensitivity - LessWrong 2.0
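To make the arithmetic concrete, here is a minimal sketch using the approximate willingness-to-pay figures from the study Yudkowsky cites (Desvousges et al., as reported in the LessWrong post: roughly $80, $78 and $88 to save 2,000, 20,000 and 200,000 birds). Treat the exact numbers as illustrative:

```python
# Implied per-bird valuation at each scale, using the approximate
# willingness-to-pay (WTP) figures reported in the LessWrong post
# (Desvousges et al.); the numbers are illustrative, not exact.
scenarios = {2_000: 80, 20_000: 78, 200_000: 88}  # birds saved -> WTP in dollars

for birds, wtp in scenarios.items():
    print(f"{birds:>7,} birds: WTP ${wtp} -> ${wtp / birds:.4f} per bird")

# WTP is nearly flat across a 100X range of outcomes, so the implied
# value per bird drops by roughly two orders of magnitude. A
# scope-sensitive valuer would instead scale WTP with the number saved.
```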

I think this model applies to some organisational decision-making processes. Some examples:

"ASIC wants to use artificial intelligence and automation more but experts are concerned"

"US Army secretary looks to 2040 to scale key tech": the US Army reports some progress against each of its strategic long-term technology fields, but without really considering whether the effort is sufficient in scale, speed, ambition, or likelihood of success.