How people actually decide
How people actually decide, what the research has found, and why most decision-making frameworks oversell what reflection can do for hard choices.

People decide using two interleaved systems: a fast, intuitive one that handles most everyday choices, and a slow, deliberate one that gets engaged when the stakes are high or the situation is unfamiliar. Most decision-making advice is written for the slow system. Most decisions are made by the fast one.
That is the honest one-paragraph summary of what the field has found over the last fifty years. The popular literature on decision-making has spent the same fifty years selling frameworks (matrices, decision trees, "five steps to a better decision") that mostly assume you are operating in slow-system territory and have unlimited time and information. You usually aren't, and you don't.
This piece is descriptive, not prescriptive. It is not a guide to making better decisions. It is a guide to what the research has actually shown about how decisions get made, where the popular accounts oversimplify, and what one classical tradition added that the modern literature mostly leaves out. Mirror Field is not a decision-making tool. We think the decision is yours, and we think most attempts to optimize it from the outside fail in interesting ways. This post is about why.
The split that runs through the literature
The dominant frame in modern decision research is some version of dual-process theory: there are two systems of cognition, often called System 1 and System 2 after Daniel Kahneman's Thinking, Fast and Slow. The framing is older than Kahneman, and it isn't quite as clean as the popular version makes it sound, but the basic distinction is real and useful.
The fast track
System 1 is automatic, fast, and effortless. It is what gets you across a room, into a chair, through a familiar route, past a person you recognize, and into the right small talk. It is also what makes most of your buying decisions, most of your social decisions, and most of your reactions to news. It runs constantly, in parallel, beneath conscious deliberation. You only notice it when it is wrong, and you often don't notice even then.
The classic line of research on this system started with Tversky and Kahneman (1974) in Science, which catalogued the heuristics people use when they don't have time to compute properly. Anchoring (your estimate of an unknown number drifts toward whatever number you saw last). Availability (you judge how common something is by how easily examples come to mind). Representativeness (you judge how likely something is by how typical it looks). These were originally described as biases, the deviations from formal rationality that lab subjects produced reliably.
A different research tradition reframed the same heuristics as fast and frugal tools that work surprisingly well in the environments they evolved for. Gigerenzer and Goldstein (1996), in Psychological Review, showed that simple heuristics like "take the best" (decide on the single most valid cue that distinguishes the options, and ignore the rest) often outperform more complex algorithms when information is sparse and noisy, which describes most real situations. The two views are less opposed than they look. Heuristics are biased compared to formal rationality and reasonable compared to having no time and no information. Both things are true. Whether your specific decision will be served well by the fast track depends on whether your situation looks like the situation the heuristic was tuned for.
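The frugality of take-the-best is easiest to see in a few lines of code. This is a hypothetical sketch, not code from the paper; the cue names and values are invented for illustration:

```python
# A minimal sketch of the "take the best" heuristic, after Gigerenzer and
# Goldstein (1996). Cues are checked in order of validity; the first cue
# that discriminates between the two options decides, and everything else
# is ignored. The cue names and values below are invented.

def take_the_best(option_a, option_b, cues_by_validity):
    """Return the option favored by the first discriminating cue, else None."""
    for cue in cues_by_validity:
        a = option_a.get(cue, 0)
        b = option_b.get(cue, 0)
        if a != b:                       # this cue discriminates: decide now
            return option_a if a > b else option_b
    return None                          # no cue discriminates: guess instead

# The classic task: which of two cities is larger? Binary cues, 1 = present.
city_a = {"name": "A", "is_capital": 1, "has_airport": 1, "has_team": 0}
city_b = {"name": "B", "is_capital": 0, "has_airport": 1, "has_team": 1}

# Cues ordered from most to least valid. "is_capital" discriminates first,
# so city A is chosen and the remaining cues are never consulted.
chosen = take_the_best(city_a, city_b, ["is_capital", "has_airport", "has_team"])
```

Note that the last cue would have favored city B; the heuristic never reaches it, which is exactly the sense in which it is frugal.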
The slow track
System 2 is deliberate, sequential, effortful. It is what you use to add two two-digit numbers in your head, to read a contract carefully, to compare two job offers on multiple dimensions. It is rare in everyday life because it is expensive. You can run it for short bursts; it tires; it competes with other demands on attention.
The decision-making literature has found, repeatedly, that even when System 2 is engaged, its reach is shorter than people assume. Experts in fields with structured decisions (chess, firefighting, anaesthesiology, medical diagnosis) often appear to deliberate but are actually running large pattern-matching libraries built up over thousands of hours, then verifying the pattern with a brief slow-track check. Klein (2008), in Human Factors, formalized this as recognition-primed decision making: the decision is mostly made by recognition, and System 2 mostly serves to validate or override what recognition produced.
When each is at work
The fast track handles almost everything; the slow track engages when the situation is unfamiliar, the stakes are high, or the choice has been forced into language by a deadline or a third party. The practical question of which mode fits the moment in front of you, and what the pause actually does when the slow track engages, is covered in reflective decisions vs. reactive decisions.
What the research has actually found
A few foundational findings, with the caveats the popular literature usually omits.
Heuristics produce predictable errors
The catalogue of biases Tversky and Kahneman opened has grown for fifty years. Anchoring is robust. Availability is robust. Loss aversion (the finding that losses loom larger than equivalent gains) is robust enough to have shaped behavioral economics. Many smaller effects in the same family have failed replication or shrunk dramatically under more careful testing. The honest summary: a small number of broad biases reliably distort System 1 decisions in predictable directions; many specific framing effects you may have read about are smaller or shakier than the original papers suggested.
Choice overload is real, sometimes
The famous study is Iyengar and Lepper (2000): shoppers offered 24 varieties of jam in a tasting display were less likely to buy any than shoppers offered 6. The finding became the basis for popular books and "less is more" advice across consumer products and life decisions.
The replication record is messier. A meta-analysis by Scheibehenne, Greifeneder, and Todd (2010) in the Journal of Consumer Research, drawing on 50 experiments, found a roughly null average effect on choice probability. Choice overload appears to exist in some specific contexts (high-stakes decisions, novice domains, unconstrained options) and not in others (familiar domains, when default options are clear, when the person has well-formed preferences). The lesson is not that choice overload is fake. The lesson is that the conditions under which it operates are narrower than the popular version of the finding suggests.
Naturalistic decision-making looks different from lab decision-making
Most decision research before the 1990s used artificial tasks: gambles with known probabilities, jam tastings, classroom exercises. Klein and others argued that experts in real environments don't decide that way at all. They recognize a situation as similar to one they've handled before, generate a candidate response, mentally simulate it briefly, and either commit or revise. The slow weighing of multiple options that classical decision theory describes is rare in actual practice, even among very competent practitioners.
This finding has been controversial because it is hard to study experimentally. But it has held up in the field, and it suggests an uncomfortable thing about most decision-making advice: the canonical "list the options, weight the criteria, score each option" procedure is not how experienced people actually make hard calls in their domain. It is how they justify the calls afterward.
A classical view: the value of waiting

Most decision-research literature is recent. The act of having to decide under uncertainty is not, and the classical traditions thought about it carefully. The I Ching's fifth hexagram, 需 (Xū), usually rendered Waiting in English, addresses one specific class of decision: the kind where the right move is not yet visible because the situation has not yet ripened.
The Image text reads, in the original:
雲上於天,需。君子以飲食宴樂。
A close translation:
Clouds rise above heaven: the image of waiting. The noble one eats and drinks, takes ease, and lets the moment ripen.
(Translation by Mirror Field, working from the Chinese with reference to Legge, 1882.)
The hexagram's logic is specific. Some decisions resolve themselves if the person can hold the question without forcing it. The rain falls or doesn't; the clouds gather first. The noble one, in this image, is not paralyzed and is not anxiously deliberating. They eat, drink, take ease. They let the situation declare itself.
This is not a recommendation to procrastinate. The hexagram is paired in the I Ching with a clear understanding that some decisions cannot wait, that waiting can become its own kind of avoidance, and that the discipline of the hexagram is the discipline of correct waiting. But it captures something the modern decision literature mostly misses: that the question of whether to decide now is itself a decision, and that for some situations, the answer is no.
What this means for hard decisions
Hard decisions don't yield to algorithms. The decision-making advice industry sells frameworks because frameworks are saleable, not because they work on the kinds of decisions that are hard. The kinds of decisions that are hard share at least one of these features:
- The options are roughly comparable on the criteria you can name, and the difference between them is in something you can't quite articulate.
- The decision is irreversible, or close to it.
- The decision will affect people you care about in ways you can't fully model.
- The decision turns on values rather than information, and you don't have certainty about your own values.
Consider a person deciding whether to take a job in another city. They list the criteria: salary, distance from family, weather, work culture, partner's career, the strength of friendships in the current city, the unknowns of the new one. They weight each criterion from zero to ten. They score each option. The new job wins by 1.4 points. They feel worse, not better, after the calculation. The reason is that the calculation forced them to assign weights, and they don't actually know how to weight friendships against weather. The framework produced a number; it did not produce a decision. The number is now hostage to whatever they put in the "importance of friends" row, which they made up.
This is the modal failure of decision frameworks on hard decisions: they generate a defensible-looking output that conceals the part where the person had to invent the weights. The weights were the decision. Sometimes the page itself is where this kind of decision lands; when journaling becomes the decision covers the specific shape this takes when it works.
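The weight-sensitivity problem is easy to make concrete. A minimal sketch of the job-decision matrix above, with every criterion, score, and weight invented for illustration:

```python
# A sketch of the failure mode described above: the weighted-matrix output
# is hostage to weights the decider invented. All numbers here are made up.

def weighted_total(scores, weights):
    """Sum of score x weight across criteria."""
    return sum(scores[c] * weights[c] for c in weights)

scores = {
    "stay": {"salary": 5, "friends": 9, "weather": 4},
    "move": {"salary": 8, "friends": 3, "weather": 7},
}

weights = {"salary": 7, "friends": 6, "weather": 3}
# stay: 5*7 + 9*6 + 4*3 = 101;  move: 8*7 + 3*6 + 7*3 = 95  -> "stay" wins

weights["friends"] = 4   # is friendship a 6 or a 4? nothing settles this
# stay: 5*7 + 9*4 + 4*3 = 83;   move: 8*7 + 3*4 + 7*3 = 89  -> "move" wins
```

One unfalsifiable nudge to a single weight flips the winner, which is the sense in which the weights were the decision all along.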
For decisions like these, the research has found something the framework industry has not advertised: more deliberation does not consistently produce better choices. Past a certain point, slow-system thinking on a hard decision can degrade outcomes by surfacing reasons that fight each other to a draw, by inducing analysis paralysis, or by overweighting whichever criterion was most recently considered.
The practical lesson is not don't deliberate. The practical lesson is that deliberation has a job (mapping the decision more clearly) and a limit (it does not, by itself, pick the option). When deliberation has done its job, what remains is the act of choosing, and the act of choosing is not an algorithm. The opposite advice (just go with your gut) fails differently, but fails. Gut feelings on hard decisions are often the residue of whatever was most emotionally vivid in the previous twenty-four hours; treating them as oracles produces the same defensible-looking-output problem from the other direction.
Where reflection fits (without selling)
This is where the framing of what reflection actually does (covered in the self-reflection pillar) connects to decision-making, and it is also where most of what is written about reflection-and-decision-making goes wrong.
Reflection does not make the decision. Reflection makes the decision clearer: it helps you see what kind of decision you are actually facing, what you are actually carrying about it, which parts are information you don't have, which are preferences you haven't named, and which are values you haven't reckoned with. None of this picks the option. All of it changes the quality of your eventual pick.
A useful working distinction: most hard decisions sit on one of three substrates, and the substrate determines what kind of attention helps.
- Information-substrate decisions. The right answer would be obvious if you knew one or two specific things you don't know. The intervention is research, not reflection.
- Preference-substrate decisions. The options are different goods; you must choose which good you want. Reflection helps because it lets you notice which of the goods is actually the one you want, separate from what you think you should want.
- Values-substrate decisions. The options express different visions of who you want to be. No external advice can settle these. Reflection helps not by producing the answer but by surfacing the values clearly enough that you recognize which option is consonant with which.
If you have a specific choice in front of you that sits on one of the latter two substrates, a Mirror Field session is built for the kind of structured looking that works on them. It will not pick the option. It will help you see what choosing it would mean.
A small descriptive exercise

For your next meaningful decision, before you start deliberating about which, spend two minutes on what kind.
Write the decision down in one sentence. Then ask: is this a decision I could solve if I had more information? (If yes, the next move is research, not reflection.) If not, is this a decision about which of two real goods I want? (If yes, the move is to clarify which good actually matters to you, in this life, right now.) If not, is this a decision about who I want to be? (If yes, the move is values clarification, which is slower and which no algorithm can shortcut.)
Most decisions you struggle with sit on the second or third substrate. Most of the time you spend on them is spent in the first one, hunting for information that wouldn't actually settle anything.
Sources
- Gigerenzer, G., & Goldstein, D. G. (1996). Reasoning the fast and frugal way: Models of bounded rationality. Psychological Review, 103(4), 650–669. https://doi.org/10.1037/0033-295X.103.4.650
- Iyengar, S. S., & Lepper, M. R. (2000). When choice is demotivating: Can one desire too much of a good thing? Journal of Personality and Social Psychology, 79(6), 995–1006. https://doi.org/10.1037/0022-3514.79.6.995
- Klein, G. (2008). Naturalistic decision making. Human Factors, 50(3), 456–460. https://doi.org/10.1518/001872008X288385
- Scheibehenne, B., Greifeneder, R., & Todd, P. M. (2010). Can there ever be too many options? A meta-analytic review of choice overload. Journal of Consumer Research, 37(3), 409–425. https://doi.org/10.1086/651235
- Tversky, A., & Kahneman, D. (1974). Judgment under uncertainty: Heuristics and biases. Science, 185(4157), 1124–1131. https://doi.org/10.1126/science.185.4157.1124
- Legge, J. (trans.) (1882). The I Ching, or Book of Changes. Sacred Books of the East, vol. 16. Oxford: Clarendon Press. [Public domain. Cross-reference for the Hexagram 5 (需, Xū) Image text quoted above.]