Most people do not fully choose the work they do.
That is one reason I have never liked the line "do something you love and you will never work a day in your life." It is too sugary for reality. It flatters the lucky, shames the constrained, and misunderstands where meaning often comes from. Meaning is not the absence of effort. It is often found on the way through something difficult. Meaning exists on the journey to the other side of hard.
We make it difficult, though, for people grappling honestly with reality to say that aloud. The useful lessons are often the least marketable. Motivation gets sold as sugar. Hardship gets edited into slogans. The rough edges are sanded off so the message can travel. We package encouragement as if truth must always come sweetened. But truth rarely arrives clean. And truth without space for repair is provocation.
That matters because we increasingly talk about intelligence as though it were exam intelligence. We imagine fixed answers, faster retrieval, cleaner optimisation, fewer mistakes. We talk about AI as if it were a giant answer engine arriving to solve the world. That framing is wrong in at least two ways. First, many of the most important human problems are not fixed-answer problems. They are relational, iterative, moral, institutional, and contested. Second, we often misunderstand understanding itself. At the heart of insight there is more stillness, randomness, and infinity than we care to admit.
That may sound mystical, but it is practical. Some of the deepest forms of knowing are tacit. They sit in judgment, timing, feel, pattern recognition, and local knowledge before they ever become explicit speech. Adam Smith saw that local exchange contains information that no central planner can fully gather. Markets, under the right conditions, can release knowledge that nobody possesses in full. They help coordinate human action without requiring anyone to know everything.
That is one reason money matters more than moralists often admit. Money is not just greed tokens. Under the right conditions, it acts like crude artificial intelligence. It is a distributed signalling system. It coordinates effort, reveals scarcity, creates incentives, and allows dispersed people to respond to changing conditions. But only under quite demanding conditions. Rule of law matters. Creation matters more than extraction. Consequences matter. If those conditions break down, money stops being a decent release valve for tacit knowledge and starts becoming a tool of capture, distortion, and exclusion.
You can see both truths at once. In some parts of the economy, markets work remarkably well. Things get better and cheaper. Discovery happens. In other parts, the opposite seems to happen. Healthcare, law, education, and housing often feel like markets designed to fail. Prices rise faster than quality. Access narrows. A healthy market should usually make at least some things cheaper or better. When the price of something keeps rising without obvious improvement, we should at least ask whether the market is coordinating well or merely protecting incumbents, credential barriers, and administrative layers.
This is where David Graeber's critique in Bullshit Jobs becomes useful. We cannot deify struggle and work as if they are inherently noble. That has always been a convenient story for those who benefit from other people's constrained choices. Some friction has value, but not all friction is virtuous. Some difficulty deepens craft and meaning. Some difficulty is waste. Some is domination. Some is bureaucracy defending itself.
That is part of what interests me about AI. The disruption is not only that machines may perform some tasks better or cheaper. The deeper question is what happens if price discovery for monetisable problem solving ceases to be the engine that funds and filters so much of our other problem solving. We already do a vast amount of work for free. Parenting. Friendship. Open source. Volunteerism. Community care. Art. Thought. Most of life's meaning-rich activity is either badly paid or not paid at all. Money has never captured everything important. But it has still acted as a powerful filter for where effort, legitimacy, and attention go.
So what happens if that money filter starts to break down?
Markets have long served as one of our best decentralised coordination mechanisms under conditions of ignorance. People do not know the whole menu. Chance encounters, price signals, and partial knowledge lead to discovery. It is messy, but it works often enough to beat central planning. But what if artificial intelligence becomes a better coordination layer than the chance encounters of the market? What if it can surface options, match needs, reveal paths, and coordinate effort more effectively than the old mix of price, search cost, and luck?
Then the old binary of market versus central planning starts to wobble.
We may be facing a third mode: machine-mediated coordination. Not a commissar directing the economy from above. Not pure market wandering below. Something stranger. A layer of prediction, recommendation, matching, and optimisation sitting between people and possibilities. Something that can show you opportunities you never knew existed, collaborators you would never have encountered, treatments you would never have considered. In some domains that could be extraordinary. It could lower waste, reduce search costs, and widen access.
But better coordination is not automatically better society.
We talk about AI as though it were singular. What if it is multiversal, as we are, if not more? Many models, many agents, many incentives, many institutional wrappers, many owners, many hidden goals. The need for rule of law does not disappear in that world. It becomes more important. Audit trails matter more. Institutional checks and balances matter more. If intelligence becomes more diffuse and more active in coordination, governance has to grow with it. This is not a case for central planning. It is a case for constitutional seriousness.
Nor should we become too Pollyannaish about optimisation. There must be space for chance, for wandering, for eccentricity and dissent. Recommendation systems that become too effective may start narrowing life under the guise of helping. If every path is pre-coordinated, some forms of discovery disappear. Soft destiny replaces open search.
At the same time, we should not romanticise the old world just because it had friction. Friction is not sacred. We should not defend waste merely because it feels human. Some problems need solving. Some jobs deserve to disappear. Some professional mystique deserves to be punctured. The point is to distinguish between frictions that protect freedom and frictions that protect control.
That may be one of the central questions of the next era.
Which frictions are waste? Which are domination? Which are the price of reality? Which protect freedom, meaning, and plurality?
The answer will not be singular because the world is not singular. The future coordination layer, whatever shape it takes, will not remove the need for judgment. It will intensify it.
Money was always a crude intelligence. Powerful, but crude. AI may become a more powerful coordination mechanism in some domains. It may release new forms of tacit knowledge while distorting others. It may reduce some waste while generating new dependencies. It may widen the menu while narrowing the soul. The real question is not whether optimisation wins. It is what kind of society we are coordinating towards, and what we refuse to optimise away.
Because if meaning still exists on the other side of hard, then the task is not to abolish all difficulty.
It is to stop confusing meaningful difficulty with needless suffering.
It is to keep space for truth to reveal itself slowly.
And it is to make sure that in a world of stronger coordination, there is still room for law, repair, dignity, and chance.
