Beating the system

is an excellent book by Russell Ackoff and Sheldon Rovin (ISBN 1-57675-330-1). As usual I'm going to quote from a few pages:
Beating the system can be a serious and occasionally risky business; some courage is needed.
Bureaucrats are usually empowered to say no to even reasonable requests, but they cannot say yes to them. This requires passing requests to a higher authority. Saying no inflates bureaucrats' self-images.
Nothing beats knowing how the system you're trying to beat actually works because the chances are that no one else knows, with the possible exception of secretaries.
We do not try to cure headache by performing brain surgery. Rather, we put a pill in our stomach.
The usual way of doing things often gets in the way of doing things.
Time is our only absolutely nonrenewable and, thus, most highly valued resource. To place a low value on another's time is to show a lack of respect for that person.
Persistence is a fundamental attribute of a system beater.
In most workplaces explicit and implicit assumptions often constrain behaviour.
Implicit assumptions lead to behaviours that are carried out automatically, without thought, and these behaviours constitute an organisation's or a society's culture. Culture is what we do when we do not consciously decide what to do.
Most assumptions made in and about organisations usually go unquestioned, and their validity is taken to be self-evident, despite Ambrose Bierce's (1967, 289) admonition that "self-evident" means evident to oneself and no one else. "Obvious" does not mean "requiring no proof" but "no proof is desired."
No particular virtue exists in doing things the way they've always been done and in thinking about things in the way they've always been thought about.
Undesirable customs persist because they are tolerated without thought.
Effective police officers think like criminals; effective criminals think like police officers.
The major obstruction between a person and what he or she wants is not "out there" but in the person's mind.
A problem tends to be placed into the discipline of the one who first identifies it.
Problems are not defined by disciplines, although disciplinarians think so.

Schlumberger Japan

I visited Japan for the first time recently. I taught an Object Oriented Analysis and Design course for Schlumberger at Machida. My host there, a lovely man called Shin'ichi Watanabe, made me feel very welcome. He took me to a proper Japanese restaurant (which seated at most about 8 people) where you had to take your shoes off. The food was delicious, as was the sake.

Leverage points

is an excellent online PDF by Donella Meadows. As usual here are some snippets that spoke to me:
Counterintuitive. That's Forrester's word to describe complex systems.
Parameters are the points of least leverage on my list of interventions.
That's the difference between a lake and a river. You hear about catastrophic river floods much more often than catastrophic lake floods, because stocks that are big, relative to their flows, are more stable than small ones.
Often you can stabilise a system by increasing the capacity of a buffer. But if a buffer is too big, the system becomes inflexible. It reacts too slowly.
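
Her lake-and-river point is easy to check numerically. Here's a toy Python sketch of my own (not from the paper, and the numbers are arbitrary): one shared noisy flow drives a big stock and a small one, and the small stock's level swings far more relative to its size.

```python
# Toy sketch (mine, not Meadows's): the same noisy net flow drives a big
# stock (a lake) and a small stock (a river). Relative to its own size the
# small stock swings far more, which is why rivers flood and lakes rarely do.
import random

random.seed(1)
flows = [random.gauss(0, 1) for _ in range(1000)]   # one shared noisy net flow

def peak_level(stock):
    """Return the peak level reached, relative to the starting level."""
    level = peak = stock
    for flow in flows:
        level += flow
        peak = max(peak, level)
    return peak / stock

print(f"lake  (stock 1000): peak {peak_level(1000):.2f}x its normal level")
print(f"river (stock 10):   peak {peak_level(10):.2f}x its normal level")
```
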
If you're trying to adjust a system state to your goal, but you only receive delayed information about what the system state is, you will overshoot and undershoot.
Even with immense effort at forecasting, almost every centralised electricity industry in the world experiences long oscillations between overcapacity and undercapacity. A system just can't respond to short-term changes when it has long-term delays.
Delays that are too short cause overreaction, "chasing your tail", oscillations amplified by the jumpiness of the response. Delays that are too long cause damped, sustained, or exploding oscillations, depending on how much too long. At the extreme they cause chaos. Overlong delays in a system with a threshold, a danger point, a range past which irreversible damage can occur, cause overshoot and collapse.
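
That delay effect is simple to reproduce. Another toy Python sketch of my own (the gain and delay values are made up): a stock is steered toward a goal, but the steering sees the stock's level as it was a few steps earlier. With no delay it settles smoothly, with a short delay it overshoots and rings, and with a long one the oscillations explode.

```python
# Toy sketch (mine, not Meadows's): steer a stock toward a goal of 100,
# acting on delayed information about where the stock is. No delay
# converges smoothly; a short delay overshoots and oscillates; a long
# delay produces exploding oscillations.
def adjust(delay, gain=0.5, goal=100.0, steps=40):
    levels = [0.0]
    for _ in range(steps):
        # we respond to where the stock *was*, `delay` steps ago
        perceived = levels[max(0, len(levels) - 1 - delay)]
        levels.append(levels[-1] + gain * (goal - perceived))
    return levels

for delay in (0, 2, 6):
    levels = adjust(delay)
    print(f"delay {delay}: peak {max(levels):.0f}, final {levels[-1]:.0f}")
```
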
The great push to reduce information and money transfer delays in financial markets is just asking for wild gyrations.
They [feedback loops] may not be very visible. But their presence is critical to the long-term welfare of the system. One of the big mistakes we make is to strip away these "emergency" response mechanisms because they aren't used often and they appear to be costly.
Democracy worked better before the advent of the brainwashing power of centralised mass communications.
A global economy makes necessary a global government.
Reducing the gain around a positive loop - slowing the growth - is usually a more powerful leverage point in systems than strengthening negative loops.
Missing feedback is one of the most common causes of system malfunction.
As the fish get more scarce and hence more expensive, it becomes more profitable to go out and catch them. That's a perverse feedback, a positive loop that ends in collapse.
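
That perverse loop also simulates nicely. A final toy Python sketch of my own (every coefficient is invented): regrowth is logistic, price rises with scarcity, and effort compounds with price, so the catch accelerates as the stock shrinks.

```python
# Toy sketch (mine, not Meadows's) of the perverse fishery loop: scarcity
# raises the price, the higher price attracts more fishing effort, and the
# extra effort makes the fish scarcer still; a positive loop ending in collapse.
stock, effort = 1000.0, 1.0                        # fish in the sea, boats at work

for year in range(1, 101):
    stock += 0.2 * stock * (1 - stock / 1000.0)    # natural regrowth
    price = 1000.0 / max(stock, 1.0)               # scarcer fish cost more
    effort *= 1.0 + 0.2 * (price - 1.0)            # profit attracts more boats
    stock -= min(stock, 0.05 * effort * stock)     # the catch
    if stock < 1.0:
        print(f"year {year}: fishery collapses (effort {effort:.0f})")
        break
    if year % 10 == 0:
        print(f"year {year}: stock {stock:.0f}, effort {effort:.1f}")
```
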
Power over the rules is real power.
The most stunning thing living systems and social systems can do is to change themselves utterly by creating whole new structures and behaviours. In biological systems that power is called evolution. In human society it's called technical advance or social revolution. In systems lingo, it's called self-organization.
The ability to self-organize is the strongest form of system resilience. A system that can evolve can survive almost any change, by changing itself.
One aspect of almost every culture is the belief in the utter superiority of that culture.
Societies… resist challenges to their paradigm harder than they resist anything else.
The power to transcend paradigms… It is to let go into Not Knowing, into what Buddhists call enlightenment.
It is in this space of mastery over paradigms that people throw off addictions, live in constant joy, bring down empires, found religions, get locked up or "disappeared" or shot, and have impacts that last for millennia.