Amber is the warning colour in traffic lights: sitting between red for 'stop' and green for 'go', it tells us to take care. Complex systems have their equivalent of those amber lights, which similarly warn us against taking our normal behaviours for granted. Complex systems, especially human ones, embody many values, each of which matters to the behaviour of the whole. Ignoring some of these values whilst maximising selected others can cause the complete collapse of the system. This follows from the interdependence of our values: we cannot, as is done in traditional science, treat them in isolation (in 'controlled' experiments) and assume that they all behave independently, since as soon as we put them back together all hell breaks loose!
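To make the point concrete, here is a minimal sketch in Python; the welfare function, the coupling rule and every number in it are invented purely for illustration, not taken from any real system. In a 'controlled experiment' with one value pinned in place, pushing the other always looks beneficial; recouple the two and maximising that value collapses the outcome instead.

    # Hypothetical toy: two interdependent 'values' A and B.
    def welfare(a, b):
        # The outcome depends on the values jointly (an interaction
        # term), not on each one separately.
        return 4 * a * b

    def b_coupled(a):
        # In the live system B is not independent: gains in A erode B.
        return 1.0 - a

    print("isolated (B artificially held at 1):")
    for a in (0.0, 0.25, 0.5, 0.75, 1.0):
        print(f"  A={a:.2f}  welfare={welfare(a, 1.0):.2f}")

    print("coupled (B responds to A):")
    for a in (0.0, 0.25, 0.5, 0.75, 1.0):
        print(f"  A={a:.2f}  welfare={welfare(a, b_coupled(a)):.2f}")

Held in isolation, welfare rises steadily with A (0, 1, 2, 3, 4); recoupled, it peaks at A = 0.5 and falls to zero at the 'maximum', exactly the reversal that the controlled experiment could never reveal.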
Nonlinearities, arising from these interconnections, often lead the system to behave in quite the
opposite way to that predicted from 'experiment'. If we then escalate our intervention in order to 'correct' the apparent shortfall in achievement, we simply drive the system ever harder along the road to disaster and collapse...
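This escalation trap can likewise be sketched as a toy feedback loop; again every parameter (the logistic regrowth rate, the 30% escalation step, the yield target) is an assumption chosen only to illustrate the dynamic. A manager chases a fixed yield from a self-renewing stock, and each shortfall is 'corrected' by pushing harder, which depletes the stock and deepens the next shortfall:

    # Hypothetical toy: escalating effort against a renewable stock.
    def step(stock, effort, r=0.5, capacity=1.0):
        # Logistic regrowth minus harvest; yield is effort * stock.
        harvest = effort * stock
        stock += r * stock * (1.0 - stock / capacity) - harvest
        return max(stock, 0.0), harvest

    stock, effort, target = 0.8, 0.3, 0.20
    for t in range(12):
        stock, harvest = step(stock, effort)
        print(f"t={t:2d}  effort={effort:.2f}  stock={stock:.3f}  yield={harvest:.3f}")
        if harvest < target:
            effort *= 1.3  # 'correct' the shortfall by pushing harder

Within a dozen steps the stock is exhausted and the yield is zero, however high the effort climbs: each 'correction' accelerates the very collapse it was meant to prevent.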