[ID] => 10149
[post_author] => 34
[post_date] => 2018-09-25 10:32:26
[post_date_gmt] => 2018-09-25 09:32:26
[post_content] => By Arend van Campen
I recently read an interesting article by Richard A Clarke and RP Eddy called ‘Why visionaries who can accurately predict looming disasters are often ignored’.
The authors wrote: ‘Cassandra was a beautiful princess of Troy cursed by the god Apollo. He gave her the ability to foresee impending doom but the inability to persuade anyone to believe her.’
I could not help feeling touched by this article, as it confirmed exactly what I am experiencing myself. I have been writing this column and my blog for some years now, but despite the truthfulness of their predictions, grounded in logic and empirical research, most of what I write seems to be ignored.
The column which I write for HCB and my publications on LinkedIn are well researched. I trained myself as a cybernetician and a systems scientist because, by combining scientifically sound insights obtained from these disciplines, I learned how to predict looming disasters. I even developed a measurement or mapping tool to do this. It measures the so-called limitations of reality, for which I coined the term ‘Realimiteit’.
What Clarke and Eddy write about Cassandra is that, although her predictions were sound and logical, people did not want to listen to them; neither was she able to convince them that she was right. Thus Troy was destroyed after the Trojan Horse was allowed to enter the city.
Information, in this case in the form of a prediction, even one of apparently negligible value, must still be contemplated when decisions are made. Disregarding even a small part of it will alter the outcome beyond control (the ‘Butterfly Effect’). The suppression of information for ulterior motives - for example, a prediction based on science, an observation that is deemed irrelevant, or suggestions by those who are able to foresee things - directly destabilises intended results and makes them uncontrollable. Predictions and facts should be weighed and used as feedback in order to steer an organisation, because only then can it learn and adapt.
Cassandra knew this, as do many philosophers who ask ‘all’ the questions and can reliably synthesise what is going to happen next. The same applies to our hazardous cargo industries: HSE failures and risks can only be prevented by using all available information and by listening to Cassandras. Perhaps Cassandra was an early cybernetician? I feel she certainly was a ‘systems thinker’.
An example: the Macondo Oil Spill Disaster, which cost BP about $62bn, was in fact caused because information about the instability of the foundation of the cemented wellhead/Christmas tree was ignored. People and instruments warned about it, but these predictions were disregarded. Earlier I explained the value of requisite variety and the creation and sustenance of viable systems. This means that any operation, business or process undertaken can be maximally controlled by accepting even the smallest pieces of information and using them to govern our intentions. This will prevent disaster, because catastrophe can then be predicted through epistemological knowledge drawn from systems science and cybernetics. If you happen to know a Cassandra, please listen to her.
This is the latest in a series of articles by Arend van Campen, founder of TankTerminalTraining. More information on the company’s activities can be found at www.tankterminaltraining.com. Those interested in responding personally can contact him directly at firstname.lastname@example.org.
[post_title] => Learning by Training: The Cassandra Effect
[post_status] => publish
[comment_status] => open
[ping_status] => open
[post_name] => learning-training-realimiteit
[post_modified] => 2018-09-25 12:39:58
[post_modified_gmt] => 2018-09-25 11:39:58
[post_parent] => 0
[guid] => https://www.hcblive.com/?p=10149
[menu_order] => 0
[post_type] => post
[comment_count] => 0
[filter] => raw