"Are We Missing the Point of Risk Management Activities?"


The focus of this article is how organizations apply risk guidance (ISO 31000, FFIEC, etc.): too often the result is a checkbox appearance of compliance rather than the active identification and management of actual risks.

In Risk Management: History, Definition and Critique (March 2013, CIRRELT-2013-17), Georges Dionne opens the abstract with a revealing statement:

“Risk management has long been associated with the use of market insurance to protect individuals and companies from various losses associated with accidents. Other forms of risk management, alternatives to market insurance, surfaced during the 1950s when market insurance was perceived as very costly and incomplete for protection against pure risk. The use of derivatives as risk management instruments arose during the 1970s, and expanded rapidly during the 1980s, as companies intensified their financial risk management. International risk regulation began in the 1990s, and financial firms developed internal risk management models and capital calculation formulas to hedge against unanticipated risks and reduce regulatory capital. Concomitantly, governance of risk management became essential, integrated risk management was introduced and the first corporate risk officer positions were created. Nonetheless, these regulations, governance rules and risk management methods failed to prevent the financial crisis that began in 2007.”

We see, all too often, organizations complying with conflicting regulatory guidance in order to avoid fines and maintain the appearance of compliance regardless of the cost (lost opportunities, forgone gains, fear of fines, etc.).  Yet few of these organizations actually consider the cost of poor risk management practices.

Why did the crisis cause such large op risk losses?

A Google search I ran on crisis and operational risk loss is telling: the top five results focused mainly on the 2008 financial crisis.

Could one assume that only the financial industry is concerned with risk management, or that other industry segments do not suffer operational risk losses?  Of course, either assumption would be ludicrous.  Perhaps what we are seeing is a lack of differentiation when it comes to operational risk losses.  That is to say, we may need to think of operational risk as a hierarchy rather than a horizontal delineation.  I would suggest that organizations create three tiers of risk: Strategic, Operational and Tactical.  One could then create matrices, expanded over time, that provide a linkage between and among the three tiers.
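Such a tier linkage could be sketched as a simple traceability mapping. The risk names below are hypothetical and purely illustrative, not drawn from any standard taxonomy:

```python
# Hypothetical, illustrative risk names: a minimal traceability mapping
# from strategic risks down through operational to tactical risks.
strategic = {
    "competitive": ["process_failure", "talent_loss"],
    "technology":  ["it_outage"],
}
operational = {
    "process_failure": ["order_backlog"],
    "talent_loss":     ["order_backlog"],
    "it_outage":       ["website_down"],
}

def trace(strategic_risk):
    """List the tactical risks a strategic risk can cascade into."""
    tactical = []
    for op_risk in strategic.get(strategic_risk, []):
        tactical.extend(operational.get(op_risk, []))
    return sorted(set(tactical))

print(trace("competitive"))  # ['order_backlog']
```

In practice each tier would hold many more entries, but even a toy mapping like this makes the cross-tier linkage explicit and queryable.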

Before we go further, some definitions, which may or may not clarify things, should be presented:

Strategic Risk:  A possible source of loss that might arise from the pursuit of an unsuccessful business plan.  For example, strategic risk might arise from making poor business decisions, from the substandard execution of decisions, from inadequate resource allocation, or from a failure to respond well to changes in the business environment.  The following list is drawn from the Simplicable post “22 Strategic Risks” by Anna Mar (February 2, 2013):


1. Corporate Governance Risk: The risk that insiders (employees) won't act in the best interest of the owners (stockholders) of a firm.


2. Strategy Execution Risk: The risk that business strategy execution will fail.


3. Strategy Forecast Risk: The risk that your business strategy will be off the mark. For example, invalid sales forecasts.


4. Competitive Risk: The risk of a decline in competitive advantage.


5. Innovation Risk: An inability to innovate (failed innovation investments). Some firms struggle to establish an innovation culture.


6. Technology Risk: The risk that your technology strategy will fail. For example, that your technology KPIs will fall behind the competition.


7. Intellectual Property Risk: The risk of intellectual property loss and liability.


8. Merger & Acquisition Risk: Integrating firms is almost always a high-risk activity.


9. Change Management Risk: The risks associated with organizational change.


10. Program & Project Risk: The risks associated with program & project failures. In some industries more than 50% of projects fail.


11. Marketing & Sales Risks: The risk that marketing and sales forecasts and metrics will fall short of expectations. For example, the risk of new product development failure.


12. Operational Risks: The risk of operations failures. For example, the risk that logistical problems will cause orders to be canceled.


13. Talent Management Risk: The risk of losing key talent to the competition.


14. Security Risk: The risk of an information security incident. Information security incidents can damage reputation, cause compliance issues and result in the loss of intellectual property.


15. Liability Risk: The risk that your products, services or corporate execution leads to legal liability issues.


16. Compliance Risk: The risk of non-compliance with regulations and law.


17. Sustainability Risk: The risk of missing sustainability targets or non-compliance with environment laws and regulations. Sustainability is increasingly important to reputation. It's a central theme of the principles and ethics of many firms.


18. Reputational Risk: The risk of bad publicity or negative relationships with employees, customers, partners, counterparties and regulators. Reputational risk can be a serious threat to the assets of a firm.


19. Financial Risk: Risks to the financial health of your firm. For example, the risk that you'll be unable to raise sufficient capital to fund operations.


20. Systemic Risk: The risk of collapse of the global financial system or the financial system of a country.


21. Political Risk: The risk that the political environment will turn hostile to your firm.


22. Force Majeure: A catastrophe such as an act of nature or war.


Operational Risk:


The Basel II Committee defines operational risk as: "The risk of loss resulting from inadequate or failed internal processes, people and systems or from external events."  According to Simplicable, operational risk is the chance of a loss due to the day-to-day operations of an organization.  Every endeavor entails some risk; even highly optimized processes generate risk.  Operational risk can also result from a breakdown of processes or from exceptions that standard processes do not handle.  It should be noted that some definitions of operational risk suggest it is only the result of insufficient or failed processes.  However, risk can also result from processes that are deemed sufficient and successful.  In a practical sense, organizations choose to take on a certain amount of risk with every process they establish.  The following are a few examples of operational risk.


1. Human Error: A mechanic leaves a tool inside a jet engine, resulting in the blowout of the engine during flight. The aircraft is able to return to the airport, but the passengers are shaken, the airline's reputation is damaged, it faces a government investigation and the engine must be completely replaced.


2. Information Technology: A critical network device experiences an error that results in a 4 hour outage for the website of an online retailer. Revenue is lost and customer satisfaction declines for the month.


3. Insufficient Processes: The settlement process for an investment bank is only designed for regular market volume. One day there is a market crash and volume on the stock exchanges spikes to 50x normal. The settlement process fails because it involves manual steps and the bank doesn't have enough trained staff to complete the processes in a timely fashion. Customers are impacted as their orders don't show as settled within the regular time. The bank suffers a loss of reputation with its customers and trading counterparties.


4. Process Failure: A customer service process breaks down due to a lack of training. A number of customers who were entitled to refunds according to local regulations are mistakenly told they do not qualify. The customers complain to regulators who launch an investigation of the company. The company faces fines and negative publicity.


5. Quality Risk: An electronics company establishes a quality assurance process that catches 99.99% of defects in their vacuum cleaners. They therefore expect that 0.01% of their products will have a minor defect, and they establish a return policy that allows customers to get a replacement product should they discover a problem. The company budgets for such returns in its cost forecasts.
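The budgeting arithmetic in that example can be sketched as follows; the shipment volume and replacement cost are invented for illustration:

```python
# Hedged sketch of the return-budgeting arithmetic; the shipment volume
# and replacement cost below are invented for illustration.
units_shipped = 1_000_000
defect_escape_rate = 0.0001   # QA catches 99.99% of defects
replacement_cost = 80.0       # assumed cost per replacement unit

expected_returns = round(units_shipped * defect_escape_rate)  # units expected back
returns_budget = expected_returns * replacement_cost          # money set aside

print(expected_returns, returns_budget)  # 100 8000.0
```

The point of the example is that the residual 0.01% is a risk the company knowingly retains and prices in, rather than a failure of its process.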


Tactical Risk:


Tactical risk is the chance of loss due to changes in business conditions in real time.  Tactics differ from strategy in that they handle real-time conditions: a strategy is a plan for the future, while a tactic is a plan for handling real-world conditions as they unfold.  As such, tactical risk is associated with present threats rather than long-term conditions.  Tactics and strategy are both military terms.  Military organizations primarily view tactical risk as the conditions on a battlefield; an army may identify strategic risks before a battle, but tactical risks can only be identified as they unfold.  The following are a few examples of tactical business risks.


1. Credit Risk: A bank extends an unsecured line of credit to a large electronics firm based on the company's healthy financial condition. A few months later, the electronics company suddenly issues a press release saying that they have discovered unidentified accounting irregularities that will have a material impact to its financial condition. Based on the press release, the bank identifies the credit line as a high risk account and freezes it before more withdrawals can take place.


2. Market Risk: A farmer reconsiders plans to plant her fields with corn when prices drop on the futures market.


3. Information Technology Risk: A bank uses a series of network routers that are identified as having a major security vulnerability. They quickly develop a plan to address the vulnerability by rerouting traffic and patching the routers.


4. Competitive Risk: A solar panel manufacturer receives news that their biggest competitor is ready to commercialize a new type of solar cell that has a high conversion efficiency and is inexpensive to manufacture.


5. Marketing Risk: An airline releases a new ad campaign with a catchy slogan. A popular internet meme suddenly appears that makes fun of the slogan and suggests that the company has poor customer service. The company quickly ends the campaign in response.


6. Health & Safety Risk: A company is contacted by three employees from the same office location who say that they have been diagnosed with a communicable disease. The company quickly contacts all staff who work at the location to inform them of the situation and to ask that they work from home for the rest of the week.


7. Legal Risk: A concert promoter in Japan receives news that a typhoon is heading towards their outdoor summer event. The promoter decides to cancel the concert despite the high costs of refunding tickets in order to avoid legal risks associated with injuries from the typhoon.


Confused by the definitions and examples?  Don’t worry, you are not alone.  Few seem to agree on standard definitions and examples tied to the level of risk (strategic, operational or tactical) being discussed, assessed or defined.  It can seem as if there are as many risk definitions and examples as there are risks, not to mention the seven-billion-plus people who inhabit our globe.


According to Dionne’s Risk Management: History, Definition and Critique, there are five main risks:


Pure risk (insurable or not, and not necessarily exogenous in the presence of moral hazard);

Market risk (variation in prices of commodities, exchange rates, asset returns);

Default risk (probability of default, recovery rate, exposure at default);

Operational risk (employee errors, fraud, IT system breakdown);

Liquidity risk (the risk of not possessing sufficient funds to meet short-term financial obligations without affecting prices; may degenerate into default risk).


ISO 31000:2009 lists the following options for dealing with risk:


Avoiding the risk by deciding not to start or continue with the activity that gives rise to the risk

Accepting or increasing the risk in order to pursue an opportunity

Removing the risk source

Changing the likelihood

Changing the consequences

Sharing the risk with another party or parties (including contracts and risk financing)

Retaining the risk by informed decision
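These options differ mainly in whether they act on a risk's likelihood, its consequence, or who bears it. One way to see that is a minimal expected-loss model (expected loss = likelihood × consequence); the sketch below is illustrative only, and all figures and reduction factors are invented:

```python
# Illustrative expected-loss model (expected loss = likelihood x consequence);
# all figures and reduction factors are invented for illustration.
likelihood = 0.10        # assumed annual probability of the risk event
consequence = 500_000.0  # assumed loss if the event occurs

baseline = likelihood * consequence                   # retaining the risk
lower_likelihood = (likelihood * 0.5) * consequence   # "changing the likelihood"
lower_consequence = likelihood * (consequence * 0.4)  # "changing the consequences"
shared = baseline * 0.5   # "sharing the risk" (e.g., an insurer takes half)
avoided = 0.0             # "avoiding the risk" forgoes the activity entirely

for name, value in [("retain", baseline),
                    ("reduce likelihood", lower_likelihood),
                    ("reduce consequence", lower_consequence),
                    ("share", shared),
                    ("avoid", avoided)]:
    print(f"{name}: {value:,.0f}")
```

Note that avoidance drives the expected loss to zero only on paper; as discussed below, forgoing the activity carries its own risk that this simple model does not capture.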

Risks, Black Swans, Chaos and Confusion – What’s a Person to do?

Recognizing risk and putting it into a context that is meaningful to your organization may be the first step in finding some focus regarding risk and risk management activities.  While we have to live with the regulations and the guidance; we do not have to blankly embrace them.  We can approach the risk spectrum (from identification to ranking) with a broader, and perhaps, more rational focus.

ISO 31000:2009’s first bullet point above says that we should avoid the risk “by deciding not to start or continue with the activity that gives rise to the risk”.  But avoidance can be a risk in itself: deciding not to pursue the activity can create even greater risk and puts us in the quandary of “decision paralysis”, which is itself a risk.  The guidance, then, appears conservatively flawed.  Wouldn’t it be better to buffer against the risk as you pursue the activity?

And what about “Black Swan” events?  Nassim Taleb, author of “The Black Swan: The Impact of the Highly Improbable”, defines one as follows:

  • “A black swan is a highly improbable event with three principal characteristics: it is unpredictable; it carries a massive impact; and, after the fact, we concoct an explanation that makes it appear less random, and more predictable, than it was.”

There is a general lack of knowledge when it comes to rare events with serious consequences, precisely because such events occur so rarely.  In his book, Taleb states that “the effect of a single observation, event or element plays a disproportionate role in decision-making creating estimation errors when projecting the severity of the consequences of the event.  The depth of consequence and the breadth of consequence are underestimated resulting in surprise at the impact of the event.”

ISO 31000:2009, COSO and other guidance do not appear to address this issue at all, although ISO 31000:2009 does list “changing the likelihood” and “changing the consequences” as two ways to address risk.  Since a “Black Swan” is by nature unpredictable, changing its likelihood is going to be difficult.  Changing the consequences is a valid goal, but it requires a continuity plan that is not incident-specific, one built instead on an “all hazards” strategy.  “All hazards” strategies are philosophically appealing; the reality, however, is that we never have a clear picture of all the hazards an organization faces (i.e., unknown unknowns).

It is therefore prudent to constantly assess the risk landscape and to be able to link apparently unrelated aspects into a mosaic of risk complexity that can be addressed at multiple levels.  In probability theory and mathematical physics, a random matrix is a matrix-valued random variable, that is, a matrix some or all of whose elements are random variables.

The power of infinite random matrix theory comes from being able to systematically identify and work with non-crossing partitions (search for Prof. Alan Edelman’s random matrix theory course materials for more information and a helpful figure).

It would be wise to begin to consider random matrix theory with respect to risk identification and assessment.  The complexity we face today in a globalized society is that risks are shared more and more, even though we have less and less awareness of the manner in which they are shared.  A good example is the international supply chain system.  While each organization has its own supply chain, the combination of all organizations’ supply chains creates an entirely different risk exposure.  Just think in terms of movement in the supply chain.  Shippers (air, rail, ship, overland, etc.) are all trying to maximize their resources from an efficiency perspective.  This has led the shipping industry to build mega-container ships, which require different portage and logistics capabilities.  Shippers are handling multiple organizations’ supply chains, moving products to a wide audience of customers.  While efficiency is increased, risk is also greater due to the potential for a “single point of failure” resulting in the loss of the ship and cargoes.
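As a toy illustration of the random-matrix idea (simulated data, assuming NumPy is available), one can compare the top eigenvalue of a correlation matrix of independent "risk indicators" against indicators coupled through one shared factor, such as a common supply chain. An eigenvalue standing far above the random bulk suggests a genuine shared risk factor rather than noise:

```python
# Toy random-matrix comparison on simulated data (assuming NumPy):
# an eigenvalue far above the random bulk suggests a genuine shared
# risk factor rather than noise.
import numpy as np

rng = np.random.default_rng(0)
n_obs, n_risks = 500, 20

# Independent indicators: no shared factor.
noise = rng.standard_normal((n_obs, n_risks))
noise_top = np.linalg.eigvalsh(np.corrcoef(noise, rowvar=False)).max()

# Indicators partly driven by one common factor (e.g., a shared supply chain).
factor = rng.standard_normal((n_obs, 1))
coupled = 0.6 * factor + 0.8 * rng.standard_normal((n_obs, n_risks))
coupled_top = np.linalg.eigvalsh(np.corrcoef(coupled, rowvar=False)).max()

print(noise_top, coupled_top)  # the coupled data's top eigenvalue is far larger
```

This is a sketch of the intuition only; applying it to real risk data would require careful choice of indicators and comparison against the appropriate random-matrix bounds.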

Risk Questions that the Board Should Ask

Let’s turn our attention to the Board of Directors.  NACD, Deloitte and other organizations are producing a great deal of literature and advice for Boards of Directors.  Recently, I received an e-mail positing questions that the Board should be asking; below are those questions and my responses to each.  We have to start looking at our flawed mental models and their effect on our risk management focus.  Cognitive biases, discounting risks, over-emphasizing risks, failing to see the interconnectivity of risk, and the like clearly affect how risk is perceived and presented to the Board of Directors.

So, here are the questions the Board should (really?) be asking:

Are we clear about the company’s risk appetite and is it communicated effectively?

This question is quite ambiguous and falls into the cognitive bias trap.  Who would answer that their organization is not clear about risk appetite and that the organization suffers from an inability to communicate effectively?

What risks does our current corporate culture create for the organization?

Culture, cognitive biases, groupthink, etc., all create risks for an organization.  But what about the cultures of other organizations?  Suppliers, customers, vendors and governments all impact your corporate culture, and you need to be aware of their cultures with respect to your risks.

How do we acknowledge and live our stated corporate values when addressing and resolving risk dilemmas?

Ethics, morals and shared values are deteriorating, not just in the USA but worldwide.  A recent 60 Minutes segment (6 November 2016) featured a professional pollster who expressed concern that people no longer want to listen and learn; they want to push their perspective and vent their frustrations.  Organizations really need to take a hard look at their values and assess how well those values are internalized.

How do we actively seek out information on risk events and near misses – both ours and those of others – and ensure key lessons are learnt?

Let’s face it, no one wants to hear bad news.  As a result, a lot of “near miss” situations go unreported; more importantly, many go unrecognized by organizations and individuals.  We too often fail to capture lessons learned and, when we do, we all too often fail to communicate their importance.

How do we reward and encourage appropriate risk taking behaviors and challenge unbalanced risk behaviors (either overly risk averse or risk seeking)?

First, what are appropriate risk-taking behaviors?  This is situational and cannot be standardized, as much as we would like it to be.  If I take a risk today and then take the same risk tomorrow, the risk will have changed; my behavior, while acceptable in the first instance, may be totally unacceptable in the second.  What is balanced or unbalanced risk behavior?  This, again, is situational and selective, subject to the cognitive biases of both the observer and the risk-taker.

I will not comment on the following FRC Risk Management Guidance Document points as I think that you can get the flavor of where I am going based on the above comments:

  • the board must determine its willingness to take on risk, and the desired culture within the company;

  • risk management and internal control should be incorporated within the company’s normal management and governance processes, not treated as a separate compliance exercise;

  • the board must make a robust assessment of the principal risks to the company’s business model and ability to deliver its strategy, including solvency and liquidity risks. In making that assessment the board should consider the likelihood and impact of these risks materializing in the short and longer term;

  • once those risks have been identified, the board should agree how they will be managed and mitigated, and keep the company’s risk profile under review. It should satisfy itself that management’s systems include appropriate controls, and that it has adequate sources of assurance;

  • the assessment and management of the principal risks, and monitoring and review of the associated systems, should be carried out as an on-going process, not seen as an annual one-off exercise; and

  • this process should inform a number of different disclosures in the annual report: the description of the principal risks and uncertainties facing the company; the disclosures on the going concern basis of accounting and material uncertainties thereto; and the report on the review of the risk management and internal control systems.

Needless to say, we are faced with a lot of quandaries when addressing risk in accordance with current guidance and regulatory requirements, and new regulations and guidance are being promulgated at a quickening pace.  Staying abreast of it all is challenging and presents a risk in and of itself: you can readily comply with one regulation only to find yourself in conflict with another.

Concluding Thoughts

I will conclude by offering the following seven points:

  • Recognize that your risks are not unique

Risks are shared by every organization, whether directly or indirectly affected.  To treat your risks as unique is to separate your organization from the interconnected world we live in.  If you have a risk, you can be assured that the same risk is shared across your “Value Chain” and by organizations outside it.

  • Whatever you do to buffer the risk has a cascading effect (internally and externally)

Your organization needs to be constantly scanning for changes in the risk environment.  When you buffer a risk, you create an altered risk, and by altering the risk exposure your network (i.e., “Value Chain”) now has to address the cascade effects.  The same is true in reverse: as the “Value Chain” buffers risk, the risk is altered and you face a different exposure.

  • Risk changes as it is being buffered by you and others

As mentioned above, risk changes: it morphs into a different exposure whenever you or others act to buffer against its realization.  Your challenge is to recognize the altered form of the risk and adjust your buffering actions so that your organization is not left with the wrong risk treatment in place.

  • Risk is not static

If we look at commodities trading, we begin to understand speed and its ability to move through an organization rapidly.  Commodities are complexity personified: markets are global in scale and trading is nearly constant (24 hours a day, 7 days a week).  This makes identifying and managing risk a challenge that can be daunting to many.  In many conversations with commodity traders, I came to the conclusion that their ability to see risk as a continuum, constantly changing, opaque and rapid in its impact, creates a mindset of constant vigilance and offsetting of risks.

  • Risk is in the future not the past

During the Cold War between the United States of America and the former Soviet Union, thousands of nuclear warheads were targeted at the antagonists and their allies.  The result was the concept of mutually assured destruction, a term used to convey the idea that neither side could win an all-out war; both sides would destroy each other.  The risks were high, and there was a constant effort to ensure that “noise” was not mistaken for “signal”, triggering an escalation of fear that could lead to a reactive response and devastation.  Those tense times have largely subsided; however, we now find ourselves in the midst of global competition and the need to ensure effective resilience in the face of uncertainty.

We are faced with a new Risk Paradigm: Efficient or Effective?  Efficiency is making us rigid in our thinking; we mistake being efficient for being effective.  Efficiency can lead to action for the sake of accomplishment with no visible end in mind.  We often respond very efficiently to the symptoms rather than the overriding issues that result in our next crisis.  Uncertainty in a certainty seeking world offers surprises to many people and, to a very select few, confirmation of the need for optionality.

It’s all about targeted flexibility, the art of being prepared rather than preparing for specific events.  Being able to respond, rather than relying on forecasts, facilitates early warning and proactive response to unknown unknowns.

I think that Jeffrey Cooper offers some perspective on “the problem of the Wrong Puzzle”: "You rarely find what you are not looking for, and you usually do find what you are looking for."  In many cases the result is irrelevant information.

Horst Rittel and Melvin Webber would define this as a Systemic Operational Design (SOD) problem: a "wicked problem", a social problem that is difficult and confusing, versus a "tame problem", one that is not trivial but is sufficiently understood to lend itself to established methods and solutions.  I think we have a "wicked problem".

As Milo Jones and Philippe Silberzahn write in their book “Constructing Cassandra: Reframing Intelligence Failure at the CIA, 1947–2001” (page 249), “Gresham's Law of Advice comes to mind: ‘Bad advice drives out good advice precisely because it offers certainty where reality holds none.’”

The questions that must be asked should form a hypothesis that can direct analysis.  We currently have a "threat", but it is a very ill-defined one, leading to potentially flawed threat assessments and to the expenditure of effort (manpower), money and equipment that might be better employed elsewhere.  It is a complicated problem that requires a lot of knowledge to solve, and it also requires a social change regarding acceptability.

Experience, it is said, is a great teacher.  However, experience may date you to the point of insignificance; experience is static.  You need to ask, “What is the relevance of that experience to my situation now?”

  • The world is full of risk: diversify


When it comes to building your risk and/or business continuity program, focusing on survivability is the right approach, provided you have thoroughly done your homework and understand what survivability means to the organization.  The risks to your organization today are as numerous as they are acute.  Overconcentration in any one area can result in complete devastation.

  • Recognize Opacity and Nonlinearity

The concept of a “near miss” event, applied or misapplied, has gained more popularity and importance than it should have.  A risk manager's priorities should be based on recognizing the potential consequences of a “near miss” event, not on determining its cause.  While causality is important, today’s complexity and the nonlinearity of events can make determining causality an exercise in frustration.  Instead we need to focus on consequence analysis and recognize that as risks evolve, change begins to occur, collateral factors come into play, and uniqueness is created in the way the risk evolves.  Nonlinear evolution of risks combines with reactions to events to transform potential risk consequences.

I don't disagree that analyzing “near miss” events can benefit the organization and help reduce future risk exposures.  But investigating “near misses” is often hampered by the nonlinearity and opacity of the event itself, rendering any lessons learned less than helpful in reducing risk exposure and, more importantly, risk realization consequences.  A “near miss” will not have exactly the same chain of causality as a risk event that actually materializes into an impact with unforeseen consequences.

Recognizing and analyzing all “near miss” events isn't a realistic option, due in part to the fact that we do not experience events uniformly.  A “near miss” to you might be a non-event to someone who deals with similar situations regularly (see my article “Transparent Vulnerabilities”), as the event becomes transparent to them.


About the Author


Geary Sikich – Entrepreneur, consultant, author and business lecturer

Contact Information: Telephone: 1-219-922-7718.

Geary Sikich is a seasoned risk management professional who advises private and public sector executives to develop risk buffering strategies to protect their asset base.  With a M.Ed. in Counseling and Guidance, Geary's focus is human capital: what people think, who they are, what they need and how they communicate. With over 25 years in management consulting as a trusted advisor, crisis manager, senior executive and educator, Geary brings unprecedented value to clients worldwide.

Geary is well-versed in contingency planning, risk management, human resource development, “war gaming,” as well as competitive intelligence, issues analysis, global strategy and identification of transparent vulnerabilities.  Geary has developed more than 4,000 plans and conducted over 4,500 simulations from tabletops to full scale integrated exercises.  Geary began his career as an officer in the U.S. Army after completing his BS in Criminology.  As a thought leader, Geary leverages his skills in client attraction and the tools of LinkedIn, social media and publishing to help executives in decision analysis, strategy development and risk buffering.  A well-known author, his books and articles are readily available on Amazon, Barnes & Noble and the Internet.



Apgar, David, Risk Intelligence – Learning to Manage What We Don’t Know, Harvard Business School Press, 2006.


Davis, Stanley M., Christopher Meyer, Blur: The Speed of Change in the Connected Economy, (1998).


Dionne, Georges, Risk Management: History, Definition and Critique, March 2013, CIRRELT-2013-17


Edelman, Alan, Random Matrix Theory (infinite random matrices, non-crossing partition figure), MIT Course 18.338J / 16.394J


Jones, Milo and Silberzahn, Philippe, Constructing Cassandra: Reframing Intelligence Failure at the CIA, 1947–2001, Stanford Security Studies (August 21, 2013) ISBN-10: 0804785805, ISBN-13: 978-0804785808


Kami, Michael J., “Trigger Points: how to make decisions three times faster,” 1988, McGraw-Hill, ISBN 0-07-033219-3


Klein, Gary, “Sources of Power: How People Make Decisions,” 1998, MIT Press, ISBN 13 978-0-262-11227-7


Marks, Norman; Time for a leap change in risk management guidance, November 5, 2016


Sikich, Geary W., Graceful Degradation and Agile Restoration Synopsis, Disaster Resource Guide, 2002


Sikich, Geary W., "Integrated Business Continuity: Maintaining Resilience in Times of Uncertainty," PennWell Publishing, 2003


Sikich, Geary W., "Risk and Compliance: Are you driving the car while looking in the rearview mirror?” 2013


Sikich, Geary W., "“Transparent Vulnerabilities”: How we overlook the obvious, because it is too clear that it is there," 2008


Sikich, Geary W., "Risk and the Limitations of Knowledge” 2014


Tainter, Joseph, “The Collapse of Complex Societies,” Cambridge University Press (March 30, 1990), ISBN-10: 052138673X, ISBN-13: 978-0521386739


Taleb, Nassim Nicholas, “The Black Swan: The Impact of the Highly Improbable,” 2007, Random House, ISBN 978-1-4000-6351-2; 2nd Edition 2010, Random House, ISBN 978-0-8129-7381-5


Taleb, Nassim Nicholas, Fooled by Randomness: The Hidden Role of Chance in Life and in the Markets, 2005; updated edition (October 14, 2008), Random House, ISBN-13: 978-1400067930


Taleb, N.N., “Common Errors in Interpreting the Ideas of The Black Swan and Associated Papers;” NYU Poly Institute October 18, 2009


Taleb, Nassim Nicholas, “Antifragile: Things That Gain from Disorder,” 2012, Random House, ISBN 978-1-4000-6782-4



