Reliability 2.0… Cliff Notes

This year's Reliability 2.0 meeting in Las Vegas was a great event, with a bit of bowling, gambling, and interesting discussions on topics like Monte Carlo simulation (which seems like a fitting topic for Las Vegas). Reliability 2.0 2014 included two papers authored and co-authored by Isograph employees. Dr. Gabriel Lima (Isograph Partner) presented a paper titled "Practical Models for Decision-Making in Asset Management". As an economics professor, Gabriel was able to give a unique point of view on what motivates asset management strategies: factors such as increased reliability, improved availability, reduced unit cost, reduced risk, and reduced failure frequency. Below is a photo from Gabriel's three-hour training course. If you are interested in the course notes, please contact me at jhynek@isograph.com.

[Photo: Gabriel's training course at Reliability 2.0]

 

Dr. David Wiseman, a die-hard Liverpool soccer fan, nuclear physicist, and Isograph employee, also presented a paper titled "Monte Carlo Simulation as an Aid to PM Optimization". In this paper David clears up the common misconception that massive amounts of data are needed to run a simulation to optimize your PM intervals. For a basic simulation, all that is needed is the Mean Time To Repair (MTTR), the cost of the PM, and less than a week's worth of test data. Using this data, a recommended PM interval and cost-benefit ratio are easily calculated.

[Slide from David's presentation]
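To make that concrete, here is a minimal sketch of the kind of calculation David described. The Weibull parameters, costs, and MTTR below are made-up illustrative numbers, not figures from the paper, and the loop is a deliberately simple renewal simulation rather than Isograph's own engine; it just compares the average cost per operating hour of various PM intervals against running to failure.

```python
import random

# Assumed illustrative inputs -- none of these numbers come from the paper.
BETA, ETA = 2.5, 1000.0            # Weibull shape / scale (hours) fitted from test data
MTTR = 8.0                         # mean time to repair (hours)
COST_PM, COST_CM = 500.0, 5000.0   # cost per PM / per corrective repair
DOWNTIME_COST = 200.0              # cost per hour of unplanned downtime
MISSION = 50_000.0                 # simulated operating horizon (hours)

def cost_per_hour(pm_interval, runs=200):
    """Average total cost per operating hour for a given PM interval."""
    total = 0.0
    for _ in range(runs):
        t, cost = 0.0, 0.0
        while t < MISSION:
            life = random.weibullvariate(ETA, BETA)   # time-to-failure of a fresh item
            if life <= pm_interval:                   # item fails before the PM is due
                t += life
                cost += COST_CM + MTTR * DOWNTIME_COST
            else:                                     # PM renews the item first
                t += pm_interval
                cost += COST_PM
        total += cost / MISSION
    return total / runs

run_to_failure = cost_per_hour(float("inf"))          # baseline: no PM at all
candidates = range(100, 2001, 100)                    # PM intervals to try (hours)
best = min(candidates, key=cost_per_hour)
print(f"Recommended PM interval: ~{best} h")
print(f"Cost-benefit ratio vs. run-to-failure: {run_to_failure / cost_per_hour(best):.2f}")
```

In a real study you would increase the number of runs to smooth out the Monte Carlo noise and fit the failure distribution from your own test data, but the shape of the question stays the same: which interval minimizes cost per hour, and how much does it save compared with no PM at all?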

We all walked away from the conference with minor gambling losses and a better idea of how to approach our asset management models. For additional information, please contact Isograph: 949 502 5919, jhynek@isograph.com, www.isograph.com

Tech Tuesday: Quantitative LOPA with FT/ET

Howdy, folks. As Jeremy has mentioned, this past Friday, April 4th, we hosted a webinar to demonstrate Isograph's FaultTree+ tool. One of the topics we discussed was how you can use the Fault Tree and Event Tree features of the tool to perform a quantitative Layer of Protection Analysis (LOPA). This post will serve as a little summary of that meeting for anyone who was unable to attend.

The first stage of a LOPA might be done outside a quantitative tool like Fault Tree. The first thing you'd want to do is identify hazards, determine an acceptable risk level for each, and ask what you're doing to mitigate them; this stage has more in common with a HAZOP study. Once you've identified your hazards and the protection layers against them, the next step is to quantify them. How often will each hazard occur? How effectively will our layers of protection mitigate its risk? Can we objectively rank these risks? This sounds like a job for Fault Tree and Event Tree analysis.
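Before quantifying anything, it can help to capture that first stage as structured data. The sketch below is just one way of recording the output of the hazard-identification step; the field names and the example scenario are invented for illustration and are not a FaultTree+ data model.

```python
from dataclasses import dataclass, field

@dataclass
class LopaScenario:
    hazard: str                      # what can go wrong
    tolerable_freq: float            # acceptable frequency (events/year)
    protection_layers: list = field(default_factory=list)  # what mitigates it

# Illustrative example only -- not from the webinar.
scenario = LopaScenario(
    hazard="Overpressure of storage vessel",
    tolerable_freq=1e-4,
    protection_layers=["High-pressure alarm", "Relief valve"],
)
```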

A Fault Tree can very easily be used to quantify a hazard; in fact, that's the primary use of the method. By coupling it with an Event Tree, we can find out how well that hazard is mitigated by protection systems. If you're not familiar with it, Event Tree analysis is closely related to Fault Tree analysis and uses a similar quantitative calculation. The difference is that, while Fault Trees examine the failures leading up to a hazard, Event Trees examine the consequences following it. Coupled together, the two are sometimes called a "bowtie" model.
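To make the arithmetic concrete, here is a back-of-the-envelope version of that coupling. The gate structure, failure rates, and probabilities of failure on demand (PFDs) below are invented for illustration, not output from FaultTree+: the fault tree side estimates the frequency of the initiating event, and the event tree side multiplies that frequency by the PFD of each independent protection layer to get the frequency of the worst-case, unmitigated outcome.

```python
# Fault tree side: frequency of the initiating hazard (per year).
# Assumed structure: (pump trip AND operator fails to respond) OR control loop failure.
pump_trip_freq       = 0.5    # events/year (assumed)
operator_fail_prob   = 0.1    # probability, given a trip (assumed)
control_loop_failure = 0.05   # events/year (assumed)

initiating_event_freq = pump_trip_freq * operator_fail_prob + control_loop_failure

# Event tree side: each independent protection layer either works or fails,
# with a probability of failure on demand (PFD). The hazard only propagates
# to the final consequence if every layer fails.
layers = {
    "relief valve":                  1e-2,   # PFD (assumed)
    "safety instrumented function":  1e-2,   # PFD (assumed)
}

mitigated = initiating_event_freq
for name, pfd in layers.items():
    mitigated *= pfd               # multiply through the failed branch of each layer

tolerable_freq = 1e-4              # assumed corporate risk target (events/year)

print(f"Initiating event frequency : {initiating_event_freq:.3e} /yr")
print(f"Mitigated outcome frequency: {mitigated:.3e} /yr")
print(f"Meets target of {tolerable_freq:.0e} /yr? {mitigated <= tolerable_freq}")
```

A full event tree would enumerate every branch (each combination of layers working or failing), but the worst-case branch is usually the one you compare against your tolerable frequency.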

Yeah, it's cool. Bow ties are cool.
