Reliability Assessment: A Guide to Aligning Expectations, Practices, and Performance. Daniel Daley
A comprehensive assessment of "what you have a right to expect" is different from an audit of your current reliability and maintenance programs. It is an evaluation of the effectiveness of all the elements important to reliability, viewed in the context of the inherent reliability of your current systems.
Using the example of an automobile, a concerned father may be willing to pay for the expectation of high reliability for his daughter’s vehicle by purchasing a new car for her. She has few of the other characteristics leading to high reliability (knowledge of how best to operate, maintain, or inspect it), but a new reliable vehicle can reasonably be expected to overcome those weaknesses.
Transferring the analogy to a system or piece of equipment, few of us have the luxury of replacing an item when it begins to age. In other words, we cannot “buy” reliability the way the protective father did. In most real-life cases, we have a right to expect only the level of reliability justified by our:
•Good operation
•Sound maintenance
•Thorough inspection
•Thoughtful renewal practices
In order to have a realistic assessment of “what we have a right to expect,” we must assess our expectations in light of inherent reliability as well as all other choices made over the life of the system or device.
I have always depended on the kindness
of strangers.
Tennessee Williams, A Streetcar Named Desire
The longer I remain involved in the reliability business, the more examples I find of individuals who have little or no idea how reliability works. I have started writing this chapter any number of times, only to be put off by how it sounds when I read it back. I erase what I have written and try to rewrite it in a more positive way. The rewrite sounds more positive, but it no longer adequately describes the issue. Perhaps it is best simply to begin with an apology for the negative tone of this chapter, and to emphasize that while the description sounds negative, the problem can be corrected.
Generally speaking, people take reliability for granted. As described in the quote at the start of the chapter, when it comes to providing reliable systems, many people depend on the “kindness of strangers.” They do so even when it seems foolish to do so. Although reliability saves money over the long haul, it costs more in the short-term. Systems that have a more reliable configuration (e.g., redundancy) and contain more robust components have a higher first cost. If you buy based only on first price, you should not expect that a kind manufacturer will enhance reliability by including better features at no added cost.
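The trade-off between first cost and reliability can be made concrete with a little arithmetic. The sketch below is illustrative only and is not drawn from the text; the component reliabilities are hypothetical numbers chosen to show why a redundant configuration costs more up front but fails far less often.

```python
def series_reliability(r: float, n: int) -> float:
    """Reliability of n identical components in series: all must work."""
    return r ** n

def parallel_reliability(r: float, n: int) -> float:
    """Reliability of n identical components in parallel (redundant):
    the system works if at least one component works."""
    return 1 - (1 - r) ** n

# A single pump with 90% reliability versus two redundant pumps.
single = parallel_reliability(0.90, 1)     # 0.90
redundant = parallel_reliability(0.90, 2)  # 1 - 0.10**2 = 0.99
print(f"single: {single:.2f}, redundant: {redundant:.2f}")
```

Buying the second pump roughly doubles the first cost of that subsystem, yet it cuts the probability of losing the function from 10% to 1%, which is exactly the kind of benefit a first-price-only purchase forgoes.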
The saying caveat emptor, or "buyer beware," predates the complexity of current technology by millennia. Despite that, some complex systems are purchased with minimal specifications and few if any pre-acceptance inspections. This purchasing approach is just another example of someone believing that others will look after their interests. That may be acceptable when dealing with family or friends, but it is certainly naïve in business.
Let’s be explicit:
•Many people think they can purchase any device using any method to specify it and any level of acceptance testing to ensure they receive what they expect.
•They believe the device can be operated in any manner and the reliability will be the same.
•They understand little of the relationship between maintenance and reliability.
•They believe they can change a system in almost any manner and the change will have inconsequential effects on reliability.
Or so, at least, it seems.
You may be saying to yourself, this author is exaggerating so he can make his case or sell more books. I wish I were. However, I can cite any number of examples of situations where experienced individuals have made choices that indicate they understand little about reliability.
•A major transportation company purchases equipment with the most meager of specifications — these specifications do not address standards for fasteners, materials, and components.
•A major durable good manufacturer purchases key components from third world countries without specifications for metallurgical content, heat treating, or quality control.
•A senior executive of a major oil company views regular inspection and equipment integrity programs as being voluntary and cuts them from annual budgets.
•Another senior executive at a major chemical company cuts the external paint program at a Gulf Coast plant, then professes not to understand the relationship between painting and external corrosion when there is a leak of toxic materials.
•Refinery and chemical plant executives chronically profess not to understand the relationship between emergency, short-cut repairs and continually degrading reliability of that equipment.
•Project managers for new plants, plant modifications, and new durable products chronically are allowed to develop designs without addressing reliability, availability, or maintainability requirements.
•Managers at a major transportation maintenance shop, when found to be cancelling or deferring work, profess not to understand the relationship between predictive or preventive maintenance and the reduction of equipment failures.
It is indeed unfortunate that people in key positions do the kinds of things described above, but these are all actual examples. I doubt it would sell, but it would be possible to assemble an entire volume of similar examples. If you are active in the reliability business, you probably have a myriad of examples of your own. After dealing with such examples for so long, you become cynical and come to explain these patterns in one of two ways:
1.The individuals know better, but it is inconvenient for them to admit it. Were they to admit understanding what is proper, they would have to do the right thing. Feigning misunderstanding relieves them of doing the right thing.
2.It is possible for dumb people to rise to positions of responsibility and authority. Maybe a kinder way to say this is that people in positions of authority are intelligent enough to know how to deal with everything else, but reliability is too complex for them.
I personally have a difficult time accepting the second explanation. I believe it is the responsibility of reliability professionals to do everything in their power to inform individuals in responsible positions about reliability and how it works, so they can no longer hide behind naïveté. There should be no excuse for it.
In addition to performing the actions needed to address reliability needs, reliability professionals need to find the ways to articulate issues so decision makers understand the impact of their choices. Clearly, there are times that decision makers need to make difficult choices in the face of overwhelming current business needs. On the other hand, they should not make such choices in a vacuum.
They need to understand both the immediate risks that those choices introduce and the long-term lifecycle costs they will cause.
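The point about choices made "in a vacuum" can be framed as a simple expected-cost comparison. The figures below are hypothetical, chosen only to illustrate the structure of the decision: cutting a program produces a visible short-term saving while quietly raising the probability-weighted cost of failure.

```python
def expected_annual_cost(program_cost: float,
                         failure_prob: float,
                         failure_cost: float) -> float:
    """Annual expected cost = money spent on the program plus the
    probability-weighted cost of a failure in that year."""
    return program_cost + failure_prob * failure_cost

# With an inspection program: spend $100k/yr, failure chance 1%.
with_program = expected_annual_cost(100_000, 0.01, 5_000_000)     # 150,000
# Program cut: "save" $100k, but failure chance rises to 8%.
without_program = expected_annual_cost(0, 0.08, 5_000_000)        # 400,000
print(f"with: ${with_program:,.0f}  without: ${without_program:,.0f}")
```

Put this way, the budget cut that looks like a $100,000 saving is, under these assumed numbers, a $250,000 increase in expected annual cost. That is the kind of articulation reliability professionals owe to decision makers.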
Each and every step in the lifecycle of a system has an activity that can result in enhanced reliability or, conversely, an inaction that can result in reduced reliability. I find that there are a great many analogies between the lifecycle of an electrical or mechanical system and the life of a human being. In the same way that poor life style choices can impact the life of a person, poor choices can reduce reliability and shorten the life of a physical system.
If individuals choose to smoke for a portion of their lives, it is possible to stop smoking and minimize the negative impact. But it is impossible to completely reverse the effects of