Thursday, 29 August 2013

21st Century Process Plant Design

I've just heard that Elsevier's board have approved my proposal to write a new textbook for the IChemE on Process Plant Design, which will be called "21st Century Plant Design".

I'm quite excited; it will be the first book I have written.

Wednesday, 28 August 2013

Training: Water and Wastewater Treatment Plant Operation and Maintenance

I've just confirmed that I'm going to Doha in a couple of weeks' time to deliver my Operation and Maintenance of Water and Wastewater Treatment Plant course for a new partner company.

It'll be nice to get away from the murky weather we've been having here of late.

Sunday, 11 August 2013

Expert Witness: Water Engineer: The Meeting of Experts

I'm getting the reports of the other expert witnesses in the ongoing sewage treatment case next week, in preparation for the "Meeting of Experts". 

The "meeting of experts" is probably more important than a trial, (as most cases do not go to trial, and handled badly they can create grounds for legal action against an expert witness).

Expert witnesses are no longer immune from negligence claims, as we were before a recent Supreme Court ruling, so these meetings are a delicate matter.

It isn't just a question of being a technical expert - less-than-professional behaviour in such meetings has resulted in negligence claims being made against experts.

I've done a few "meetings of experts" now, but this will only be my second since the Jackson Reforms of April this year. 

Sunday, 4 August 2013

To Forgive Design

I've finished Petroski's "To Forgive Design", and very useful it was too in clarifying some ideas we share about the nature of design. To quote the book in summary of its content:
“It is imperative that the realistic prospect of failure be kept in the forefront of every engineer’s mind.”
The history of technology and engineering is littered with failures. When designers ignore this, and focus instead on past successes, they become complacent. Over a cycle of roughly thirty years, designers as a group forget the limitations of a novel design technique as its use becomes commonplace, and the technique is pushed beyond its limits, leading to failure.

We learn from our life and professional experience about the limitations of technology, and we retain that information as concepts, to be applied by analogy or metaphor. A history of failures, told through specific and concrete examples, is instructive to the designer, because most of the causes of failure in the real world do not exist in the mathematical world of "engineering science". These causes undercut the assumptions which the theoretician must make to allow mathematical analysis. They may be divided into hardware and software failures, but most contain aspects of both the known and unknown limits of designs and the uses to which they are put.

Failures are mostly founded in phenomena which are known about, but not well understood - the known unknowns. Our lack of scientific understanding is reflected in the design codes and margins of safety we apply to an always incomplete mathematical model. Knowing the extent to which one is operating beyond one's knowledge should inform a decision about suitable safety margins, but complacency and pressure from management conspire to remove the margins of safety which should reflect the designer's lack of knowledge. Pressure from risk-tolerant management on risk-averse engineers is commonplace, as a recent TCE article by Mohan Karmarkar discusses.
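
To put a number on what I mean by margins reflecting ignorance, here is a minimal sketch of the principle and nothing more; the knowledge categories, factor values and flow figure are my own illustrative assumptions, not values from any design code or from Petroski.

```python
# A minimal sketch of the principle that the design margin should grow with the
# gap between the duty and what is well understood. The factor values, category
# names and the flow figure below are illustrative assumptions only.

def design_capacity(best_estimate_load: float, knowledge: str) -> float:
    """Apply a larger safety margin the less well the duty is understood."""
    margins = {
        "well-characterised": 1.1,   # long, reliable operating records
        "typical":            1.25,  # code-of-practice style allowance
        "novel / sparse data": 1.5,  # working well beyond proven experience
    }
    return best_estimate_load * margins[knowledge]

if __name__ == "__main__":
    load = 120.0  # m3/h, best-estimate peak flow (illustrative)
    for k in ("well-characterised", "typical", "novel / sparse data"):
        print(f"{k:>22}: design for {design_capacity(load, k):.0f} m3/h")
```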

The decision about whether and what to build is not made by engineers - cost, risk, economic, social, aesthetic and political considerations may dominate these decisions, and be fed back to designers as limitations on scope, budget, programme etc.  People get the design which they are willing to pay for.

Materials in the real world do not always meet specification, and testing protocols intended to reject non-compliant inputs may be compromised. Similarly, construction and commissioning are never carried out exactly as the designer intended, operation and maintenance "develops" away from the approach given in the O+M manual, and the plant may well come to be used for purposes other than those for which it was designed.

In recent discussions I have had about plant design, a few ideas have come up. They remind me by analogy of other failures which I have witnessed in the wider world, as does the process Petroski describes of irrational exuberance growing until catastrophe ensues.

These wider failures are referred to as "bubbles", or boom-and-bust cycles, and many of them match the cycle of "design bubbles" quite closely. The things which people say that tip me off, as an investor, that an asset class might be subject to a bubble are as follows:

"You can't lose"- the idea of high, uninterrupted and irreversible growth in the value of anything ignores all that we have learned of the history of economic value. Things which have a high rate of growth in value are in general more risky than those with a low rate. Human nature means that high-growth assets can very rapidly become assets whose high rate of growth is based solely on a bandwagon effect. In the 17th Century, tulip bulbs were subject to this effect, an episode now referred to as "tulipmania", most famously described in Mackays "Extraordinary Popular Delusions..."

"The old rules don't apply" -  The dotcom crash came as no surprise to those who knew about other tech-stock crashes of the past, such as railway mania, and the stock market crash of 1929 (which owed much to bubbles in the prices of then-novel electrical, radio, automotive and aviation technology companies). The idea grows that these new technologies will mean that unlimited profits are available, and that consequently old models of pricing do not apply.  

But you can always lose in any game worth playing, and 21st century people are just people, whose natural inclinations are just as they always were. Greedy overconfidence is followed by disaster, followed in turn by the forgetting of the cause of disaster, and the substitution of wishful thinking for rational analysis which restarts the cycle. Unless we act in a way which does not come naturally to humans and remember the past, we repeat it.

Another idea which has come up in several forms is that design based on computer models (models so complex that their users cannot fully understand the provenance of their outputs) is so much better than the old methods that margins of safety can be cut to the bone, and that management should not come back to operational staff to ask for debottlenecking and optimisation exercises.


Clever computer programmes devised by highly numerate engineering graduates turned finance wonks were in large part responsible for the depth of the most recent crash. Some of these programmes in fact attempted to apply versions of physical science and engineering formulae to financial calculations. Economics is not only not a science, it is irrational nonsense, founded in obviously false axioms of rational behaviour by humans as a mass. However, betting the world economy on substituting an equation from a completely unrelated field into an automated stock trading programme is less rational still.

As soon as people start telling me that they can use the shape of stock market price graphs to predict where they will go next, I smell cow. Similarly, anyone who hears claims that computers can do anything better than people, other than handling complex but essentially dumb tasks such as pinch analysis, should walk away. Computers are not creative, and engineering is a creative activity. Therefore computers cannot engineer.
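
For the avoidance of doubt about what I mean by "complex but essentially dumb", here is a minimal sketch of the problem table step of pinch analysis: purely mechanical, rule-following arithmetic of the kind computers do very well. The stream data and the 10 °C minimum approach temperature are invented for illustration.

```python
# Illustrative sketch of the "problem table" step of pinch analysis.
# All stream data and the minimum approach temperature are assumed values.

def problem_table(hot, cold, dt_min=10.0):
    """Return (minimum hot utility, minimum cold utility, shifted pinch temperature).

    hot / cold: lists of (supply T, target T, CP) with CP in kW/K.
    Temperatures are shifted by dt_min/2 so both stream sets share one scale.
    """
    shifted = [(ts - dt_min / 2, tt - dt_min / 2, cp, "H") for ts, tt, cp in hot] + \
              [(ts + dt_min / 2, tt + dt_min / 2, cp, "C") for ts, tt, cp in cold]

    # Interval boundaries: every shifted supply/target temperature, high to low.
    bounds = sorted({t for ts, tt, cp, kind in shifted for t in (ts, tt)}, reverse=True)

    # Net heat surplus (+) or deficit (-) in each temperature interval.
    surplus = []
    for top_t, bot_t in zip(bounds, bounds[1:]):
        cp_net = 0.0
        for ts, tt, cp, kind in shifted:
            hi_t, lo_t = max(ts, tt), min(ts, tt)
            if hi_t >= top_t and lo_t <= bot_t:      # stream spans this interval
                cp_net += cp if kind == "H" else -cp
        surplus.append(cp_net * (top_t - bot_t))

    # Cascade the surpluses; the largest deficit sets the minimum hot utility.
    cascade, running = [0.0], 0.0
    for q in surplus:
        running += q
        cascade.append(running)
    q_hot = -min(cascade)                            # minimum hot utility, kW
    q_cold = cascade[-1] + q_hot                     # minimum cold utility, kW
    pinch = bounds[cascade.index(min(cascade))]      # shifted pinch temperature
    return q_hot, q_cold, pinch

if __name__ == "__main__":
    # (supply T, target T, CP) for a four-stream example with assumed values.
    hot = [(180, 40, 2.0), (150, 60, 3.0)]
    cold = [(30, 160, 2.5), (80, 140, 4.0)]
    print(problem_table(hot, cold))
```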

The cleverest computer simulation in engineering is just a model, based on applied maths and science, which does not fully describe the thing being modelled, and even the best software written has around 4% errors. Having more faith in such models than in the professional judgement of a group of experienced engineers is foolish.

Petroski notes how professional researchers' focus on the very recent past, and the move from paper to electronic records, made it difficult for him to fully explore the history of failure. Mohan Karmarkar's TCE article points out that histories of chemical engineering accidents on the internet seem only to go back about 20 years, and the IChemE's Accident Database is now both out of date and hard to obtain. The only useful records of the limits of design seem to be those held in the minds of highly experienced designers, and these cannot be programmed into simulation packages.

I do, however, have sympathy with the idea that management should not come back to operational staff and ask for things on the plant to be tightened still further - not because a simulation-informed design is better, but because the belief that it is means that safety margins have already been cut in the design process.

Design starts in the mind's eye, as Ferguson notes, just as it has for thousands of years. That starting concept comes not from science or mathematics, but from the designer's ingenuity, experience and ability to see analogies.

All that has changed to allow technology to progress is that the designer's mind's eye envisages concepts based on a more sophisticated state of the art, more powerful tools have been devised to allow us to winnow good options from bad, and smarter information storage devices have been produced to allow us to better learn from our past mistakes.

Engineering is in essence the same activity as it was before it was even known by that name. Unlike the professional researcher who needs to give only the most recent references, designers can go back to the ancients, as Petroski does to great effect.

Friday, 2 August 2013

From Theory to Sharp Practice: Competitive Bidding

My intern has been learning this week about a few practical issues in pricing, as a result of sending out enquiries for a package sewage treatment unit and some associated plant for what will be the final resolution of the Indian Restaurant job. 

Non-compliant bids came back from more than half of the potential suppliers. Bids were made with less than the specified capacity, for less than the required scope, and on other than the requested terms. The offer letters from the worst offenders were full of phrases which attempted to make the potential purchaser responsible for ensuring that the offer was suitable for the duty to which they intended to put it, even though the bids were counter-offers which ignored the basis the client had asked for.

I can tell whether the alternative basis implicit in these offers is likely to be valid, but how about the guy who owns an Indian restaurant and tries to buy one of these plants directly? Is it fair and reasonable to offer him something on the basis that he is an expert purchaser?

I'm not sure that it is - I'll have to see if there have been any cases where this has been tested in court. I'm involved as an Expert Witness in two cases at the moment where this might come up as an issue. Unless these suppliers always fold before getting into the courtroom, it seems likely that these terms have been the subject of a legal ruling.

As is often the case, once we had adjusted all of the bids to reflect what had been left out, what had been the cheapest bid became the second most expensive, despite being undersized. Buyer beware!
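
The adjustment exercise itself is nothing clever: price the excluded scope back into each offer, then rank on the adjusted total. A minimal sketch follows; the suppliers, exclusions and figures are invented for illustration and are not taken from this job.

```python
# Illustrative sketch of adjusting bids to a like-for-like basis before comparison.
# Suppliers, exclusions and prices below are invented examples.

bids = {
    # bidder: (quoted price, items excluded from the offer)
    "Supplier A": (12_000, ["inlet screen", "commissioning", "control panel"]),
    "Supplier B": (19_500, ["commissioning"]),
    "Supplier C": (22_000, []),
}

# Estimated cost of supplying each missing item separately (assumed figures).
missing_item_costs = {"inlet screen": 3_500, "commissioning": 4_000, "control panel": 6_000}

# Add the cost of the excluded scope back onto each quoted price.
adjusted = {
    bidder: price + sum(missing_item_costs[item] for item in exclusions)
    for bidder, (price, exclusions) in bids.items()
}

# Rank on the adjusted figure - the "cheapest" bid rarely stays cheapest.
for bidder, total in sorted(adjusted.items(), key=lambda kv: kv[1]):
    print(f"{bidder}: quoted £{bids[bidder][0]:,}, adjusted £{total:,}")
```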

Our bid is in with the client now, and it is almost four times the unadjusted low bid received, though around the same as the highest bid received despite including a good deal of consultancy and site time, and a number of key items missing from the cheapest bids. Let's see if the client has learned from his experiences to date that he who buys cheap, buys twice....