"Away from goodness" in engineering?

This is not a car question. It’s more of an “engineering culture” question. And I don’t mean the “safe” arenas where things are well-known, tried and true, and within known bounds. I mean “pushing the envelope” areas like launching spacecraft - where people are still trying to work things out. Where building and operating stuff is still somewhat experimental, sitting between some set of knowns and other sets of “in theory, uh, we think…”

I just ask this rather off-topic question here because I know there are quite a few savvy and experienced engineering types around.

The simple-ish question: back during the space shuttle days, people were pushing the envelope with many things (and not just with sealing solid rocket booster segments). Many of those things were not particularly well understood - somewhere between the known and the theoretical. (Well, and the complexity of the entire system, taken as a whole, was rather gnarly.)

One phrase, used by at least some of the shuttle engineers, that captured the ambiguity and uncertainty was, more or less, “away from goodness” or “toward goodness.”

I’m just wondering whether that is a typical engineering-culture thing or more of a space shuttle program engineering-culture thing. (And maybe even just a Morton Thiokol thing, from working with the solid rocket boosters.)

If you search the phrase online, virtually all of the results returned are space shuttle Challenger related. But if it’s just an informal thing in engineering culture, I don’t find that too surprising. It’s not going to make the textbooks or whatever. It would only have become famous in that context because of the tragedy. So, you know, just looking for the insights of those who know engineering culture.

I think you’ve got the whole culture at NASA wrong. What they do is filled with risk and they try very hard to reduce that risk. Why do you think things are so expensive?

But NASA is overseen by a bunch of politicians in Congress who not only don’t understand, but answer to people who don’t understand, aren’t interested, and only look at how much money is being spent.

In spite of that, NASA has had many successes: the Mars rovers, Voyager 1. These things lasted way, way longer than they were designed to. Unfortunately, they have also had some spectacular failures.

But to address your question: engineers report to people who are running companies whose purpose is to make money. This necessitates compromises. None of them purposely designs or operates things that are dangerous, but the trick is balancing risk vs. reward. The biggest problem is that the way the world works is only partially understood, and there is always a risk that things will go wrong.

We do the best we can within the bounds we are given.

2 Likes

Cost is always a consideration; one has to come up with the best answer/design within that limit. John Glenn said something about his nerves from ‘knowing every part on the Mercury rocket had been made by the lowest bidder’. That’s why standards and reviews are important.

The Challenger disaster resulted from management failure to listen, not from engineering.

1 Like

I agree with the others. Seems like a NASA thing. That said, the “away from goodness” phrase sounds like “manager speak” to me.

“Manager speak” is the language engineers use to explain their design progress to people who consider calculating tips to be difficult math or who have not had a science class since freshman year of high school. Dumbing the engineering down so even a labradoodle could understand it. In NASA’s case, this would be politicians. In the private sector, this might be the director of human resources, legal or sales.

NASA has its own culture, which likely differs from that of for-profit companies. The culture of any business, let alone any engineering department, varies quite a bit.

I worked in an industry concerned about safety - cars - on products that did not compromise safety but improved the customer experience - fancy shock absorbers. The culture was one of free innovation because failure of the product was not likely to kill someone.

My next company made products for an industry that saw people injured nearly every day - forklifts. Innovation was not free at all. It was very constrained, because we took very seriously the potential that our machines could hurt or kill someone. This created a very conservative culture. The company avoided being the first to offer an innovation; they were second to market, with an improvement on their competitor’s innovation.

Most of the instruments and satellites that NASA works on are onesies. If there is a failure, then the mission is degraded. There are no second chances. The engineers want to hand over a fully functioning system once it arrives on station so that the scientists have several years of mission operations to study the intended phenomena.

The high cost isn’t so much in construction as it is in testing. Each subsystem is tested many times as it is built and integrated into the final product. IMO there is also extra design, construction, and test evaluation because of Congress. It’s fun and profitable for Congressmen to destroy program managers’ lives when things go wrong. The extra work is risk mitigation to ensure that delays in never-done-before systems don’t destroy the careers of NASA’s lower-level program managers.

Since this started by mentioning the Challenger - it was launched against the advice of the engineers:
The night before the launch, Bob Ebeling and four other engineers at NASA contractor Morton Thiokol had tried to stop the launch. Their managers and NASA overruled them.

To my knowledge, no NASA manager or political operative ever confessed why the engineers were overruled.

1 Like

“Away from goodness” - is that an actual statement used by anyone?

I’ve worked in quite a few industries and corporate cultures, designing everything from industrial controls to medical electronics to stuff launched into space. I worked closely with engineers at both Lockheed and NASA. I even got to crawl around in the full-fidelity mockup of the shuttle where they train the astronauts. I never heard them use that phrase, nor have I heard it used elsewhere. A more common alternative is a reference to the 4 quadrants of knowledge - the 4th being: what we don’t know, we don’t know :slight_smile:

One thing not discussed much, and perhaps somewhat hidden from public view, is all the testing that goes into these programs. More like baby steps than grand risk-taking adventures. There is always the unknown, but they do all kinds of testing to try to understand the environment before they put people in a capsule and hurl them into space (for example). :grinning:

The statement of work for a program I was involved in came in on a pallet truck…

I don’t remember the exact situation anymore, but I believe the problem was that the temperature dipped lower than the engineers were comfortable with. Had it not been so cold, the O-ring would have sealed and there would have been no problem. Then throw in the less robust design in the first place, which maybe didn’t have as great a range of reliability as it could have.

At any rate, when the vendor is on the line to perform and you have to decide to go or not go based on theory instead of facts, it’s a tough spot to be in. So do you have faith in the engineers or not when making the big decision?

I think what really came out of the whole thing was that the leadership structure was prone to failure and to pushing the envelope beyond wisdom - a business model that sooner or later was bound to fail.

The engineers said ‘don’t go, it’s too cold’. They were overruled.

3 Likes

Thanks. That’s pretty much the direct answer I was asking for. I just do some work in the 4th quadrant along with the 3rd (what we know we don’t know). The shuttle SRB joints were some of both.

And LOL, I should know better, OF COURSE. But I wasn’t trying to start up a convo on the Challenger disaster. I’m quite well-versed on it. The “engineers were overruled by management” line is not technically wrong in any way. But it does greatly oversimplify the larger context of the SRB program history, the decision-making procedures used in launch decisions, and the famous night-before-the-launch teleconference where the engineers were (sort of) overruled.

It’s likely that the best account reconstructing what went down is Diane Vaughan’s, although the use of the “deviance” term is a little bit dicey even by her own account. But it is still the most common way that people think of it, whether they think the deviance was on the engineering or management side or both.

And if you go by the Rogers Commission finding (the official presidential one), it was BOTH engineering and management failure.

And yes, @VOLVO-V70, “away from goodness” was an actual phrase used. I just didn’t know if it was idiosyncratic to the case or was a more general thing. That’s what I was asking about.

2 Likes

I never heard that phrase in any of the Silicon Valley tech companies I was involved with. “Risk management” was the more common pertinent phrase, with “schedule risk” probably next in line. NASA and plain-jane tech product development companies can’t really be compared - apples and oranges. Commercial (passenger) big-tech airplane development is closer.

It more or less comes from risk that can’t be quantified (yet?). Engineering risk analyses are generally pretty hard-nosed and quantitative. But when you push the envelope you can end up in ambiguous spaces. The shuttle program was like that in many respects. The whole “temperature and O-ring” thing for the shuttle solid rocket boosters (SRBs) couldn’t be quantified on the night before the Challenger launch. The language used (and I think used generally in the SRB program) was sort of like “we can’t say for sure, but we know that colder is away from goodness”…
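
Just to make the “couldn’t be quantified” point concrete, here’s a minimal sketch (Python, with purely made-up intercept and slope values, not anything fitted to real flight data) of the kind of curve a quantitative risk analysis would have produced if the data had existed: probability of SRB joint distress rising as launch temperature falls. “Colder is away from goodness” was essentially the qualitative stand-in for a curve like this that nobody could defensibly draw the night before the launch.

```python
import math

# A hypothetical logistic curve relating launch temperature to the chance of
# SRB joint distress. The intercept and slope are illustrative assumptions,
# NOT values fitted to real flight data.
def p_distress(temp_f, intercept=10.0, slope=-0.17):
    """Assumed model: probability of joint distress at a given launch temperature (deg F)."""
    z = intercept + slope * temp_f
    return 1.0 / (1.0 + math.exp(-z))

# 53 deg F was the coldest prior shuttle launch; Challenger went at roughly 36 deg F ambient.
for t in (75, 65, 53, 36):
    print(f"{t:>3} F -> illustrative distress probability: {p_distress(t):.2f}")
```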

1 Like

I saw a movie/fictional documentary about that incident. It said there was at least one engineer at MT who objected to the planned launch b/c of the ambient temperature. He definitely wasn’t ignored, but he couldn’t say “no” with enough certainty, so the decision to go ahead w/the launch eventually won out. Safety vs project objectives is always a battle in high-risk projects like that. What would you have said if asked whether to continue w/the Apollo 11 moon landing, given that the lander was higher than planned, going too fast, and the landing computer was spewing error messages? It’s a good thing I wasn’t making the decision, b/c with all 3 problems, I wouldn’t have been able to say “go ahead”.

Well, pick a subject. If your company’s contract is on the line, and your reputation, your own uncertainty, your job, your career, and pressure from the bosses are all in the mix, you might say go anyway. Sometimes you are right and sometimes not. Raise your hand if you’ve never pressured someone to give you the answer you need. I’ve seen it both ways. It takes a lot of guts to go up against an institution. Then along comes a boss who says let me make the decision and take the heat so you don’t get blamed. Organizational culture.

It probably wasn’t fictional. But if there is one, I’d like to know of it. There are umpteen documentary treatments - some better than others. The “one engineer” was probably Roger Boisjoly, but generally speaking, once the MT engineers saw the weather data they recommended against launch. But they, in fact, could not quantify their concerns. It’s complicated.

The real “mistake” was probably trying to treat the shuttle program as “routine.” So routine, in fact, that they were sending the “school teacher in[to] space.” It should have remained what space missions generally have been: risky things where the people put at risk are professional astronauts who understand the risks. The “mistake” was not about temps and O-rings.

Witness the latest crew of the recently aborted (for now) Boeing Starliner launch. Two seasoned veteran astronauts who have spent their careers putting their lives on the line and knowing the risks. It’s almost 40 years later and we still don’t know if something will make one of those things blow up. And it’s not because engineers are stupid. It’s because there’s a lot we don’t know - some of it we know we don’t know, and some of it we don’t even know we don’t know. (Thanks again @TwinTurbo.)

2 Likes

Very relevant to the shuttle program. But one of them was going to blow for some reason, no matter what. (And, in fact, two of them did).

When you talk about “space” programs, they are politically initiated. The engineers suggested robotic space exploration for cost, for safety, and also because humans are incapable of surviving anywhere off of this planet, but the politicians said the public would not put up with the costs of space exploration unless human astronauts were involved.

The decadal survey produced by the National Academies of Sciences, Engineering, and Medicine is the starting point for setting NASA’s priorities. Scientists read the survey and try to tailor their research proposals to it. NASA then reviews the proposals and decides which ones to fund.

1 Like

I’ll just say that I think everything involving public money and the need for national cohesion is political. Now, I remember sitting in front of the TV and watching the horror of the Russian launch of Sputnik. Then JFK, sensing the public mood, launched the mission to the moon. It was political, but it was also national leadership and national goal setting. He even had high school kids taking 50-mile hikes to get in shape. That was leadership, political or not. I don’t think it was the result of any science report, because it was an unheard-of thing to do.

1 Like