Here’s something I noticed a while back: how often our problems boil down to optimizing the peak-to-average ratio. This might be flirting with madness, but now I see this dilemma popping up in other aspects of life.
I should explain. Let’s say we’re selecting components for a power supply design. You have nominal input conditions and output requirements. For argument’s sake, let’s say the design typically requires 10 watts. That’s what you want to design for, and if you hit it exactly, you’re not wasting money by providing capacity that is rarely used. Here’s the problem. Suppose every Thursday when the moon is full, the load demands 14 W. If you design for 10 W and are asked to deliver 14 W, the power supply is under stress and might fail. An engineer developing a reputation for creating reliable designs would fight that battle and pay the extra price for circuitry that supports 14 W, and might even go further. If you know you might sometimes need 14 W, you might design to support 15 W to anticipate cases beyond what is expected. Good engineer. Reliable design. The boss says, “You spent $0.15 when you should have spent $0.10. You’re a bad engineer.”
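The arithmetic in this thought experiment fits in a few lines. A minimal sketch, with the cost-per-watt figure back-derived from the $0.10-versus-$0.15 example (it’s purely illustrative, not real component pricing):

```python
# Illustrative numbers from the power-supply thought experiment.
nominal_w = 10.0   # typical load the design sees most of the time
peak_w = 14.0      # full-moon-Thursday load
design_w = 15.0    # chosen capacity, with headroom beyond the known peak

# The peak-to-average ratio: how much capacity must sit idle most days.
par = peak_w / nominal_w

# Assumed $0.01 per watt of capacity, so 10 W costs $0.10 and 15 W costs $0.15.
cost_per_watt = 0.01
nominal_cost = nominal_w * cost_per_watt
design_cost = design_w * cost_per_watt

print(f"peak-to-average ratio: {par:.2f}")                      # 1.40
print(f"cost premium for reliability: ${design_cost - nominal_cost:.2f}")
```

The boss’s complaint is the last line: a $0.05 premium per unit that the average day never uses. The engineer’s answer is the first ratio: the peak, not the average, decides whether the design survives.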
Then you have to account for all the other variables you can capture. What are the environmental extremes? What are the input extremes? What else can go wrong?
To me, the tradeoffs boil down to how much it’s worth to achieve excellence. Every company will tell you it wants to be excellent, but is it mature enough to pay for it? To carry on with the power supply thought experiment, I’ve worked with the biggest semiconductor company in the world, and it drew an interesting conclusion from studying the peak-to-average ratio. It wanted the safety of the 15 W rating but then said: Look at the capability between 10 W and 15 W. Why aren’t we taking advantage of the excess capability? What design strategy gives us more access to capacity we’re paying for but not using? Maybe those guys are smarter than they look.
How does this analysis play out in other areas? For example, in the engineering lab, you might have a piece of equipment that gets used once a year. From the customer’s perspective, when a problem or test requirement pops up, having the tool on hand and readily available means you don’t have to hunt for it, beg someone to loan it to you, rent it, or whatever. There’s an even worse scenario, and that’s when the engineer doesn’t want to face the hassle and skates around the issue. A company should never make it hard to do a good job.
Imagine taking your car to the shop. If it needs a specialized tool to do something, you won’t notice how rarely the tool is used. You don’t care. You’ll notice the quick response to solving your car’s problem. That’s excellence.
I expand this to managing people. You might have an adequate employee who is perfectly capable of handling the normal day-to-day job, but what happens when you’re faced with an extraordinary challenge? To be excellent, underutilize the talent on a day-to-day basis so that, when the crazy thing happens once a year, you’re able to deal with it expertly. The MBAs won’t get this, because it looks bad on paper. You could have hired an employee for $50,000 a year, but you’re paying $100,000. Bad manager.
I’m not saying we should throw money away. There is an art to striking a practical balance in the peak-to-average ratio.
Once you notice yourself grappling with the peak-to-average ratio, you’ll see it pop up everywhere.
How excellent do you want to be? Optimize the peak-to-average ratio to get there.
What are your thoughts? Set me straight in the comment section.