Analog Angle Blog

Getting Over My New Integration Fears

Integration isn't new: It's been the game plan of IC vendors since “back in the day.” At first, it was done the obvious way by adding direct support functions around a core, such as providing digital I/O buffers and analog signal buffers to bare-bones ADCs and DACs.

This was hardware integration, plain and simple. You looked at the block diagram and you knew what the integration was and what it could do for you. Sure, there was software, but it handled set-up, initialization, and the definition of thresholds and limits; it did not establish the basic operating functions.

But that was then, and times have certainly changed. Integration now takes a new approach, using software to tie different ICs together into an application-specific set. It has taken me quite a while to accept this software-based integration.

Two examples make this difference clear. Texas Instruments recently announced a programmable DC arc-detect reference solution, the RD-195. It is built from standard, available components: a 16-bit ADC, two different op amps, and a Piccolo microcontroller with applications code. You can edit and tailor the specifics to the kind of arc you are concerned about (there are many different ones, depending on the installation) and to the noise environment, minimizing false detections while ensuring that true arcs are caught; you can also trade off some performance attributes for power consumption. To a large extent, it's the code that makes this set into what it is.
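To make that tuning trade-off concrete, here is a minimal, hypothetical sketch of the kind of knob such applications code exposes. This is not TI's RD-195 code; the energy metric and the `sensitivity` parameter are invented for illustration, on the assumption that arcing shows up as excess broadband noise above a calibrated floor.

```python
# Hypothetical arc-detect tuning sketch -- NOT TI's RD-195 code.
# Assumption: arcing appears as excess high-frequency energy above
# a noise floor calibrated for the specific installation.

def high_band_energy(samples):
    """Crude proxy for broadband arc noise: mean squared
    sample-to-sample difference of the ADC samples."""
    return sum((b - a) ** 2 for a, b in zip(samples, samples[1:])) / (len(samples) - 1)

def detect_arc(samples, noise_floor, sensitivity=4.0):
    """Flag an arc when high-band energy exceeds the calibrated
    noise floor by the chosen sensitivity factor. Raising
    `sensitivity` reduces false detections at the cost of missing
    weaker arcs; lowering it does the reverse."""
    return high_band_energy(samples) > sensitivity * noise_floor
```

The point is not the specific math but that the detection behavior lives in an editable parameter rather than in fixed silicon.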

At the other end of the spectrum is the Multi-Input Isolated Analog Front End (AFE) reference design from Maxim Integrated Products (designated Cupertino, MAXREFDES5#), targeted at industrial control and automation applications. If you look at the block diagram of this offering, once again you'll see standard ICs linked together to form a design tuned to an application's needs. But unlike the TI item, this design pretty much functions on its own. Sure, it needs a processor or FPGA to take the data and make sense of it, but the AFE function is largely defined by the ICs themselves, not the software.

I'm still not 100 percent comfortable with software-defined integration, but I know it's the wave of the present and future. Yes, I'm an old-fashioned analog type of guy who likes to know that the vendor can put the ICs on the table with their data sheets and walk away: you can study the details and see what it all does, and that's that. There's no worrying about functional possibilities defined via software, no concern that they will be redefined later, and no fear that the next software revision will be the one that really, finally gets all the ICs to play together. Hardware has a kind of certainty I like; the IC block diagrams and schematics tell the tale.

But I also realize that the apparent weakness in software-based functionality is also a strength. It's flexible, it's adaptable, and it's configurable far beyond anything that hardware allows. It can adapt to market needs and application re-definition even after the product ships or is installed.

So I'll have to go with the flow and get used to it. But are you OK with software-defined integration? Or does it leave you a little uneasy? And if so, why?


7 comments on “Getting Over My New Integration Fears”

  1. Steve Taranovich
    April 12, 2013

I'm with you, Bill. I've been an analog guy for 40 years, but I have seen the power of software integration in the digital power realm. At first I was a skeptical analog “weenie.” Later, I came to see the magic that software could bring to power management, like adaptive-load capabilities that tremendously improve efficiency in a power system with a time-varying load.

    The digital and software integration nowadays is very analog-user friendly with GUIs that do all the software “magic” for us analog types.

    Can't teach an old dog new tricks? Not so—-woof I say—-software is not so formidable anymore—the software guys and gals, with their propeller beanies, have made it very easy to accept. And as a bonus, they have improved and added to our design bag of tricks.

  2. Bill_Jaffa
    April 12, 2013

I understand how that would be a good idea “on paper.” But how does it play out in reality? Many designs need a complicated mix of analog parts: a low-bias input for a sensor, a higher-drive op amp for a load, some mix of A/D speeds and resolutions, and so on. That's why most of the analog vendors have so many parts in their rosters: different parts of the same application need widely differing performance in speed, input characteristics, output, bandwidth, distortion, linearity, AC specs, and DC specs. It's a jungle!

  3. Bill_Jaffa
    April 12, 2013

    You're right–but only sometimes, IMO. Your comment about the iphone is correct, of course. But I am thinking of data acquisition in medical, industrial, motion control, power measurement, temperature,  strain, optical, RF, and other transducer-based instrumentation applications.

Yes, the engineering challenge is to figure out what's good enough, but when you have a tough or very specific sensor input, the right op amp and ADC make it work, work right, and work consistently and accurately. Sure, you may be able to compensate with additional clever circuitry and maybe even software calibration and tricks, but those often make you presume and assume a lot about the situation and the real world.

The engineer has to balance many factors; sure, it's all about tradeoffs (power, performance, price) and constraints. But a good engineer also knows when insisting on the right IC for another 50 cents or a dollar can make things work properly and consistently.
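For what the “software calibration and tricks” mentioned above can look like in practice, here is a minimal sketch of a classic two-point correction. The function name and the reference values are hypothetical, and the approach assumes the sensor error is purely linear (offset plus gain), which real-world nonlinearity and drift can violate; that is exactly the kind of presumption about the real world the comment warns about.

```python
# Hypothetical two-point software calibration sketch.
# Assumption: sensor error is purely linear (offset + gain error);
# nonlinearity, hysteresis, and drift are NOT corrected by this.

def two_point_cal(raw_lo, raw_hi, true_lo, true_hi):
    """Build a correction function from two reference readings:
    the raw values observed at two known true inputs."""
    gain = (true_hi - true_lo) / (raw_hi - raw_lo)
    offset = true_lo - gain * raw_lo
    return lambda raw: gain * raw + offset

# Example: a sensor that reads 0.10 at a true 0.0 and 4.90 at a true 5.0.
correct = two_point_cal(raw_lo=0.10, raw_hi=4.90, true_lo=0.0, true_hi=5.0)
```

This recovers the endpoints exactly, but any curvature between them passes through uncorrected, which is why the right op amp and ADC up front can still beat clever math downstream.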

  4. eafpres
    April 12, 2013

    Hi guys–this isn't pure analog, but I read two articles recently that made me realize how valuable a software configurable part could be.  One was about companies figuring out how to best use TV White Space Spectrum.  A leading company is Neul, and there was an EE Times blub on them:

    Neul article

    The key line for me was “Currently, Neul has a few hundred FPGA versions of its designs, mainly being used for rural broadband trials in the U.S.”

The other article had to do with a company called Tabula introducing an FPGA with a 3D architecture, which gives it enough power to handle 100 Gb/s networking.

    Tabula article

Basically, these FPGA designs allow approaches to be fielded without huge up-front costs. If more of that becomes available, either analog or digital, it is good for everyone. Then the designs that prove more nearly optimal can move to custom parts, the market gets what it wants faster, and the companies get to market with lower investment.

    April 15, 2013

Well put, Scott. I have seen Chinese development go from cheap piece-parts to the need for integration, especially when one adds up the cost of numerous piece parts, plus stocking, plus ordering manpower, plus receiving manpower, plus the labor to load in the different parts, and so on, all versus a single higher-cost ASIC device. That's especially true when a little bit of software can do what a lot of board real estate otherwise would.

    I need to open up the programming books and get back into programming to do some product development.

  6. JeffL_#2
    April 16, 2013

It's interesting to me what inspires the development of some of these application solutions. Take the DC arc detect, for example; I happened to have some background in the problem. The driver in financing the search for answers was largely the FAA's dissatisfaction with the ability of circuit breakers on commercial airplanes to prevent in-flight fires on circuits that happened to contain wires with Kapton insulation. As I understand it, that insulation was installed on aircraft only for a brief time back in the '80s before the problem was identified, but once the wiring was in the field it was next to impossible to even locate the problem circuits, let alone come up with a cost-effective way to rewire them. Of course, now that an answer is available, I'm sure it will find other markets. But in fact it's largely a response to what was basically a design error in the first place, which ought to serve as a reminder that an issue as simple as specifying the insulation on a piece of wire can have long-lasting ramifications.

  7. Brad Albing
    April 19, 2013

    Steve – that describes my work + learning curve too. I've seen how digital power can make a supply that has excellent response to changing loads, so now I'm a believer.
