Integration isn't new: It's been the game plan of IC vendors since "back in the day." At first, it was done the obvious way by adding direct support functions around a core, such as providing digital I/O buffers and analog signal buffers to bare-bones ADCs and DACs.
This was hardware integration, plain and simple. You just looked at the block diagram and you knew what the integration was and what it could do for you. Sure, there was software, but it was limited to set-up, initialization, and defining thresholds and limits; it didn't establish the basic operating functions.
But that was then, and times have certainly changed. Integration now takes a new approach: using software to tie together different ICs into an application-specific set. It's taken me quite a while to accept software-based integration.
Two examples make this difference clear. Texas Instruments recently announced a programmable DC arc-detect reference solution, the RD-195. It comprises standard, available components: a 16-bit ADC, two different op amps, and a Piccolo microcontroller with application code. You can edit and tailor the specifics to the kind of arc you are concerned about (there are many different ones, depending on the installation) and to the noise environment, minimizing false detections while still ensuring true positives are caught; you can also trade off some performance attributes for power consumption. To a large extent, it's the code that makes this set what it is.
At the other end of the spectrum is the Multi-Input Isolated Analog Front End (AFE) reference design from Maxim Integrated Products (designated Cupertino, MAXREFDES5#), targeted at industrial control and automation applications. If you look at the block diagram of this offering, once again you'll see standard ICs linked together to form a design tuned to an application's needs. But unlike the TI offering, this design pretty much functions on its own. Sure, it needs a processor or FPGA to take the data and make sense of it, but the AFE function is largely defined by the ICs themselves, not by software.
I'm still not 100 percent comfortable with software-defined integration, but I know it's the wave of the present and the future. Yes, I'm an old-fashioned analog type of guy. I like knowing that the vendor can put the ICs on the table with their data sheets and walk away; you can study the details, see what it all does, and that's that. There's no worrying about functional possibilities being defined in software, no concern that they will be redefined later, and no fear that the next software revision will be the one that really, finally gets all the ICs to play together. Hardware has a kind of certainty I like: the IC block diagrams and schematics tell the tale.
But I also realize that the apparent weakness in software-based functionality is also a strength. It's flexible, it's adaptable, and it's configurable far beyond anything that hardware allows. It can adapt to market needs and application redefinition even after the product ships or is installed.
So I'll have to go with the flow and get used to it. But are you OK with software-defined integration? Or does it leave you a little uneasy? And if so, why?