The pace of change in technology, especially in electronic systems, is so rapid and relentless that we rarely get a chance to pause and look at the big picture. We have experienced such a cascade of smart, mobile, cloud-enabled products in recent years that the longer-term patterns in design are not always clear. It is worthwhile, however, to look briefly at the longer arc of history in electronic design, from the emergence of radio and telephone technology to today, and to anticipate the spread of machine learning and artificial intelligence into our daily lives.
At the risk of oversimplifying a rich tapestry of invention, productization, economic transformation and dead-end developments, we discern three waves of essential electronic design, and the onset of a fourth, as shown below. Each successive wave does not replace the prior dominant design technology, but builds on top of it.

The first wave is analog circuits, starting with the vacuum tube technologies found in early radios, televisions and radar in the 1930s and 40s, then fully leveraging transistors as they came along, first as discrete devices, then in ICs. Today, analog circuits remain crucially important in electronic design, with increasing IP reuse as a basic design method for leveraging analog expertise.
The second wave, naturally, is digital design, fully emerging in the 1960s with discrete transistors and then TTL components. In the VLSI era, design transitioned to RTL to gain productivity, verifiability, portability and ease of integration in systems-on-chip. Today, large fractions of the digital content of any design are based on IP reuse, as with analog circuits. The remarkable longevity of Moore’s Law scaling of cost, power and performance has driven digital designs to extraordinary throughput, complexity and penetration in our lives.
The third wave – processor-based design – really started with digital computers but became a widespread force with the proliferation of the microprocessor and microcontroller in the late 1970s and 1980s. The underlying digital technology scaling has allowed processors to grow roughly one million fold in performance, enabling the explosion of software that characterizes the processor-based design wave. Software has moved inexorably from assembly language coding, to high-level languages and optimizing compilers, to rich software reuse in processor-centric ecosystems, especially around specific operating systems, and to the proliferation of open-source software as a major driver of cost reduction, creativity and standardization in complex software systems.
We are now on the cusp of the fourth wave – cognitive computing. The emergence of large data-sets, new hardware and methods for training complex neural networks, and the need to extract more insight from ambiguous video and audio have all helped drive this fourth wave. It will not replace the prior three waves – we will certainly need advanced design capabilities in analog, digital and processors-plus-software, but these will often be the raw building blocks for constructing cognitive computing systems. And even when deep learning and other cognitive computing methods form the heart of an electronic system, these other types of design will play complementary roles in communication, storage and conventional computing around that cognitive heart. The recognition of the power of cognitive computing is a very recent development – deep neural networks were an obscure curiosity four years ago – but we can anticipate rapid development, and perhaps dramatic change. In fact, it seems likely that many of today’s hot network structures, training methods, data-sets and applications will be obsoleted several times over in the next ten years. Nevertheless, the underlying need for such systems is durable.
Archeologists understand that the proliferation, economics, and even culture of a community are often driven by the characteristic tools of the group. The variety and usefulness of electronic systems are inevitably coupled to the availability of design tools to rapidly and reliably create new systems. In the figure below, we show a few of the key tools that typify design today in the analog, digital and processor-based layers.

The cognitive computing community fully appreciates the need for robust, easy-to-use tool environments, but those emerging tool flows are often still crude and rarely cover the complete design and development cycle from concept and data-set selection to deployment, verification and release. It seems safe to predict that major tool categories will cover training, network structure optimization, automated data curation (with labeling, synthesis and augmentation), and widespread licensing of common large data-sets. In addition, we might expect to see tools to assist in debug and visualization of networks, environments for debug and regression testing, and new mechanisms to verify the accuracy, robustness and efficiency of trained networks. Finally, no complete system or application will consist of a single cognitive engine or neural network – real systems will comprise a rich mix of conventionally programmed hardware/software and multiple cognitive elements working together, often distributed across the physical environment, with some elements close to myriad sensors and others deployed entirely in the cloud. We can easily see the eventual evolution of tools and methods to manage those highly distributed systems, perhaps relying on data flows from millions of human users or billions of sensors.
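To make two of those tool categories concrete, training and verification of accuracy, consider the deliberately tiny sketch below. It is written in plain Python with NumPy, is not tied to any real toolchain or framework, and all names and numbers in it are illustrative: it trains the simplest possible classifier on a synthetic labeled data-set and then checks its accuracy on a held-out split, the same two steps that production cognitive-computing tool flows must automate, scale and instrument.

import numpy as np

rng = np.random.default_rng(0)

# Synthetic labeled data standing in for a curated data-set:
# two Gaussian clusters in 2-D, labeled 0 and 1.
X = np.vstack([rng.normal(-1.0, 1.0, size=(500, 2)),
               rng.normal(+1.0, 1.0, size=(500, 2))])
y = np.concatenate([np.zeros(500), np.ones(500)])

# Hold out part of the data for verifying accuracy after training.
idx = rng.permutation(len(X))
train, val = idx[:800], idx[800:]

# The smallest possible "network": a single logistic unit.
w = np.zeros(2)
b = 0.0
learning_rate = 0.1

def predict(inputs):
    return 1.0 / (1.0 + np.exp(-(inputs @ w + b)))  # sigmoid output

# Training: plain gradient descent on the cross-entropy loss.
for epoch in range(200):
    p = predict(X[train])
    w -= learning_rate * X[train].T @ (p - y[train]) / len(train)
    b -= learning_rate * np.mean(p - y[train])

# Verification: classification accuracy on the held-out split.
accuracy = np.mean((predict(X[val]) > 0.5) == y[val])
print(f"held-out accuracy: {accuracy:.3f}")

Real tool flows replace every step of this skeleton with far more elaborate machinery (curated and augmented data-sets, deep networks, distributed training, systematic regression of accuracy and robustness), but the shape of the loop, train and then verify, is the same.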
So the fourth wave seems to be here now, but we cannot yet hope to see its ultimate impact on the world.