
Will artificial intelligence subvert the EDA industry?

The transformation of the Electronic Design Automation (EDA) field has spanned decades. Now artificial intelligence may alter the semiconductor development process and compel changes in how chips are designed.

Generative artificial intelligence has already upended the search domain and is reshaping the landscape of computing; now it promises to revolutionize the EDA sector as well. Yet despite the buzz and proclamations of imminent radical change, it remains unclear where artificial intelligence will have an impact and how profound those changes will be.

EDA primarily serves two functions: automation and optimization. Many of the optimization problems are NP-hard, meaning that optimal solutions cannot be found in polynomial time, especially as designs grow in scale. Over time, heuristic methods have been developed that achieve "good enough" results within a reasonable timeframe. While it is conceivable that artificial intelligence could deliver comparable results, or results closer to optimal, that would be more an evolution than a disruption for design.
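As a concrete illustration of such a heuristic, the sketch below applies simulated annealing, a classic "good enough" method, to a toy one-dimensional placement problem: minimize the total span of each net over a row of cells. The cost function, cell names, and parameters here are invented for illustration; real placers work in two dimensions with far richer cost models.

```python
import math
import random

def wirelength(order, nets):
    """Span-based cost on a 1-D placement: for each net, the distance
    between the leftmost and rightmost cell it connects."""
    pos = {cell: i for i, cell in enumerate(order)}
    return sum(max(pos[c] for c in net) - min(pos[c] for c in net)
               for net in nets)

def anneal(cells, nets, steps=20000, t0=5.0, cooling=0.9995, seed=0):
    """Simulated annealing: propose random swaps, always accept
    improvements, and accept worsening moves with a probability
    that shrinks as the temperature cools."""
    rng = random.Random(seed)
    order = list(cells)
    cur = best = wirelength(order, nets)
    best_order, t = list(order), t0
    for _ in range(steps):
        i, j = rng.randrange(len(order)), rng.randrange(len(order))
        order[i], order[j] = order[j], order[i]        # propose a swap
        new = wirelength(order, nets)
        if new <= cur or rng.random() < math.exp((cur - new) / t):
            cur = new                                  # accept the move
            if cur < best:
                best, best_order = cur, list(order)
        else:
            order[i], order[j] = order[j], order[i]    # revert the swap
        t *= cooling
    return best_order, best
```

Like any heuristic, this only promises a good result within the step budget, not the optimum; that "good enough in reasonable time" trade-off is exactly what AI approaches aim to improve upon.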


Disruptive innovation often leads to market changes. A hypothetical question might be: "If EDA could provide optimal results in zero time, how would the semiconductor industry be affected?" Time to market would be accelerated, and designs would have slightly better performance, power consumption, and area (PPA). However, whether this is sufficient to significantly increase the number of design initiatives or open up new markets is currently unclear.

Under these hypothetical conditions, design creation and verification would still be the limiting factors. Generative artificial intelligence may be able to improve this, and there are encouraging signs that it can. If the time for design and verification is significantly reduced, it will almost certainly create new markets.

Over the past few decades, the EDA field has also experienced disruptive changes, but the issue is that these changes often become apparent only after they occur. "In some cases, people knew that change was coming, just as Kodak knew about digital photography, but they just couldn't bring it to market," said Prith Banerjee, Chief Technology Officer at Ansys. "Innovation has three levels. The first level is short-term innovation: what features should the next version of the tool have? We know these features because they already exist in the market. You are listening to the market, you are watching competitors. About 70% to 80% of the investment within large companies is focused on the first level."

The second level involves adjacent fields. "For example, you are selling a product designed for on-premises deployment and want to move it to the cloud," Banerjee added. "Innovation is necessary, but we will figure it out and succeed."

Many disruptive changes based on computation fall into this category. "Computer memory used to be very small, and then it grew larger and larger," said James Scapa, founder and CEO of Altair. "We changed the way one of our tools worked, and that innovation had a disruptive impact on the market. Essentially, we put all the models into memory. This change means we are about 30 times faster than our competitors. Similar changes have occurred in HPC. The business model associated with cloud computing will be one of the major changes in the EDA field, and it will bring a certain degree of disruption of its own. It is important to recognize developments in computing, understand where computing is heading, and know how to utilize computing resources."

Another similar change is still underway. "Think about parallel computing," said Jan Rabaey, professor emeritus in the Department of Electrical Engineering and Computer Sciences at UC Berkeley and Chief Technology Officer of IMEC's System Technology Co-optimization department. "People used to say parallel computing was a bad idea because we didn't know how to compile for it. Instead, we should use a single processor and make it as fast as possible. Then power consumption became a problem, and we couldn't make processors any faster. So suddenly parallel computing became a good idea, and that was a change."

The remaining 10% of investment goes to third-level innovation. "This is not part of your current R&D, nor is it aimed at the existing market," said Banerjee of Ansys. "A classic example is when Apple launched the iPhone, which was disruptive. Amazon launched AWS, their web services, which was also disruptive. How do large companies achieve disruptive innovation? It's not accidental; it requires a process, and you need to seek out the places that can trigger innovation. Those places are academia and startups. You should continuously monitor what startups are doing, and then form a central R&D team to try to invent something yourself. But this central team does not have to invent everything. Some of it is done organically, and some of it is about bringing technology into your company."

Looking back, we can see the disruptive changes that have already occurred in the EDA field. "If I go back to the 1980s, we saw a series of ideas, initially proposed by academia and startups, that changed our design methods," said Rabaey of UC Berkeley. "EDA began to use standard cells to drive design. When you first saw it, it seemed like a bad idea, because it was very limited: you had to put the cells in rows, and so on. But it made automation possible. This led directly to logic synthesis, and we could start thinking about logical functions, optimizing them, and having a set of tools that transform high-level descriptions into an automated flow. We take this for granted today. Other areas, such as simulation, verification, and behavioral synthesis, have all eventually produced some form of disruption."

In the past 20 years there has been little disruptive change in the EDA field; the industry has essentially been on a linear path. However, as Moore's Law gives way to multiple chiplets stacked in a package rather than planar, monolithic designs, this is changing rapidly.

"When the status quo is not good, disruptive change is more likely to happen," said Chuck Alpert, a research scientist at Cadence. "Think about the design team. They may know what's wrong. Maybe the engineering budget is out of control, or they are trying a new design but don't have the engineering capability. They have to do something disruptive. Today, we see increasing design complexity and a lack of scalability. The design team will encounter something that forces them to innovate. These are situations where the status quo is bad or declining. For EDA companies, this situation may occur when you are not the market leader. You are behind and have to do something disruptive to catch up. Or you may have always been the market leader, but the codebase is written in COBOL, which no one knows anymore. You will have to make changes because the trend is declining, and you are in a situation where you will die if you do not innovate."

Opportunities for innovation exist, especially within a culture of innovation. "The emergence of artificial intelligence and large language models can bring a lot of change, as can cloud computing with its rapid scaling," said Scapa of Altair. "Business models, not just technology, are part of disruption. For startups in the EDA field it's really difficult, because two companies dominate the market so heavily. They have been acquiring and eliminating startups and competitors for a long time, and that hinders innovation."

By looking to the future, some pressures can be identified and addressed. "What is the cycle of disruption?" asked Rabaey. "Many of these cycles have already emerged. The advantage of a roadmap is that you can identify the problems likely to arise over the next 10 years. This is where academia excels: studying these roadmaps and identifying the new paradigms that may arise from them. For example, scaling will continue for 5, perhaps even 10, years. What should we do then? Disruption is not something you choose. The only time you take the path of disruption is when you hit a wall, when you suddenly realize, 'I can't go any further.' We must rethink our design methods. One possibility is to start thinking about the third dimension, where you stack different technologies on top of each other. The simplest approach is to map old architectures onto it, but that won't bring you much benefit. You have to rethink how to use it."

Sometimes, change is externally enforced. "Design is shifting from chips to systems," said Banerjee. "If the goal is to design an electric car, my requirements are not just RTL input. My requirements are for an electric car that can accelerate from 0 to 60 mph in one second, have a range of 500 miles, and achieve level-five autonomy. These are my requirements. The EDA industry focuses on designing chips. You have to design the power electronics, which is a power-electronics simulation, combined with battery and motor design, and then the aerodynamic loads. It's a multiphysics world, very complex. Then you also need software, which must be written from system-level specifications, automatically compiled, and then verified."

Artificial Intelligence in EDA

EDA companies have rapidly adopted some forms of artificial intelligence in their tools. "Reinforcement learning is being used to solve optimization problems," said Stelios Diamantidis, Senior Director of AI Solutions at Synopsys. "People are now using reinforcement learning for experimentation, data collection, establishing better metrics to drive optimization, and automating these optimizations. The technology itself can be applied to other problems. We started with optimizing physical layouts and floor plans, clocks in certain topologies, DTCO, and other physical-type applications. Since then, we have applied this principle to issues such as verification, where reordering tests or changing seeds can help you accelerate coverage or track down errors, and reordering vectors in testing can help you achieve manufacturing test coverage more quickly."
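The test-reordering idea Diamantidis mentions can be illustrated with a greedy set-cover heuristic: always run next the test that hits the most not-yet-covered points, so coverage ramps as fast as possible. This is a hypothetical sketch, not Synopsys's actual algorithm; the test names and coverage sets below are invented.

```python
def reorder_tests(coverage):
    """Greedy coverage-driven test ordering.

    `coverage` maps each test name to the set of coverage points it
    exercises. Returns (order, covered): the reordered test list and
    the union of everything it covers.
    """
    remaining = {name: set(points) for name, points in coverage.items()}
    covered, order = set(), []
    while remaining:
        # Test with the largest marginal coverage gain.
        name = max(remaining, key=lambda t: len(remaining[t] - covered))
        if not (remaining[name] - covered):
            # No test adds anything new; append the rest in a stable order.
            order.extend(sorted(remaining))
            break
        covered |= remaining.pop(name)
        order.append(name)
    return order, covered
```

Running tests in this order front-loads coverage, which matters when a regression run may be cut short; the same greedy framing applies to reordering manufacturing test vectors.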

However, artificial intelligence is unlikely to replace existing EDA tools. "I believe we have excellent EDA products, and our customers are using these products, so the status quo is positive," Alpert said. "If we decide to use artificial intelligence to create new products, we will pay a huge price. Perhaps in the long run, we will get some benefits. If we let the entire product team say, let's start over and build something new, it will be very painful. Ultimately, you may succeed, but at the same time, you will pay a huge price."

The key for the EDA industry is to maintain continuity and ensure that customers have the tools they need to launch their next products. "We must protect our $2 billion business," Banerjee said. "A startup starts from scratch, but customers still find it difficult to accept new technologies to solve their problems. This is not just a challenge for EDA but for the entire industry, which is why I look to the third level: collaborating with startups, and then acquiring the ones that already have the technology."

Alpert agrees with this. "Disruptive technology is difficult to deal with for almost all industries, not just EDA. They can invest some resources, but not too much. Or they can wait for others to innovate and buy it, which is another strategy."

But where have all the startups gone? "In the past 10 or 20 years, the existing ecosystem has collapsed," Rabaey said. "There was a time when EDA had a vibrant research space. You could go to any top university and find groups researching tools. Now you can't find them anymore; they don't exist. Scholars can still publish papers, but they won't build the product. The role of startups is important, too. In the '90s this was a vibrant world, and it was these small companies that came up with ideas and tried them, but that has also collapsed. However, the ecosystem may rise again."

The Impact of GenAI

A large amount of investment has poured into GenAI, but much less in the EDA field. "GenAI is real and will bring us tangible results," Scapa said. "But there is too much hype, and the amount of investment does not match the returns we see today. GenAI will first decline, then rise slowly, because GenAI is a real big business. We are also doing some interesting things with traditional machine learning, which also has great potential."

But the real potential of GenAI in EDA may lie somewhat outside the tools themselves. "EDA does not create designs," Rabaey said. "But it is driven by design considerations. AI will become a disruptive part of the design process. AI will be a design tool that helps us explore the enormous space of choices."

The second generation of generative artificial intelligence is tackling automation issues. "Specifically, it's about some key industrial challenges," said Diamantidis of Synopsys. "This is more about economic and geopolitical pressures, talent availability, and the ability to do more with fewer resources. In the second wave, we can access data or design environments, and we can leverage that data to train models at very large scale. Then we can contextualize them to fit the different tasks specific to designers' activities. We are indeed solving human-computer interaction issues. We can now explore extreme complexity."

Perhaps the greatest return on investment for GenAI is productivity. "We are committed to guiding people through the development process, helping them enhance their problem-solving capabilities with generative artificial intelligence," said Erik Berg, a senior principal engineer at Microsoft. "Where does this data come from? I believe the richest source of data we have is in the minds of our engineers. The tools I am building not only provide solutions for our engineers, but also capture that data and those results from their minds as they work."

This is happening in many areas of the design community. "GenAI can definitely help non-expert users get better," said Vidya Chhabria, an assistant professor at Arizona State University. "It can help them ask the right questions, more thoughtful questions, and get up to speed quickly on new designs and new EDA tools. Perhaps it can also help expert users work more efficiently or faster."

But does this amount to disruption? "Despite all this technology, it still takes four years to put a chip in a socket," said Diamantidis. "I'm talking about the entire process of gathering requirements, architectural exploration, design entry, verification, test insertion, and preparing the instrumentation for silicon diagnostics and data mining. This takes a great deal of manpower, money, and time, which means it has not really changed the fundamentals or the economics of the semiconductor field."

Conclusion

Disruption is difficult, and it is often not detected until it becomes obvious. Many people are watching the progress of technology, the changes in design practices, and the shift from chips to systems, and everyone believes that some form of artificial intelligence may help address these issues. But as things stand today, none of it looks disruptive yet.
