News & Events

Weighing Chip-Design-Verification Challenges for MedTech

Safety and security are among the biggest and most complex chip-design-verification challenges facing medical technology (MedTech) applications. Acknowledging this, Lucio Lanza, managing partner of Lanza techVentures, asked panelists at SEMICON West 2022 this question: How must verification change as MedTech and other new applications retarget existing chips in new ways?

Dave Kelf, CEO of Breker Verification Systems, explained verification’s three axes.

The first is the way verification is done with large chips—a move from simulation alone toward leveraging different technologies.

The second is the verification requirement—previously functional verification and the final test of the chip. Now it includes more requirements, such as SoC integration—making sure cache coherence and integration issues are addressed—as well as challenges such as safety and security.

The third axis is the number of different applications, he said. It was communications, consumer, and computer electronics 20 years ago. Now it’s MedTech, automotive, and other new applications with large devices that need to be verified. Safety and security become important.

Kelf concluded by noting that instead of thinking about verification as bottom-up, it must shift to a top-down paradigm that leverages the original specification of the whole design as a direct comparison with the operation of the final implementation.

Mike Chin, principal software engineer at Intel, spoke about the challenges he’s facing. Chin works in IP verification and ensures the IP adheres to industry standards. Working for a chip provider means he needs to provide a product that is going to lead the advancement and usage of that chip for years. The key to making sure these chips can withstand the test of time comes back to specifications.

Chin looks at the chip's top-level IP specification and its functionality to ensure the IP adheres to those standards. “It’s a difficult problem as design complexity grows,” he said.

Lu Dai, senior director of engineering at Qualcomm and Chairman of Accellera, spoke about industry standardization and, as an engineering director, how he sees new applications of the same chip in a new node.

“I will start from the physicians’ standards,” Dai said. “MedTech for IC designers is a relatively new area. Some of us have been involved, but the majority are targeting compute and cell phone markets. From a standards perspective, MedTech will introduce new challenges for safety, security, and reliability. All present a different requirement from our traditional IC.”

Dai used cell phone chips applied to different markets as an example of reuse and addressed how to reuse existing chips and technology to apply them to the market.

“A chip was on the Mars Rover and Mars Helicopter,” he said, noting that he was the verification lead for that chip. “It had nothing to do with Mars. We were doing a cell phone chip that got retargeted for the Mars Rover. Yes, we did some additional work, mostly from the packaging side, but the core of the chip is the same as a cell phone chip. NASA was able to use it.”

Jan Vardaman, president of TechSearch International and a packaging expert, spoke last—and pointed to that fact: “I’m the last person to make a comment because that’s traditionally where packaging has been,” she said. Packaging is ignored until people realize that chip they’re designing needs to be put in a package, she added.

“We are in a new era where we’re going to design chips, and packaging will not be the last person talked to,” Vardaman said. “We’re going to be at the table in the front of all this because we are going to be figuring out with the chip designer how to make this thing work together.”

That’s because the chip is no longer just a single SoC, she explained. It’s IP blocks stitched together and reused for different applications in different ways for different performance requirements.

Those IP blocks must be verified and work together in whatever package is selected, in a much more complicated environment, she said.

Lanza chimed in next, noting that changing the system for different applications or areas could require new verification. Kelf agreed, saying, “If you take a cell phone chip that you can reboot by plugging in your USB device and put it on the Mars Rover, that thing’s going to have to last all the way to Mars.”

SoCs and multi-chip packages pose verification challenges, Chin added. If one chip changes, the SoC verification must be done all over again. A hidden problem is the execution speed of that verification, because it can’t be wrong in a medical device.

From a security point of view, this means that “even when we talk about 100% coverage, it depends on the definition of your coverage,” Dai said, noting that it is never really 100%.

And, he added, “any concerns about security, safety, or the legal aspect of autonomous driving grow exponentially when you apply it to MedTech. When you talk about people hacking into your car chip, they steal your car. They hack into your MedTech device, they steal your life.”

That’s a qualification-process concern, he concluded. A verification spec for safe and secure acceptance can be quite different, and experts from the medical community are needed to help engineers. Medical experts can tell whether a 0.1% coverage gap is an acceptable risk.

Kelf jumped in to point out that the automotive industry is bringing accounting and insurance into the verification equation. He described a technical meeting at an automotive company that included an insurance accountant.

The company, well advanced in terms of electronic designs, understood that nothing can be 100% failsafe. Someone in the meeting pointed to a graph from Kelf’s presentation that showed a security issue projected to occur with a 1% to 2% probability. The adjuster looked at the curve and translated it into insurance-rate adjustments, figuring the liability given a 1% chance of the vehicle crashing.

Liability is built into the technical side of the verification plan to make sure it can cover those scenarios. The liability is the key factor in judging verification effectiveness in these applications.

Kelf asked if the same will be done for MedTech. Of course, Vardaman replied; it’s a zero-defect strategy.

If you take automotive liability to MedTech, Lanza remarked, we’re back at the medical center. From a security point of view, remote access to healthcare is unsafe, and we need to think about it more carefully.

Before we talk about the safety and security of remote healthcare, added Dai, we need to think about the robustness of the wireless or wired connections that are involved. If a connection in MedTech goes down in the middle of an operation, the default behavior of a system is to retry, which is dangerous in MedTech environments.

Therefore, he determined, a device qualified for MedTech, whether it’s for safety, security, or packaging, will likely qualify for other uses due to its higher level of qualification requirement.

Chin asked if the non-medical market user will pay for that same level of qualification. For example, will a smartphone manufacturer be willing to pay a premium for that extra level of qualification? Semiconductor companies will need to check the economics and determine how much a user is willing to pay for safety and reliability based on the end market application.

An engineering group that plans verification to the MedTech level amortizes the cost, countered Kelf. When the chip is sold to MedTech, it’s qualified for MedTech. When it’s sold as a cell phone chip, it still qualifies for MedTech, and the incremental verification cost is lower.

While there was no disagreement among the panelists, they were clear about new applications like MedTech: these applications will move verification and packaging engineers, as well as verification-tool providers, into new, untested territory due to the need for nearly failsafe safety, security, and reliability.

By EETimes