

              Nvidia's New Software Will Help Chipmakers Produce Even More Powerful AI

              The artificial-intelligence (AI) hype is at a fever pitch right now, but the advent of generative AI over the past four months is no fluke. AI companies have been working on these breakthroughs for a long time, and with the introduction of self-learning transformer engines and the unveiling of ChatGPT last fall, it appears AI may now be at an inflection point.

              The thing about AI is that its growth possibilities aren't linear, but exponential. Based on some recent announcements, AI's learnings and capabilities continue to compound, just as interest does on good investments.

              One eye-opening example of compounding AI capability was unveiled at Nvidia's (NVDA -0.46%) GTC conference last week. The company introduced a new software library for something called "computational lithography" -- a key step in the manufacturing process for Nvidia chips themselves.

              The AI breakthrough could lead to faster and cheaper scaling of leading-edge chips. That means this AI innovation will enable the manufacturing of even more powerful AI chips in the future, which will in turn enable -- well, you get the idea.

              What is computational lithography?

              You may have heard of Moore's Law, named after Intel co-founder Gordon Moore, who correctly predicted that the number of transistors able to fit on a chip would double roughly every two years, steadily cutting the price of computing with each generation.
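To see what that doubling implies, here is a minimal sketch of Moore's Law as an exponential curve. The starting point (the Intel 4004's roughly 2,300 transistors in 1971) and the two-year doubling period are illustrative reference assumptions, not figures from this article:

```python
# Illustrative only: Moore's Law modeled as exponential doubling.
# Starting count of 2,300 is the Intel 4004 (1971), a common reference point.

def transistors_after(years: float, start: float = 2_300,
                      doubling_period: float = 2.0) -> float:
    """Estimated transistor count after `years`, doubling every `doubling_period` years."""
    return start * 2 ** (years / doubling_period)

# 52 years of doubling every two years: 2,300 * 2**26, roughly 150 billion --
# the same order of magnitude as today's largest chips.
print(f"{transistors_after(52):.3e}")
```

The point is the compounding: 26 doublings turn a few thousand transistors into tens of billions.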

              However, Nvidia CEO Jensen Huang recently declared Moore's Law "dead." His conclusion is that leading-edge chip manufacturing is now pushing up against the laws of physics, which makes it incredibly expensive to scale to the next node, outweighing the efficiency gains.

              Yet this is not to say chip scaling is over; it's just getting harder. Fortunately, Nvidia just unveiled a solution to one major chipmaking bottleneck: "computational lithography."

              Lithography is the process by which a transistor design is "printed" on a wafer. In lithography, a light source is shone through a "mask" with the wafer design on it, and then beamed through several lenses to shrink the light pattern to microscopic proportions, which then hits the wafer.

              However, the size of transistors has shrunk to just a few nanometers, and designs are becoming vastly more complex and precise. In fact, Nvidia's new Hopper H100 packs a whopping 80 billion transistors onto a single chip.

              Transistor designs that complex make it difficult to produce the lenses, photomasks, and light sources needed to print leading-edge designs without any flaws. A single imperfection can lead to problems in chip quality, limiting yield and damaging the economics of leading-edge production.

              The typical way leading chipmakers have designed their more complex photomasks has been through something called computational lithography. In this process, a large computer crunches millions of variables to output the correct mask design for a given wafer architecture.

              Graphic showing the four layers of lithography.

              But therein lies another problem. The greater the complexity of the chip, the more intensive the computing workload needed for the computational lithography step. That means chipmakers need more and more data centers to process computational lithography, further driving up costs. For instance, according to industry sources, computational lithography for 3nm chips -- the most advanced chips coming out this year -- requires about 100 times the CPU hours running Optical Proximity Correction (OPC) software as 10nm chips, which were the leading edge just a few years ago.
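To make that compounding concrete, here is a hypothetical sketch of how OPC compute might grow across nodes. The only anchored figure is the article's ~100x total from 10nm to 3nm; spreading that growth evenly across the three node transitions is an illustrative assumption:

```python
# Hypothetical illustration of OPC compute growth across process nodes.
# Only the ~100x total from 10nm to 3nm comes from the article; the even
# per-transition growth factor is assumed for illustration.
NODES = ["10nm", "7nm", "5nm", "3nm"]

steps = len(NODES) - 1        # three transitions: 10 -> 7 -> 5 -> 3
growth = 100 ** (1 / steps)   # ~4.6x more CPU hours per transition

hours = 1.0
for node in NODES:
    print(f"{node}: ~{hours:,.0f}x baseline CPU hours")
    hours *= growth
```

Even a modest ~4.6x jump per node compounds to two orders of magnitude in three generations, which is why the CPU-based approach stops scaling economically.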

              Nvidia's new solution: cuLitho

              Last week at its GTC event, Nvidia provided a much-needed breakthrough. The company unveiled cuLitho, a new software library designed for computational lithography. And whereas legacy computational lithography software typically ran on CPUs, cuLitho is optimized to run on Nvidia's GPU-accelerated supercomputers.

              CEO Jensen Huang claimed that cuLitho, running on 500 DGX H100 supercomputers, could replace 40,000 CPU-based systems, allowing chipmakers to produce five times the photomasks per day using one-ninth the power -- around a fortyfold improvement over current processes. That means an intricate photomask that used to take two weeks to process can now be done in hours.
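A quick back-of-the-envelope check shows how those quoted figures relate to each other. These are Nvidia's claims as reported above, not independent measurements:

```python
# Sanity-checking the quoted cuLitho figures (Nvidia's claims, not measurements).
cpu_systems, gpu_systems = 40_000, 500

consolidation = cpu_systems / gpu_systems   # 80x fewer systems needed
throughput_gain = 5                         # 5x photomasks per day
power_reduction = 9                         # one-ninth the power

# Throughput gain times power reduction: 5 * 9 = 45x performance per watt,
# consistent with the "around a fortyfold improvement" framing.
perf_per_watt = throughput_gain * power_reduction
print(consolidation, perf_per_watt)
```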

              Graphic showing modules for Nvidia cuLitho software.

              This could be a big deal in terms of enabling the next wave of chip production. As part of the presentation, Nvidia said it's working with foundry leader Taiwan Semiconductor Manufacturing (TSM 0.17%), along with dominant lithography giant ASML Holding (ASML -0.67%) and leading OPC software vendor Synopsys (SNPS -1.08%) on implementing cuLitho.

              TSMC plans to deploy cuLitho in June for its leading-edge production, while Synopsys CEO Aart de Geus said cuLitho has "massively accelerated" the performance of its software. And ASML CEO Peter Wennink noted the innovation will be especially relevant as ASML rolls out its high-NA extreme ultraviolet lithography (EUV) machines, the next generation of EUV to be used at the 2nm node.

              Future implications

              Nvidia's H100 chips are themselves built on leading-edge nodes by TSMC, enabled by ASML's EUV machines and Synopsys software. Therefore, Nvidia's cuLitho innovation will in turn facilitate production of future Nvidia GPUs, which will be even more powerful than the H100.

              This is a striking instance of AI begetting better future AI. We've all seen how powerful GPT-4 is, so the momentum behind AI innovation seems to be building, potentially leading to revolutionary breakthroughs and lots of disruption in the years ahead.