
Gartner predicts that global AI chip revenue will grow 33% in 2024

It’s no secret that the AI accelerator business is hot these days, with semiconductor manufacturers aggressively developing neural processing units and AI PC initiatives bringing more powerful processors to laptops, desktops, and workstations.

Gartner studied the AI chip industry and found that global AI chip revenue is expected to grow 33% in 2024. Specifically, the Gartner report “Forecast Analysis: Global AI Semiconductors” details the competition among hyperscalers (some of which are developing their own chips while also looking to semiconductor suppliers for help), use cases for AI chips, and the need for on-chip AI accelerators.

“In the long term, AI-based applications will move from data centers to PCs, smartphones, edge and endpoint devices,” Gartner analyst Alan Priestley wrote in the report.

Where have all these AI chips gone?

Gartner predicts that total AI chip revenue will reach $71.3 billion in 2024 (up from $53.7 billion in 2023), rising to $92 billion in 2025. Of the total AI chip revenue, computer electronics could account for $33.4 billion, or 47% of all AI chip revenue, in 2024. Other sources of AI chip revenue will be automotive electronics ($7.1 billion) and consumer electronics ($1.8 billion).

Of the $71.3 billion in AI semiconductor revenue in 2024, the majority will come from discrete and integrated application processors, discrete GPUs, and microprocessors for compute, rather than embedded microprocessors.

Discrete and integrated application processors will have the fastest growth in device AI semiconductor revenue in 2024. Image source: Gartner

In terms of AI semiconductor application revenue in 2024, most of it will come from computing electronics, wired communications electronics, and automotive electronics.

Gartner has noted a shift in computing requirements from initial AI model training to inference, the process of running a trained AI model against new data. Gartner predicts that by 2028, more than 80% of workload accelerators deployed in data centers will be used to perform AI inference workloads, an increase of 40% from 2023.
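To make the training-versus-inference distinction concrete, here is a minimal illustrative sketch (not from the Gartner report): training iteratively adjusts a model’s parameters against known examples, while inference simply applies the already-fitted parameters to new input. The toy model `y = w * x` and its helper functions are invented for illustration.

```python
# Illustrative only: training vs. inference for a trivial model y = w * x,
# fitted by gradient descent on squared error.

def train(data, lr=0.01, epochs=200):
    """Training: repeatedly adjust the weight w to fit known (x, y) pairs."""
    w = 0.0
    for _ in range(epochs):
        for x, y in data:
            grad = 2 * (w * x - y) * x  # derivative of (w*x - y)^2 w.r.t. w
            w -= lr * grad
    return w

def infer(w, x):
    """Inference: apply the already-trained weight to a new input."""
    return w * x

w = train([(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)])  # learns w close to 2
print(round(infer(w, 10.0)))  # applies the trained model; prints 20
```

Training is the compute-heavy phase (many passes over the data), which is why inference-oriented accelerators can be simpler and cheaper, and why the workload mix shifting toward inference matters to chip vendors.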

SEE: Microsoft’s new Copilot+ PCs will use Qualcomm processors to run AI on-device.

AI and workload accelerators go hand in hand

Gartner predicts that in 2024, AI accelerators in servers will become a $21 billion industry.

“Today, generative artificial intelligence (GenAI) is driving demand for high-performance AI chips in data centers,” Priestley said in a press release. “AI accelerators for servers (used to offload data processing from microprocessors) will be worth $21 billion by 2024 and grow to $33 billion by 2028.”

Gartner predicts that AI workloads will also require enhancements to standard microprocessing units.

“Many of these AI applications can be executed on standard microprocessing units (MPUs), and MPU vendors are extending their processor architectures with dedicated on-chip AI accelerators to better handle these processing tasks,” Priestley wrote in a May 4 forecast analysis of global AI semiconductors.

In addition, the rise of artificial intelligence technology in data center applications will drive demand for workload accelerators, with 25% of new servers expected to be equipped with workload accelerators by 2028, up from 10% in 2023.

The Dawn of AI-Powered Personal Computers?

Gartner is optimistic about AI-enabled PCs, which are designed to run large language models locally in the background on laptops, workstations and desktops. Gartner defines AI-enabled PCs as computers with neural processing units that allow people to use AI for “everyday activities.”

The analyst firm predicts that every PC purchased by an enterprise will be an AI PC by 2026. It’s unclear whether this prediction will be true, but hyperscalers will certainly be incorporating AI into their next-generation devices.

AI drives both competition and cooperation among hyperscalers

Today, AWS, Google, Meta, and Microsoft are all developing in-house AI chips while also buying hardware from NVIDIA, AMD, Qualcomm, IBM, Intel, and others. For example, Dell announced a new line of laptops that use Qualcomm Snapdragon X-series processors to run AI, while Microsoft and Apple are both looking to add OpenAI products to their hardware. Gartner expects the trend toward developing custom AI chips to continue.

Hyperscale providers are designing their own chips to gain better control over their product roadmaps, contain costs, reduce reliance on off-the-shelf silicon, exploit IP synergies and optimize performance for specific workloads, said Gaurav Gupta, an analyst at Gartner.

“Semiconductor chip foundries such as TSMC and Samsung provide technology companies with access to cutting-edge manufacturing processes,” Gupta said.

At the same time, “other companies like Arm and Synopsys provide advanced intellectual property that makes custom chip design relatively easy,” he said. Easier access to cloud resources and a cultural shift among semiconductor assembly and test service (SATS) providers have also made it simpler for hyperscalers to enter the chip design space.

“While chip development is expensive, using custom-designed chips can improve operational efficiency, reduce the cost of delivering AI-based services to users, and lower the cost for users to access new AI-based applications,” Gartner wrote in a press release.

