---
title: "Nvidia's latest AI chip will cost more than $30,000, CEO says"
description: "Nvidia CEO Jensen Huang announced that the company's next-generation AI chip, Blackwell, will be priced between $30,000 and $40,000 per unit. The chip is expected to be in strong demand for training and deploying AI software. The pricing range is similar to that of its predecessor, the H100. Nvidia's AI chips have already driven a sharp rise in sales and are widely used by leading AI companies and developers. The Blackwell AI accelerator will come in several versions and is scheduled to ship later this year."
type: "news"
locale: "zh-HK"
url: "https://longbridge.com/zh-HK/news/200203195.md"
published_at: "2024-03-19T16:52:09.000Z"
---

# Nvidia's latest AI chip will cost more than $30,000, CEO says

> Nvidia CEO Jensen Huang announced that the company's next-generation AI chip, Blackwell, will be priced between $30,000 and $40,000 per unit. The chip is expected to be in strong demand for training and deploying AI software. The pricing range is similar to that of its predecessor, the H100. Nvidia's AI chips have already driven a sharp rise in sales and are widely used by leading AI companies and developers. The Blackwell AI accelerator will come in several versions and is scheduled to ship later this year.

Nvidia's next-generation graphics processor for artificial intelligence, called Blackwell, will cost between $30,000 and $40,000 per unit, CEO Jensen Huang told CNBC's Jim Cramer.

"This will cost $30 to $40 thousand dollars," Huang said, holding up the Blackwell chip. "We had to invent some new technology to make it possible," he continued, estimating that Nvidia spent about $10 billion on research and development.

The price suggests that the chip, which is likely to be in hot demand for training and deploying AI software like ChatGPT, will sit in a similar range to its predecessor, the H100, or "Hopper" generation, which cost between $25,000 and $40,000 per chip, according to analyst estimates. The Hopper generation, introduced in 2022, represented a significant price increase for Nvidia's AI chips over the previous generation.

*Nvidia CEO Jensen Huang compares the size of the new "Blackwell" chip with the current "Hopper" H100 chip at the company's developer conference in San Jose, California. (Nvidia)*

Nvidia announces a new generation of AI chips about every two years. The latest, like Blackwell, are generally faster and more energy efficient, and Nvidia uses the publicity around a new generation to rake in orders for new GPUs. Blackwell combines two chips and is physically larger than the previous generation.

Nvidia's AI chips have driven a tripling of the company's quarterly sales since the AI boom kicked off in late 2022, when OpenAI's ChatGPT was announced. Most of the top AI companies and developers have been using Nvidia's H100 to train their AI models over the past year. For example, Meta said this year that it is buying hundreds of thousands of Nvidia H100 GPUs.

Nvidia does not reveal list prices for its chips. They come in several different configurations, and the price an end customer like Meta or Microsoft pays depends on factors such as the volume of chips purchased, and whether the customer buys the chips from Nvidia directly as part of a complete system or through a vendor like Dell, HP, or Supermicro that builds AI servers. Some servers are built with as many as eight AI GPUs.

On Monday, Nvidia announced at least three different versions of the Blackwell AI accelerator — a B100, a B200, and a GB200 that pairs two Blackwell GPUs with an Arm-based CPU. They have slightly different memory configurations and are expected to ship later this year.
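To give a rough sense of scale for the quoted numbers, the short Python sketch below multiplies the $30,000 to $40,000 per-unit range by a hypothetical order size. The 100,000-unit figure is an assumption for illustration only; the article states the per-unit price range but does not give Blackwell order volumes.

```python
# Back-of-envelope arithmetic: implied spend for a hypothetical GPU order
# at the Blackwell per-unit price range quoted in the article.
# Only the $30k-$40k range comes from the article; the order size is assumed.

UNIT_PRICE_LOW = 30_000       # USD per unit, low end quoted by Jensen Huang
UNIT_PRICE_HIGH = 40_000      # USD per unit, high end quoted by Jensen Huang
HYPOTHETICAL_UNITS = 100_000  # assumed order size, illustrative only

low_total = UNIT_PRICE_LOW * HYPOTHETICAL_UNITS
high_total = UNIT_PRICE_HIGH * HYPOTHETICAL_UNITS

print(f"Implied spend for {HYPOTHETICAL_UNITS:,} units: "
      f"${low_total / 1e9:.1f}B to ${high_total / 1e9:.1f}B")
```

Under these assumptions the sketch prints an implied spend of $3.0B to $4.0B, which illustrates why per-unit pricing at this level translates into multi-billion-dollar outlays for large buyers.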
### Related Stocks

- [NVDA.US - Nvidia](https://longbridge.com/zh-HK/quote/NVDA.US.md)
- [NVD.DE - NVIDIA Corporation](https://longbridge.com/zh-HK/quote/NVD.DE.md)
- [NVDL.US - 2x Long Nvidia ETF - GraniteShares](https://longbridge.com/zh-HK/quote/NVDL.US.md)

## Related News & Research

| Title | Description | URL |
|-------|-------------|-----|
| Jensen Huang teases "never-before-seen" new chips; the next-generation Feynman architecture may be the focus | Jensen Huang teased that this year's GTC conference will unveil brand-new chip products "the world has never seen". Analysts believe the new products may involve Rubin-series derivatives or more revolutionary Feynman-architecture chips, and the market expects the Feynman architecture to be deeply optimized for inference workloads. | [Link](https://longbridge.com/zh-HK/news/276310964.md) |
| Taking a page from Nvidia to stimulate chip sales, AMD guarantees borrowing for an "AI cloud" | To expand its market share, AMD has pulled out an aggressive financial move: it is guaranteeing startup Crusoe's $300 million chip-purchase loan and has promised to rent the chips itself as a backstop if Crusoe has no customers. This strategy, which replicates Nvidia's "GPU rental cloud" path, can lift sales in the short term, but it also leaves AMD facing greater risk should AI demand slow. | [Link](https://longbridge.com/zh-HK/news/276401504.md) |
| "Backing" AI deals: OpenAI is finalizing a new funding round, raising up to $100 billion at an $830 billion valuation | OpenAI is pursuing a new funding round at an $830 billion valuation, aiming to raise $100 billion. SoftBank plans to lead with $30 billion, Amazon and Nvidia may invest $50 billion and $30 billion respectively, and Microsoft plans to invest several billion dollars. This round is OpenAI's first since its corporate restructuring last autumn. | [Link](https://longbridge.com/zh-HK/news/276298180.md) |
| The LPDDR6 era arrives: AI demand is so strong that next-generation DRAM will hit the market sooner than expected | LPDDR6 offers 1.5x the performance of the previous generation and could enter commercial use as early as the second half of the year, with giants such as Nvidia, Samsung, and Qualcomm actively positioning themselves. Most HPC semiconductor design companies are now considering carrying LPDDR5X and LPDDR6 IP in parallel, especially in designs for advanced-node chips at 4 nm and below, where demand is emerging. | [Link](https://longbridge.com/zh-HK/news/276431575.md) |
| "This is hard, but I believe in you": Jensen Huang hosted SK Hynix engineers at a dinner last week, personally toasting them and urging "delivery of HBM4 without delay" | In a rare personal move, Nvidia CEO Jensen Huang hosted more than 30 engineers at a Korean restaurant in Silicon Valley, urging SK Hynix to deliver sixth-generation HBM4 on schedule. With Samsung shaking up the battle by shipping first, this supply-chain contest, which will decide the fate of the next-generation AI chip Vera Rubin, has reached fever pitch. | [Link](https://longbridge.com/zh-HK/news/276298764.md) |

---

> **Disclaimer**: This content is for reference only and does not constitute investment advice.