GPT-4 "The Ultimate Revelation": 18 trillion parameters, trained once for $63 million!

Wallstreetcn
2023.07.11 08:32

GPT-4 has been "open-sourced" again: SemiAnalysis has "unveiled" a trove of details about the model. Its parameter count is more than 10 times that of GPT-3, it adopts a Mixture-of-Experts (MoE) architecture, and it was trained on roughly 13 trillion tokens.
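
In an MoE model, a router sends each token to only a small number of expert feed-forward networks, so only a fraction of the total parameters is active per token. The sketch below is a generic, illustrative top-k MoE layer in PyTorch; the layer sizes, expert count, and routing scheme are assumptions chosen for demonstration and do not reflect OpenAI's actual implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class MoELayer(nn.Module):
    """Illustrative top-k Mixture-of-Experts feed-forward layer (not GPT-4's actual design)."""
    def __init__(self, d_model: int, d_ff: int, n_experts: int = 16, top_k: int = 2):
        super().__init__()
        self.top_k = top_k
        # Gating network: scores each expert for every token.
        self.router = nn.Linear(d_model, n_experts)
        # Each expert is an ordinary feed-forward block.
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(d_model, d_ff), nn.GELU(), nn.Linear(d_ff, d_model))
            for _ in range(n_experts)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (tokens, d_model). Each token is routed to its top-k experts only.
        gate_logits = self.router(x)
        weights, idx = gate_logits.topk(self.top_k, dim=-1)
        weights = F.softmax(weights, dim=-1)
        out = torch.zeros_like(x)
        for k in range(self.top_k):
            for e, expert in enumerate(self.experts):
                mask = idx[:, k] == e
                if mask.any():
                    out[mask] += weights[mask, k].unsqueeze(-1) * expert(x[mask])
        return out

# Example: 8 tokens with hidden size 512; 16 experts, 2 active per token.
tokens = torch.randn(8, 512)
layer = MoELayer(d_model=512, d_ff=2048)
print(layer(tokens).shape)  # torch.Size([8, 512])
```

The design choice this toy example illustrates is the one the leak attributes to GPT-4: total parameter count can grow with the number of experts while per-token compute stays roughly constant, since only the top-k experts run for any given token.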