'Fugaku-LLM', a large-scale language model with 13 billion parameters trained using the supercomputer 'Fugaku', was released on Friday, May 10, 2024. Fugaku-LLM is trained using its own training ...
Joint collaboration between Zyphra, AMD, and IBM delivers ZAYA1, the first large-scale Mixture-of-Experts foundation model trained entirely on an AMD platform using AMD Instinct MI300X GPUs, AMD ...