Inside Arcee’s Trinity Large: A 400B-Parameter U.S. Open Source MoE With a Rare Raw Checkpoint
Arcee, a San Francisco–based AI lab, has released what it positions as a new U.S.-made, frontier-scale open model: Trinity Large, a 400-billion-parameter mixture-of-experts (MoE) language model, alongside a rare raw 10T-token checkpoint, Trinity-Large-TrueBase. For AI researchers, ML engineers, and…