Mixture-of-Experts Inference

Cameron Rohn · Category: frameworks_and_exercises

Sparse Mixture-of-Experts models scale to over a trillion total parameters while activating only a small subset per token (e.g., roughly 32 billion active parameters). They remain impractical for local deployment, however, because every expert's weights must still be resident in memory even though only a few experts run for any given token.
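To make the sparsity concrete, below is a minimal sketch of top-k expert routing in PyTorch. The names (`MoELayer`, `num_experts`, `top_k`) and sizes are illustrative assumptions, not from any specific model; the point is that all experts' weights are allocated, but each token only exercises `top_k` of them.

```python
# Minimal top-k MoE routing sketch (illustrative, not a specific model's code).
import torch
import torch.nn as nn
import torch.nn.functional as F

class MoELayer(nn.Module):
    def __init__(self, d_model=64, d_ff=256, num_experts=8, top_k=2):
        super().__init__()
        self.top_k = top_k
        # Router scores each token against every expert.
        self.gate = nn.Linear(d_model, num_experts, bias=False)
        # Each expert is a small feed-forward block; all are held in memory,
        # but only top_k run per token.
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(d_model, d_ff), nn.ReLU(), nn.Linear(d_ff, d_model))
            for _ in range(num_experts)
        )

    def forward(self, x):  # x: (num_tokens, d_model)
        scores = self.gate(x)                           # (num_tokens, num_experts)
        weights, idx = scores.topk(self.top_k, dim=-1)  # choose top_k experts per token
        weights = F.softmax(weights, dim=-1)            # normalize over the chosen experts
        out = torch.zeros_like(x)
        for k in range(self.top_k):
            for e in range(len(self.experts)):
                mask = idx[:, k] == e                   # tokens whose k-th choice is expert e
                if mask.any():
                    out[mask] += weights[mask, k].unsqueeze(-1) * self.experts[e](x[mask])
        return out

# Example: 10 tokens flow through; 8 experts' weights sit in memory,
# yet each token's compute touches only 2 of them.
layer = MoELayer()
y = layer(torch.randn(10, 64))
```

This is why the total parameter count, not the active count, drives the memory footprint: the router's choices vary token by token, so no expert's weights can be evicted ahead of time.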