MetaSeq
by Meta
MetaSeq is Meta's library for training large-scale sequence models, originally forked from fairseq and used to train models like OPT (Open Pre-trained Transformer). It provides the infrastructure for efficient distributed training of language models with billions of parameters.
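For orientation, here is a minimal sketch of the distributed data-parallel pattern that libraries like MetaSeq build on. It uses plain PyTorch (`DistributedDataParallel`) rather than MetaSeq's own APIs, and the model and loss are stand-ins for a real transformer language model:

```python
# Illustrative only: the data-parallel training pattern underlying
# large-scale trainers like MetaSeq, written in plain PyTorch.
import os
import torch
import torch.distributed as dist
from torch.nn.parallel import DistributedDataParallel as DDP

def main():
    # torchrun sets RANK / LOCAL_RANK / WORLD_SIZE for each worker.
    dist.init_process_group(backend="nccl")
    local_rank = int(os.environ["LOCAL_RANK"])
    torch.cuda.set_device(local_rank)

    # Stand-in model; a real run would build a transformer LM here.
    model = torch.nn.Linear(1024, 1024).cuda(local_rank)
    model = DDP(model, device_ids=[local_rank])
    optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)

    for step in range(10):
        batch = torch.randn(8, 1024, device=local_rank)  # dummy data
        loss = model(batch).pow(2).mean()                # dummy loss
        optimizer.zero_grad()
        loss.backward()   # gradients are all-reduced across ranks
        optimizer.step()

    dist.destroy_process_group()

if __name__ == "__main__":
    main()
```

Launched with, for example, `torchrun --nproc_per_node=8 train_sketch.py`. MetaSeq layers sharding (FSDP) and Megatron-style tensor parallelism on top of primitives like these to reach billions of parameters.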
Specifications
- Context Window: 2,048 tokens
- Released: May 2022
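To illustrate the 2,048-token limit in practice, the sketch below loads a small OPT checkpoint through the Hugging Face transformers library (a separate project, not part of MetaSeq) and truncates the prompt so that prompt plus generated tokens stay inside the context window; the checkpoint name and token budget are just examples:

```python
# Example of respecting OPT's 2,048-token context window when generating.
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("facebook/opt-125m")
model = AutoModelForCausalLM.from_pretrained("facebook/opt-125m")

prompt = "Large language models are"
# Reserve 64 tokens for generation; truncate the prompt to fit the rest.
inputs = tokenizer(prompt, return_tensors="pt",
                   truncation=True, max_length=2048 - 64)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```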
Capabilities
- Text Generation
- Language Modeling
- Research Framework
Best For
Research into large-scale language-model training; reproducing or extending OPT-style models.
Related Models
Llama 3.3 70B
128K ctx · by Meta
Meta's latest open-source model offering performance comparable to Llama 3.1 405B at a fraction of the cost. Excellent for self-hosting.
Llama 3.2 Vision
128K ctx · by Meta
Multimodal model with vision capabilities available in 11B and 90B parameter sizes. Supports image understanding and reasoning.
Llama 3.1 405B
128K ctx · by Meta
Meta's largest open-source model with 405 billion parameters. Competitive with leading closed models on benchmarks.