Maia AI Microsoft
Microsoft's custom AI accelerator chip designed for cloud AI infrastructure
Maia 100 is Microsoft's first in-house AI accelerator chip, designed specifically for running large language model inference and training workloads in Azure data centers. Developed to reduce dependence on NVIDIA GPUs and to optimize for Microsoft's own AI workloads, including serving OpenAI models, Maia marks Microsoft's entry into custom AI silicon alongside similar efforts by Google (TPU), Amazon (Trainium), and Meta (MTIA). The chip is deployed in Azure's AI infrastructure and signals Microsoft's long-term investment in AI compute self-sufficiency.
Key Features
- Custom AI silicon
- LLM inference
- Azure integration
- Training support
- Microsoft-designed
Quick Info
- Category: AI Infrastructure & MLOps
- Pricing: Paid