A green data center is designed to minimize environmental impact through the use of renewable energy sources, efficient cooling technologies, and sustainable infrastructure.
An AI data center is specifically optimized to support artificial intelligence workloads, including training and inference. These facilities house GPUs or other AI-specific hardware for compute-intensive tasks.
Training in AI involves feeding data to a model and adjusting its parameters to learn patterns, enabling it to make accurate predictions or decisions on new data.
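To make the training mechanics concrete, here is a minimal sketch: a two-parameter linear model fit to toy data by gradient descent. The dataset, learning rate, and variable names are illustrative assumptions, not part of this glossary.

```python
# Minimal sketch of a training loop: feed data through the model,
# measure the error, and adjust parameters to reduce it.
import numpy as np

x = np.array([0.0, 1.0, 2.0, 3.0])   # toy inputs
y = np.array([1.0, 3.0, 5.0, 7.0])   # toy targets (generated by y = 2x + 1)

w, b = 0.0, 0.0                       # model parameters to be learned
lr = 0.05                             # learning rate

for step in range(500):
    pred = w * x + b                  # forward pass: model predictions
    err = pred - y                    # how far off the predictions are
    grad_w = 2 * np.mean(err * x)     # gradient of mean squared error w.r.t. w
    grad_b = 2 * np.mean(err)         # gradient w.r.t. b
    w -= lr * grad_w                  # adjust parameters toward lower error
    b -= lr * grad_b

print(f"learned w={w:.2f}, b={b:.2f}")  # approaches the true w=2, b=1
```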
A Tier-3 data center offers 99.982% availability, with redundant power and cooling systems that allow maintenance to be performed without shutting down operations.
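As a quick sanity check on that figure, 99.982% availability corresponds to roughly an hour and a half of downtime per year:

```latex
% Downtime implied by 99.982% availability over one year (8,760 hours):
\[
  (1 - 0.99982) \times 8760\,\mathrm{h} \approx 1.6\ \text{hours of downtime per year}
\]
```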
An LLM (Large Language Model) is an AI model trained on vast amounts of text data, capable of understanding, generating, and predicting human-like text across many languages and contexts.
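As a small illustration of that generate-and-predict behavior, the sketch below uses the Hugging Face transformers library and the gpt2 checkpoint; both are assumed choices, not named in this glossary.

```python
# Minimal sketch of LLM text generation (assumes `transformers` is installed).
from transformers import pipeline

# Load a small pretrained language model; "gpt2" is only an example.
generator = pipeline("text-generation", model="gpt2")

# The model predicts a plausible continuation of the prompt, token by token.
result = generator("A data center is", max_new_tokens=20)
print(result[0]["generated_text"])
```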
East-West traffic refers to data that moves laterally within a data center or corporate network. This includes server-to-server communications, data replication, backups, and inter-process communications.
DLC (Direct Liquid Cooling), typically implemented as direct-to-chip cooling, circulates liquid coolant through cold plates mounted on processors, removing heat more efficiently than air and supporting high-density data centers.
PUE (Power Usage Effectiveness) measures a data center’s energy efficiency by comparing total energy use to energy used by IT equipment. Lower PUE means higher efficiency.
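Written out, the definition is a simple ratio; the megawatt figures below are assumed purely for illustration:

```latex
% PUE: total facility energy divided by the energy delivered to IT equipment.
\[
  \mathrm{PUE} = \frac{\text{Total Facility Energy}}{\text{IT Equipment Energy}}
\]
% Illustrative example: a facility drawing 1.5 MW in total to power
% 1.2 MW of IT load has PUE = 1.5 / 1.2 = 1.25; an ideal facility
% approaches PUE = 1.0.
```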
HPC (High-Performance Computing) uses powerful servers and GPUs in data centers to process complex tasks like simulations, AI training, and big data analysis at high speed.
Inference in AI is the process where trained models make predictions or decisions on new data, applying learned patterns from the training phase to real-world tasks.
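Continuing the training sketch above, inference is just the forward pass with frozen parameters; the learned values and inputs here are illustrative assumptions.

```python
# Minimal sketch of inference: apply parameters learned during training
# to new data. No parameter updates happen at this stage.
import numpy as np

w, b = 2.0, 1.0                       # frozen parameters from the training phase

def predict(x):
    """Forward pass only: map new inputs to predictions."""
    return w * x + b

new_inputs = np.array([10.0, 20.0])   # data the model has never seen
print(predict(new_inputs))            # -> [21. 41.]
```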
Free cooling in data centers uses natural cold air or water to reduce energy consumption for cooling, lowering PUE and improving overall energy efficiency.
Checkpointing in AI training saves the model's state at regular intervals, allowing a run to resume after an interruption and reducing the time lost to system failures or crashes.
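A minimal sketch of that save-and-resume pattern follows; the file name, interval, and contents of the saved state are assumptions for illustration.

```python
# Minimal sketch of checkpointing: persist training state periodically
# so a crashed run can resume rather than restart from scratch.
import os
import pickle

CKPT = "checkpoint.pkl"  # illustrative checkpoint path

# Resume from the last checkpoint if one exists, else start fresh.
if os.path.exists(CKPT):
    with open(CKPT, "rb") as f:
        state = pickle.load(f)
else:
    state = {"step": 0, "weights": [0.0, 0.0]}

for step in range(state["step"], 1000):
    # ... one training step would update state["weights"] here ...
    state["step"] = step + 1
    if state["step"] % 100 == 0:      # save every 100 steps
        with open(CKPT, "wb") as f:
            pickle.dump(state, f)
```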
RDHx (Rear Door Heat Exchanger) cooling mounts a liquid-cooled heat exchanger on the back of server racks, removing heat from exhaust air as it leaves the equipment, reducing the need for traditional room air conditioning and improving efficiency.
An AI Factory is a data center optimized for AI workloads, using high-performance GPUs, efficient cooling, and scalable infrastructure for AI training and inference tasks.
A multinode AI workload distributes AI computational tasks across multiple computing nodes or servers to improve performance, scalability, and efficiency.
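The sketch below shows that distribute-then-combine pattern using PyTorch's torch.distributed package; the framework choice, backend, and launch command are all assumptions, since the glossary names none.

```python
# Minimal sketch of a multinode workload: each process computes on its
# own shard of the work, then partial results are combined across nodes.
# Launch across machines with, e.g.:
#   torchrun --nnodes=2 --nproc_per_node=4 this_script.py
import torch
import torch.distributed as dist

dist.init_process_group(backend="gloo")  # "nccl" is typical for GPU clusters

rank = dist.get_rank()                   # this process's id across the whole job
world = dist.get_world_size()            # total number of processes

# Each process does its share of the computation...
local_result = torch.tensor([float(rank)])

# ...then the partial results are combined across all nodes (here, summed).
dist.all_reduce(local_result, op=dist.ReduceOp.SUM)

if rank == 0:
    print(f"{world} processes, combined result: {local_result.item()}")

dist.destroy_process_group()
```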
North-South traffic refers to data that moves between an internal network and external networks. This includes traffic that flows from clients (such as users or external applications) to servers in a data center and vice versa.