DRAM price increase driven by AI

According to AFP sources, growing demand for memory in artificial intelligence applications is driving a "parabolic" rise in DRAM (Dynamic Random-Access Memory) prices, fueled by the need for greater memory capacity to handle the training and inference workloads of large language models (LLMs).

This scenario significantly impacts infrastructure costs, particularly for companies choosing on-premise solutions to retain control over their data and processes. Equipping servers with high memory capacities requires a larger up-front capital investment (CapEx).

For those evaluating on-premise deployments, the decision involves trade-offs between up-front and operating costs, data sovereignty, and compliance requirements. AI-RADAR offers analytical frameworks on /llm-onpremise to evaluate these implications.
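
The CapEx-versus-OpEx trade-off above can be sketched as a simple break-even calculation. All figures below (server price, DRAM price per GB, monthly operating cost, cloud rental rate) are hypothetical assumptions for illustration only, not data from the article:

```python
# Hypothetical break-even sketch: on-premise CapEx vs. cloud OpEx for an
# LLM inference server. Every price below is an illustrative assumption.

def onprem_cost(months: int,
                server_base: float = 30_000.0,    # server without DRAM (assumed)
                dram_gb: int = 1_024,             # memory capacity needed (assumed)
                dram_price_per_gb: float = 12.0,  # assumed post-spike DRAM price
                monthly_opex: float = 800.0       # power, cooling, admin (assumed)
                ) -> float:
    """Total cost of ownership after `months`: up-front CapEx plus running OpEx."""
    capex = server_base + dram_gb * dram_price_per_gb
    return capex + months * monthly_opex

def cloud_cost(months: int, monthly_rent: float = 3_500.0) -> float:
    """Equivalent cloud instance rental (assumed flat monthly rate)."""
    return months * monthly_rent

def breakeven_month(max_months: int = 120) -> int:
    """First month at which on-premise TCO drops below cloud TCO, or -1 if never."""
    for m in range(1, max_months + 1):
        if onprem_cost(m) < cloud_cost(m):
            return m
    return -1

print(breakeven_month())
```

Under these assumed numbers the on-premise option overtakes cloud rental after 16 months; a DRAM price spike raises the CapEx term and pushes that break-even point further out, which is the effect the article describes.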