C2i raises funding for energy efficiency in AI
C2i, an Indian startup, has raised $15 million to tackle the energy-consumption challenges of data centers running artificial intelligence workloads. The company is developing an approach it calls "grid-to-GPU," which aims to minimize the power lost as electricity travels from the electrical grid to the GPUs.
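The article does not detail how "grid-to-GPU" works, but the problem it targets can be illustrated: power passes through several conversion stages between the grid and the chips, and the losses compound multiplicatively. The stage names and efficiency figures below are generic assumptions for illustration, not data from C2i.

```python
# Illustrative sketch of compounding conversion losses between the grid
# and the GPU. All stage efficiencies are assumed, typical-range values.

stages = {
    "UPS": 0.96,                 # uninterruptible power supply (assumed)
    "PDU/transformer": 0.98,     # power distribution (assumed)
    "server PSU": 0.94,          # AC-to-DC power supply unit (assumed)
    "voltage regulators": 0.90,  # board-level DC-DC conversion (assumed)
}

# End-to-end efficiency is the product of the per-stage efficiencies.
end_to_end = 1.0
for name, eff in stages.items():
    end_to_end *= eff

print(f"End-to-end efficiency: {end_to_end:.1%}")
```

With these assumed figures, roughly a fifth of the power drawn from the grid never reaches the chips, which is the kind of loss a grid-to-GPU approach would try to reduce.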
The problem of energy consumption in AI data centers
Training and inference of AI models, especially large language models (LLMs), require substantial computing power. This translates into high energy consumption in data centers, which carries both a significant cost and a considerable environmental impact. Optimizing energy efficiency is therefore a priority for companies that develop and deploy AI solutions.
For those evaluating on-premise deployments, there are significant trade-offs between upfront capital (CapEx) and operational (OpEx) costs, with energy consumption a major component of the latter. AI-RADAR offers analytical frameworks at /llm-onpremise to evaluate these trade-offs.
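The CapEx/OpEx trade-off mentioned above can be sketched with a simple annualized-cost model: amortize the hardware purchase over its lifetime and add the yearly energy bill, scaled by the facility's PUE (Power Usage Effectiveness). All figures below are placeholder assumptions for illustration, not data from the article or from AI-RADAR's frameworks.

```python
# Minimal sketch of an on-premise annual-cost estimate, assuming
# straight-line depreciation and a flat electricity price.

def onprem_annual_cost(capex, lifetime_years, power_kw, pue,
                       price_per_kwh, hours_per_year=8760):
    """Amortized hardware cost plus one year of energy cost."""
    amortized_capex = capex / lifetime_years
    # PUE scales IT load up to total facility power (cooling, losses, etc.)
    energy_cost = power_kw * pue * hours_per_year * price_per_kwh
    return amortized_capex + energy_cost

# Example: a hypothetical 8-GPU server at ~6 kW IT load
cost = onprem_annual_cost(
    capex=250_000,       # purchase price in $ (assumed)
    lifetime_years=5,    # depreciation period (assumed)
    power_kw=6.0,        # IT load in kW (assumed)
    pue=1.4,             # facility PUE (assumed)
    price_per_kwh=0.12,  # electricity price in $/kWh (assumed)
)
print(f"Estimated annual cost: ${cost:,.0f}")
```

Even in this toy model, energy is a five-figure line item per server per year, which is why improvements in delivery efficiency (a lower effective PUE) feed directly into OpEx.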