AI Reaches for the Stars: Orbital and Space Data Centers
The relentless advancement of Large Language Models (LLMs) is driving exponential growth in data centers worldwide, and with it a soaring demand for energy. This pressure on the electrical grid is pushing infrastructure operators to explore alternative solutions, with some visions extending beyond Earth's boundaries altogether.
Into this scenario steps Orbital Inc., a Los Angeles-based startup that recently unveiled its plans to build data centers in space. Backed by Andreessen Horowitz (A16z), Orbital focuses on designing infrastructure dedicated to AI inference, the phase where trained models generate their outputs. The core idea is to harness abundant and "free" solar energy in space to power computational workloads for chatbots and agents, thereby circumventing the energy limitations affecting terrestrial infrastructures. Euwyn Poon, Orbital's founder and CEO, emphasizes that capacity on Earth is insufficient and the only viable direction is upwards, where solar energy is widely available but not yet utilized.
Architecture and Technical Challenges of an Orbital Cloud
Orbital's vision materializes as a mesh constellation of small satellites in low Earth orbit (LEO). Each satellite will be equipped with a GPU server rack, powered by solar panels roughly the size of a tennis court, complemented by radiative cooling panels of comparable size. The long-term goal is to deploy up to 10,000 fridge-sized satellites, each with approximately 100 kilowatts of power, to create a distributed cloud. This approach echoes similar initiatives, such as SpaceX's proposed AI Sat Mini.
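As a quick sanity check on scale, the aggregate power of the full constellation follows directly from the figures above. This is a rough upper bound only: actual deliverable power would depend on orbit geometry, eclipse periods, and panel degradation, none of which Orbital has published.

```python
satellites = 10_000            # long-term constellation target cited above
power_per_satellite_kw = 100   # per-satellite power budget cited above

# Aggregate power of the constellation, converted from kW to GW.
total_gw = satellites * power_per_satellite_kw / 1_000_000
# ~1 GW in aggregate under these figures, on the order of a large
# terrestrial data-center campus.
```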
Orbital's first test is scheduled for 2027, with the launch of a prototype satellite aboard a SpaceX Falcon 9. This mission will be crucial for validating GPU operations in orbit and running the first commercial inference workloads. However, Orbital's ambition confronts the same difficulties faced by other space data center projects. These include the necessity to dissipate every watt of "free" energy as heat via large radiators, the degradation of computational hardware due to radiation in LEO, and the complexity and high costs of regular maintenance in space.
The Focus on Inference and Deployment Implications
Orbital's distinctive strategy lies in its focus on inference workloads, distributed across a network of smaller satellites and independent GPU nodes, rather than large, tightly coupled systems. This design choice is driven by the nature of inference, which generally requires less computational intensity per request compared to AI model training, which relies on massive GPU clusters optimized for high throughput. Limiting each satellite's power to approximately 100 kilowatts significantly simplifies the design, an aspect that, according to Poon, engineers appreciate.
In this model, a user request is routed from a terrestrial data center to a ground station, which acts as a relay connecting satellites to the internet. The request is then transmitted to a satellite, and satellites communicate with each other via optical laser interlinks. Once the query is processed by an available GPU, the output is routed back to the user through the network. For those evaluating on-premise deployments or hybrid solutions, Orbital's approach, though extreme, highlights the search for innovative architectures to manage AI workloads, especially when data sovereignty and control over infrastructure are priorities. However, the latency of tens of milliseconds for a round trip in LEO makes this solution less suitable for applications requiring real-time responses, such as high-frequency stock trading.
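The "tens of milliseconds" figure can be sanity-checked with a back-of-the-envelope propagation estimate. The altitude and inter-satellite hop distances below are illustrative assumptions (Orbital has not published its orbital parameters), and the result covers propagation only, ignoring GPU processing, queuing, and ground-segment delays.

```python
C_KM_S = 299_792.458  # speed of light in vacuum, km/s

def one_way_delay_ms(distance_km: float) -> float:
    """Propagation delay in milliseconds over a straight-line path."""
    return distance_km / C_KM_S * 1000.0

altitude_km = 550.0              # assumed: a typical LEO altitude
isl_hops_km = [1000.0, 1000.0]   # assumed: two laser inter-satellite links

# Request path: up to a satellite, across the mesh, down to the ground
# station; the response retraces the same path, hence the factor of 2.
path_km = altitude_km + sum(isl_hops_km) + altitude_km
round_trip_ms = 2 * one_way_delay_ms(path_km)
# roughly 21 ms of propagation alone under these assumptions
```

Even this optimistic propagation-only figure lands in the "tens of milliseconds" range the article cites, which is why latency-sensitive workloads are a poor fit.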
Future Prospects and Challenges to Overcome
Euwyn Poon acknowledges the significant technical hurdles inherent in operating data centers in space. Radiation can cause errors in GPUs, thermal management is complex in the absence of air (requiring heat dissipation into space), and maintenance is extremely difficult. To address these obstacles, Orbital is exploring solutions such as radiation hardening for GPUs and the adoption of ammonia-based liquid cooling loops, in addition to focusing on reducing system weight to lower launch costs.
Despite the ambitious timeline (finalizing satellite designs by 2026, launching in 2027, and building a manufacturing facility in Los Angeles by 2028), Orbital's ability to operate reliably and at scale remains an open question. Industry experts, such as engineering physicist Andrew Côté, predict that space data centers will not be operational for at least another 10-20 years. Nevertheless, Poon expresses confidence in his company's engineering efforts to progress in solving these complex problems, aiming to serve large model labs like OpenAI and Anthropic by offering direct API access for token purchases or enterprise deals to shift inference demand to its space network.