World’s 1st LLM trained in space: From Earth to orbit on NVIDIA H100 GPUs

Updated on 11-Dec-2025
HIGHLIGHTS

Starcloud trains world's first LLM in space using NVIDIA H100s

Historic milestone as an AI model is successfully trained in orbit on NVIDIA hardware

Future of compute is moving energy-hungry AI training to space

The concept of “cloud computing” has just become literal in the most extreme way possible. In a historic first that signals a new era for artificial intelligence infrastructure, Washington-based startup Starcloud has successfully trained a Large Language Model (LLM) aboard a satellite orbiting 325 kilometers above Earth. This achievement proves that the high-performance computing required for modern AI can survive and function in the vacuum of space.


A Shakespearean debut from orbit

The mission began with the launch of the Starcloud-1 satellite aboard a SpaceX Falcon 9. Inside the refrigerator-sized spacecraft sat a piece of hardware never before flown in such an environment: a data-center-grade NVIDIA H100 GPU. The Starcloud team used this hardware to train Andrej Karpathy’s NanoGPT model on the complete works of Shakespeare. The result was an AI capable of generating text in the Bard’s distinct style while traveling at roughly 17,000 miles per hour.
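NanoGPT is a small Transformer, and its training code is beyond the scope of a news piece, but the core idea of character-level language modeling can be illustrated with a much simpler stand-in. The sketch below trains a toy bigram model (counting which character follows which) on a tiny hypothetical snippet of verse; it is not Starcloud's setup, merely a minimal illustration of learning to imitate a text's style character by character.

```python
# Toy character-level language model: a bigram frequency table.
# This is an illustrative stand-in, NOT NanoGPT (which is a Transformer);
# the corpus here is a tiny hypothetical snippet, not the complete works.
import random
from collections import defaultdict, Counter

corpus = (
    "shall i compare thee to a summers day "
    "thou art more lovely and more temperate "
)

# "Training": count how often each character follows each other character.
counts = defaultdict(Counter)
for a, b in zip(corpus, corpus[1:]):
    counts[a][b] += 1

def generate(start="s", length=40, seed=0):
    """Sample text by repeatedly drawing the next char from the bigram table."""
    rng = random.Random(seed)
    out = [start]
    for _ in range(length - 1):
        nxt = counts.get(out[-1])
        if not nxt:
            break
        chars, weights = zip(*nxt.items())
        out.append(rng.choices(chars, weights=weights)[0])
    return "".join(out)

print(generate())
```

Scaling this idea up (longer context windows, learned attention instead of raw counts) is essentially what separates a bigram table from a model like NanoGPT.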

Following the training run, the system executed inference on a version of Google’s open-source Gemma model. The AI sent a message back to mission control that acknowledged its unique position. It greeted the team with “Hello, Earthlings” and described the planet as a “charming existence composed of blue and green.” This successful communication confirmed that delicate tensor operations could be performed accurately despite the harsh conditions of low Earth orbit.

Engineering for the void


Putting a 700-watt GPU into orbit presented a massive thermal challenge. On Earth, these chips are cooled by complex water and air systems to prevent overheating. In space, there is no air to carry heat away through convection. Starcloud CTO Adi Oltean and his engineering team had to design a system that relies entirely on radiative cooling. This involves using large specialized panels to radiate the intense heat generated by the GPU directly into the freezing void of deep space.
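The scale of that radiative-cooling problem can be estimated from the Stefan-Boltzmann law. The sketch below sizes a radiator panel for a ~700 W heat load under assumed values (emissivity and panel temperature are illustrative guesses; Starcloud's actual panel design is not public).

```python
# Back-of-envelope radiator sizing for a ~700 W GPU in vacuum.
# Emissivity and panel temperature are assumed values, not Starcloud specs.
SIGMA = 5.670e-8          # Stefan-Boltzmann constant, W / (m^2 K^4)
power_w = 700.0           # heat load to reject (H100-class board power)
emissivity = 0.9          # assumed high-emissivity radiator coating
radiator_temp_k = 300.0   # assumed panel temperature (~27 deg C)

# Radiated flux per square metre: P/A = eps * sigma * T^4.
# The ~3 K deep-space background contributes negligibly and is ignored.
flux_w_per_m2 = emissivity * SIGMA * radiator_temp_k ** 4
area_m2 = power_w / flux_w_per_m2
print(f"Radiator area needed: {area_m2:.2f} m^2")
```

Under these assumptions the panel comes out to roughly 1.7 square metres, which shows why a single GPU already demands large dedicated radiator surfaces, and why the T^4 term makes running the radiator hotter so attractive.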

Beyond heat, the hardware had to be shielded from cosmic radiation. High-energy particles in space can flip bits in memory and corrupt the training process. The team implemented robust shielding and error-correction protocols to ensure the H100 could operate without the data corruption that typically plagues space-based electronics.
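Starcloud has not published its error-correction protocol, but the two halves of the problem can be sketched generically: a single-event upset flipping one bit of a stored value, and a classic mitigation, triple modular redundancy, masking it by majority vote.

```python
# Illustrative sketch (NOT Starcloud's actual protocol): a cosmic-ray-style
# single-bit flip in a 64-bit float, masked by triple modular redundancy.
import struct
from collections import Counter

def flip_bit(x: float, bit: int) -> float:
    """Flip one bit of a float's 64-bit representation, as an upset might."""
    (bits,) = struct.unpack("<Q", struct.pack("<d", x))
    return struct.unpack("<d", struct.pack("<Q", bits ^ (1 << bit)))[0]

def majority_vote(copies):
    """TMR: keep the value at least two of the three replicas agree on."""
    value, count = Counter(copies).most_common(1)[0]
    return value if count >= 2 else None

weight = 0.125                       # some model parameter
corrupted = flip_bit(weight, 62)     # flip a high exponent bit
replicas = [weight, corrupted, weight]
recovered = majority_vote(replicas)
print(corrupted, recovered)
```

Flipping an exponent bit turns a small weight into an astronomically large number, which is exactly why unprotected memory can silently wreck a training run; real systems typically use ECC memory rather than software voting, but the principle is the same.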

Solving the energy crisis

This project is more than just a technical stunt. It addresses the growing energy crisis facing the AI industry. Terrestrial data centers currently consume massive amounts of electricity and water. Starcloud CEO Philip Johnston argues that moving compute to orbit allows companies to tap into the sun’s limitless energy.

In suitable orbits, solar arrays can generate power almost continuously, free of weather and day-night interruptions. Furthermore, because waste heat is radiated directly to space, orbital servers avoid the millions of gallons of water used to cool data centers on the ground. The company plans to scale this technology into a 5-gigawatt orbital data center whose power draw would rival the largest power plants on Earth.
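A rough scale check makes the 5-gigawatt ambition concrete. Using the solar constant above the atmosphere and an assumed cell efficiency (an illustrative figure, not a Starcloud specification), the required array area is:

```python
# Rough scale check on a 5 GW orbital data center.
# Panel efficiency is an assumed illustrative value, not a Starcloud figure.
SOLAR_CONSTANT = 1361.0      # W/m^2 above the atmosphere
panel_efficiency = 0.25      # assumed solar cell efficiency
target_power_w = 5e9         # 5 gigawatts

area_m2 = target_power_w / (SOLAR_CONSTANT * panel_efficiency)
area_km2 = area_m2 / 1e6
print(f"Array area: ~{area_km2:.1f} km^2")
```

Under these assumptions the array spans roughly 15 square kilometres, many times larger than the International Space Station's footprint, which gives a sense of the construction challenge the plan implies.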

The next space race

The success of Starcloud-1 has kicked off a race for orbital dominance in the computing sector. Tech giants are already mobilizing. Reports indicate that Google is developing “Project Suncatcher” to deploy similar capabilities using its TPU chips. As AI models grow larger, the sky is no longer the limit for the infrastructure needed to power them. It is simply the next layer of the stack.


Vyom Ramani

A journalist with a soft spot for tech, games, and things that go beep. While waiting for a delayed metro or rebooting his brain, you’ll find him solving Rubik’s Cubes, bingeing F1, or hunting for the next great snack.
