Artificial Intelligence

How a Startup Is Using AI to Cut Space Mission Prep Cycles

A new AI model replaces months of simulation with near-instant predictions, changing how spacecraft operations are prepared

Updated

April 24, 2026 10:53 AM

Northrop Grumman's Stargazer serves as the mother ship for the Pegasus, an air-launched orbital rocket. PHOTO: UNSPLASH

Flexcompute, a startup that builds software to simulate real-world physics, is working with Northrop Grumman to change how space missions are prepared. Together, they have developed an AI-based system that can predict how spacecraft respond during critical manoeuvres such as docking—when one spacecraft moves in and connects with another in orbit. These steps have traditionally taken months of preparation.

At the centre of this work is a long-standing problem in space operations. When a spacecraft fires its thrusters, the exhaust plume interacts with nearby surfaces. These interactions can affect movement, temperature and stability. Because these effects are difficult to test in real conditions, engineers have relied on large volumes of computer simulations to estimate outcomes before a mission. That process is slow and resource-intensive.

The new system replaces much of that workflow with a trained AI model. Instead of running millions of simulations, the model learns patterns from physics-based data and can make predictions in seconds. It also provides a measure of uncertainty, which helps engineers understand how reliable those predictions are when making decisions.
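The surrogate approach described above can be illustrated with a minimal, hypothetical sketch: a small ensemble of models is trained on bootstrap resamples of simulation data, and the spread across the ensemble serves as the uncertainty estimate that accompanies each prediction. The data and model here are illustrative stand-ins, not Flexcompute's actual system.

```python
import random
import statistics

def fit_linear(xs, ys):
    """Least-squares fit of y = a*x + b (a toy stand-in for a trained model)."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
    b = my - a * mx
    return a, b

def ensemble_predict(models, x):
    """Mean prediction plus ensemble spread as an uncertainty measure."""
    preds = [a * x + b for a, b in models]
    return statistics.mean(preds), statistics.stdev(preds)

random.seed(0)
# Synthetic "simulation" data: an invented thrust-response curve with noise.
xs = [i / 10 for i in range(1, 21)]
ys = [2.0 * x + 0.5 + random.gauss(0, 0.1) for x in xs]

# Train each ensemble member on a bootstrap resample of the data.
models = []
for _ in range(20):
    idx = [random.randrange(len(xs)) for _ in range(len(xs))]
    models.append(fit_linear([xs[i] for i in idx], [ys[i] for i in idx]))

mean, sigma = ensemble_predict(models, 1.5)
print(f"prediction: {mean:.2f} +/- {sigma:.2f}")
```

The key design point is that the prediction arrives in milliseconds rather than requiring a fresh simulation run, while the uncertainty value tells engineers how far the query sits from the model's training data.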

"At Northrop Grumman, we're pioneering physics AI to accelerate design and solve complex simulation and modelling problems like plume impingement—critical for station keeping, rendezvous and space robotics. Simply put: we're pushing the boundaries of advanced space operations", said Fahad Khan, Director of AI Foundations at Northrop Grumman. "Partnering with Flexcompute and NVIDIA, we're accelerating innovation and mission timelines to deliver superior space capabilities for customers at the speed they need".

The system is built using technology from NVIDIA, which provides the computing framework behind the model. Flexcompute has adapted it to handle the specific challenges of spaceflight, including how gases expand and interact in a vacuum. The result is a tool that can simulate complex scenarios much faster while maintaining the level of accuracy needed for mission planning.

By shortening preparation time, the model changes how engineers approach spacecraft design and operations. Faster predictions let teams test more scenarios and adjust plans more quickly, and can also improve fuel efficiency and extend the lifespan of spacecraft.

"Northrop Grumman's confidence reflects what sets Flexcompute apart", said Vera Yang, President and Co-Founder of Flexcompute. "We are able to take the most accurate and scalable physics foundations and evolve them into highly trained, customized Physics AI solutions that engineers can rely on. This work shows how we are transforming the role of simulation, not just speeding it up, but expanding what engineers can confidently solve and how quickly they can act".

The collaboration points to a broader shift in how engineering problems are being handled. Instead of relying only on detailed simulations that take time to run, companies are beginning to use AI systems that can approximate those results quickly while still reflecting the underlying physics.

"The industry's most ambitious space missions now demand a level of speed and precision that traditional engineering cycles can no longer sustain", said Tim Costa, vice president and general manager of computational engineering at NVIDIA. "By integrating NVIDIA PhysicsNeMo, Northrop Grumman and Flexcompute are transforming complex simulations like plume impingement from days of compute into seconds of insight, drastically accelerating the path from mission concept to orbit".

What emerges from this work is a shift in how missions are prepared. When prediction cycles move from months to seconds, testing and decision-making can happen faster. For space operations, where timing and precision are closely linked, that change could reshape how systems are built and run.

Keep Reading

Artificial Intelligence

The Startup Building an AI Voice Ring Raises US$23M to Rethink Human–Computer Interaction

A wearable ring, conversational AI and US$23M in funding. Sandbar wants to rethink how we interact with technology

Updated

April 1, 2026 8:55 AM

Sandbar's Stream ring. PHOTO: SANDBAR

Sandbar, a New York–based interface startup, has raised US$23 million in Series A funding to develop a wearable device that lets people interact with artificial intelligence via voice rather than screens.

Adjacent and Kindred Ventures, both venture firms focused on early-stage technology startups, led the round. The investment brings Sandbar’s total funding to US$36 million. Earlier backing included a US$10 million seed round led by True Ventures, a venture capital firm, as well as a US$3 million pre-seed round supported by Upfront Ventures, a venture firm, and Betaworks, a startup studio and investment firm.

Sandbar was founded by Mina Fahmi and Kirak Hong, who previously worked together at CTRL-labs, a neural interface startup acquired by Meta in 2019. Their earlier work explored how computers could respond more directly to human intent — an idea that continues to shape Sandbar’s approach to AI interfaces.

The new funding will help the company expand its team across machine learning, interaction design and software engineering as it prepares to launch its first product. That product, called Stream, combines a wearable ring with a conversational AI interface. The system allows users to speak to an AI assistant without unlocking a phone or opening an app.

The concept is simple. Instead of typing into a screen, users press a button on the ring and talk. The system can capture notes, organize ideas, retrieve information from the web or trigger actions through connected applications.

The ring includes a microphone, a touchpad and subtle haptic feedback. These elements allow the device to respond through gentle vibrations rather than visual alerts. According to the company, the ring only listens when the user presses the button — a design meant to address common concerns around always-on microphones.

That design reflects a larger shift Sandbar believes is underway. As AI assistants become more capable, many startups are experimenting with new ways to interact with them. The focus is moving away from screens and keyboards toward interfaces that feel more natural and immediate.

Stream uses multiple AI models working together to process requests, search the web and structure information in real time. The company says users remain in control of their data and can choose whether to share information with other apps.

Sandbar is also developing a feature called Inner Voice, which responds using a voice customized to the user. The feature will debut during a closed beta planned for this spring, giving the company time to refine how the software behaves in everyday use.

The startup currently employs a team of 15 people, many of whom have worked on well-known consumer devices including the iPhone, Fitbit, Kindle and Vision Pro. Recent hires include Sam Bowen, formerly of Amazon and Fitbit, who joined as vice president of hardware, and Brooke Travis, previously at Equinox, Dior and Gap, who now leads marketing.

Sandbar plans to begin shipping Stream in summer 2026 after completing early testing. As artificial intelligence tools become more integrated into daily life, the company is betting that the next shift in computing will not come from another app — but from new ways for people to interact with AI itself.