A new AI model replaces months of simulation with near-instant predictions, changing how spacecraft operations are prepared
Updated
April 24, 2026 10:53 AM

Northrop Grumman's Stargazer serves as the mother ship for the Pegasus, an air-launched orbital rocket. PHOTO: UNSPLASH
Flexcompute, a startup that builds software to simulate real-world physics, is working with Northrop Grumman to change how space missions are prepared. Together, they have developed an AI-based system that can predict how spacecraft respond during critical manoeuvres such as docking—when one spacecraft moves in and connects with another in orbit. These steps have traditionally taken months of preparation.
At the centre of this work is a long-standing problem in space operations. When a spacecraft fires its thrusters, the exhaust plume interacts with nearby surfaces. These interactions can affect movement, temperature and stability. Because these effects are difficult to test in real conditions, engineers have relied on large numbers of computer simulations to estimate outcomes before a mission. That process is slow and resource-intensive.
The new system replaces much of that workflow with a trained AI model. Instead of running millions of simulations, the model learns patterns from physics-based data and can make predictions in seconds. It also provides a measure of uncertainty, which helps engineers understand how reliable those predictions are when making decisions.
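The idea of a trained surrogate that returns both a prediction and an uncertainty estimate can be illustrated with a toy ensemble: several models are fitted to resamples of the same physics-derived data, and the spread of their predictions serves as the uncertainty measure. This is a minimal sketch under assumed, simplified conditions (a one-dimensional linear relation standing in for real plume data), not the actual Flexcompute/Northrop Grumman model.

```python
import random
import statistics

def fit_linear(xs, ys):
    """Least-squares fit y ~ a*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    a = sxy / sxx
    return a, my - a * mx

def train_ensemble(xs, ys, members=20, seed=0):
    """Fit each member on a bootstrap resample of the data."""
    rng = random.Random(seed)
    models = []
    for _ in range(members):
        idx = [rng.randrange(len(xs)) for _ in xs]
        models.append(fit_linear([xs[i] for i in idx], [ys[i] for i in idx]))
    return models

def predict(models, x):
    """Return (mean prediction, uncertainty) at x.

    The standard deviation across ensemble members is the
    uncertainty estimate: it grows where the members disagree,
    e.g. far outside the training data.
    """
    preds = [a * x + b for a, b in models]
    return statistics.mean(preds), statistics.stdev(preds)

# Toy "simulation" data: a thrust-like response with noise.
rng = random.Random(1)
xs = [i / 10 for i in range(50)]
ys = [2.0 * x + 0.5 + rng.gauss(0, 0.1) for x in xs]

models = train_ensemble(xs, ys)
mean, sigma = predict(models, 2.5)  # query inside the training range
```

The key property, as the article describes, is that each prediction is cheap (a few arithmetic operations instead of a full simulation run) and comes with a reliability signal engineers can act on.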
"At Northrop Grumman, we're pioneering physics AI to accelerate design and solve complex simulation and modelling problems like plume impingement—critical for station keeping, rendezvous and space robotics. Simply put: we're pushing the boundaries of advanced space operations", said Fahad Khan, Director of AI Foundations at Northrop Grumman. "Partnering with Flexcompute and NVIDIA, we're accelerating innovation and mission timelines to deliver superior space capabilities for customers at the speed they need".
The system is built using technology from NVIDIA, which provides the computing framework behind the model. Flexcompute has adapted it to handle the specific challenges of spaceflight, including how gases expand and interact in a vacuum. The result is a tool that can simulate complex scenarios much faster while maintaining the level of accuracy needed for mission planning.
By shortening preparation time, the model changes how engineers approach spacecraft design and operations. Faster predictions mean teams can test more scenarios and adjust plans more quickly. It also helps improve fuel use and extend the lifespan of spacecraft.
"Northrop Grumman's confidence reflects what sets Flexcompute apart", said Vera Yang, President and Co-Founder of Flexcompute. "We are able to take the most accurate and scalable physics foundations and evolve them into highly trained, customized Physics AI solutions that engineers can rely on. This work shows how we are transforming the role of simulation, not just speeding it up, but expanding what engineers can confidently solve and how quickly they can act".
The collaboration points to a broader shift in how engineering problems are being handled. Instead of relying only on detailed simulations that take time to run, companies are beginning to use AI systems that can approximate those results quickly while still reflecting the underlying physics.
"The industry's most ambitious space missions now demand a level of speed and precision that traditional engineering cycles can no longer sustain", said Tim Costa, vice president and general manager of computational engineering at NVIDIA. "By integrating NVIDIA PhysicsNeMo, Northrop Grumman and Flexcompute are transforming complex simulations like plume impingement from days of compute into seconds of insight, drastically accelerating the path from mission concept to orbit".
What emerges from this work is a shift in how missions are prepared. When prediction cycles move from months to seconds, testing and decision-making can happen faster. For space operations, where timing and precision are closely linked, that change could reshape how systems are built and run.
Where Hollywood magic meets AI intelligence — Hong Kong becomes the new stage for virtual humans
Updated
February 7, 2026 2:18 PM

William Wong, Chairman and CEO of Digital Domain. PHOTO: YORKE YU
In an era where pixels and intelligence converge, few companies bridge art and science as seamlessly as Digital Domain. Founded three decades ago by visionary filmmaker James Cameron, the company built its name through cinematic wizardry—bringing to life the impossible worlds of Titanic, The Curious Case of Benjamin Button and the Marvel universe. But today, its focus has evolved far beyond Hollywood: Digital Domain is reimagining the future of AI-driven virtual humans—and it’s doing so from right here in Hong Kong.
“AI and visual technology are merging faster than anyone imagined,” says William Wong, Chairman and CEO of Digital Domain. “For us, the question is not whether AI will reshape entertainment—it already has. The question is how we can extend that power into everyday life.”
Though globally recognized for its work on blockbuster films and AAA games, Digital Domain’s story is also deeply connected to Asia. A Hong Kong–listed company, it operates a network of production and research centers across North America, China and India. In 2024, it announced a major milestone—setting up a new R&D hub at Hong Kong Science Park focused on advancing artificial intelligence and virtual human technologies. “Our roots are in visual storytelling, but AI is unlocking a new frontier,” Wong says. “Hong Kong has been very proactive in promoting innovation and research, and with the right partnerships, we see real potential to make this a global R&D base.”
Building on that commitment, the company plans to invest about HK$200 million over five years, assembling a team of more than 40 professionals specializing in computer vision, machine learning and digital production. The team is still growing, with room to expand. "Talent is everything," says Wong. "We want to grow local expertise while bringing in global experience to accelerate the learning curve."
Digital Domain’s latest chapter revolves around one of AI’s most fascinating frontiers: the creation of virtual humans.
These hyperrealistic, AI-powered figures can speak, move and respond in real time. Using the advanced motion-capture and rendering techniques that transformed Hollywood visual effects, the company now builds digital personalities that appear on screens and in physical environments, serving in media, education, retail and even public services.
One of its most visible projects is "Aida", the AI-powered presenter who delivers nightly weather reports on Radio Television Hong Kong (RTHK). Another initiative, now in testing, will soon feature AI-powered concierges greeting travelers at airports, able to communicate in multiple languages and provide real-time personalized services. Similar collaborations are underway in healthcare, customer service and education.
“What’s exciting,” says Wong, “is that our technologies amplify human capability, helping to deliver better experiences, greater efficiency and higher capacity. AI-powered virtual humans can interact naturally, emotionally and in any language. They can help scale creativity and service, not replace it.”
To make that possible, Digital Domain has designed its system for compatibility and flexibility. It can connect to major AI models—from OpenAI and Google to Baidu—and operate across cloud platforms like AWS, Alibaba Cloud and Microsoft Azure. “It’s about openness,” says Wong. “Our clients can choose the AI brain that best fits their business.”
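The "openness" Wong describes is, in architectural terms, a pluggable-backend design: the virtual-human runtime talks to any AI model through a common interface, so the "brain" can be swapped. The sketch below is a hypothetical illustration of that pattern; the class and method names are assumptions, not Digital Domain's actual API.

```python
from typing import Protocol

class ChatBackend(Protocol):
    """Common interface any AI "brain" must satisfy."""
    def reply(self, prompt: str) -> str: ...

class EchoBackend:
    """Stand-in for a real model endpoint (e.g. a hosted LLM)."""
    def reply(self, prompt: str) -> str:
        return f"echo: {prompt}"

class VirtualHuman:
    """The runtime depends only on the interface, not on any
    specific vendor, so backends are interchangeable."""
    def __init__(self, backend: ChatBackend):
        self.backend = backend

    def respond(self, user_text: str) -> str:
        # In a full system, this text would drive speech
        # synthesis and facial animation downstream.
        return self.backend.reply(user_text)

avatar = VirtualHuman(EchoBackend())
out = avatar.respond("hello")
```

Swapping providers then means supplying a different object that implements `reply`, without touching the rendering or interaction layers.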
Establishing a permanent R&D base in Hong Kong marks a turning point for the company—and, in a broader sense, for the city’s technology ecosystem. With the support of the Office for Attracting Strategic Enterprises (OASES) in Hong Kong, Digital Domain hopes to make the city a creative hub where AI meets visual arts. “Hong Kong is the perfect meeting point,” Wong says. “It combines international exposure with a growing innovation ecosystem. We want to make it a hub for creative AI.”
As part of this effort, the company is also collaborating with universities such as the University of Hong Kong, City University of Hong Kong and Hong Kong Baptist University to co-develop new AI solutions and nurture the next generation of engineers. “The goal,” Wong notes, “is not just R&D for the sake of research—but R&D that translates into real-world impact.”

The collaboration with OASES underscores how both the company and the city share a vision for innovation-led growth. As Peter Yan King-shun, Director-General of OASES, notes, the initiative reflects Hong Kong’s growing strength as a global innovation and technology hub. “OASES was set up to attract high-potential enterprises from around the world across key sectors such as AI, data science, and cultural and creative technology,” he says. “Digital Domain’s new R&D center is a strong example of how Hong Kong can combine world-class talent, technology and creativity to drive innovation and global competitiveness.”
Digital Domain’s story mirrors the evolution of Hong Kong’s own innovation landscape—where creativity, technology and global ambition converge. From the big screen to the next generation of intelligent avatars, the company continues to prove that imagination is not bound by borders, but powered by the courage to reinvent what’s possible.