Robots enter the World Cup, shifting how large-scale events are run and experienced
Updated
April 8, 2026 10:35 AM

Hyundai Motor Company Dealership, Alabama, US. PHOTO: ADOBE STOCK
As the FIFA World Cup 2026 approaches, attention is beginning to shift beyond the matches themselves to how an event of this scale is organised and run. Managing teams, coordinating venues and handling large crowds requires a system that works with precision. This time, robotics is set to become part of that system.
Hyundai Motor Company, a long-time FIFA partner, is expanding its role for the 2026 tournament. Alongside its traditional responsibility of providing vehicles for teams, officials and media, the company will introduce robotics in collaboration with Boston Dynamics. Robots including Atlas and Spot are expected to be deployed at selected venues.
According to the announcement, these systems will be used to support tournament operations while contributing to safety and efficiency. They will also play a role in shaping how fans experience the event, indicating a broader use of technology within the tournament environment. While specific use cases have not been detailed, the inclusion of robotics reflects a growing effort to integrate advanced systems into large-scale public events.
The direction was introduced through the company’s global campaign, “Next Starts Now,” unveiled at the 2026 New York International Auto Show. The campaign is positioned around Hyundai’s wider focus on innovation across mobility and robotics, aligning with its long-standing partnership with FIFA, which now spans more than two decades. As part of the 2026 tournament, the company will also deploy its largest mobility fleet to date, working alongside these newer systems across venues.
Beyond operations, the initiative extends into community engagement. Youth football camps are set to take place across four host locations in the United States—Atlanta, Miami, New Jersey and Los Angeles—targeting children between the ages of six and twelve. A global drawing programme will also invite young fans to submit artwork supporting their national teams, with selected designs to be featured on official team buses during the tournament.
Taken together, the introduction of robotics alongside existing infrastructure points to a gradual shift in how major events are supported. Rather than operating only behind the scenes, technology is becoming more visible within the event itself. How these systems perform in a live, large-scale setting will become clearer once the tournament begins.
AI actor Tilly Norwood releases a musical video arguing that artificial intelligence can expand creativity in film
Updated
April 1, 2026 8:55 AM

AI Actor Tilly Norwood. PHOTO: INSTAGRAM@TILLYNORWOOD
As Hollywood prepares for this weekend’s Oscars, a different kind of performer is stepping into the spotlight — one that doesn’t physically exist.
Tilly Norwood, described as the world’s first AI actor, has released her debut musical comedy video, Take the Lead. The project arrives at a moment when artificial intelligence has become one of the most contentious topics in the film industry.
The message of the song is simple: AI should not be seen as a threat to actors, but as another creative tool. The release also offers a first look at what Norwood’s creators call the “Tillyverse”, envisioned as a cloud-based entertainment world where AI characters can live, interact and perform.
Behind the character is actor and producer Eline van der Velden. She is the CEO of production company Particle6 and AI talent studio Xicoia. Van der Velden created Tilly as a way to experiment with how artificial intelligence could be used in storytelling.
The timing is not accidental. The entertainment industry has spent the past few years debating the role AI should play in filmmaking and acting. Questions about digital replicas, automated performances and creative ownership continue to divide artists and studios.
Norwood’s musical video enters that debate with a different tone. Instead of warning about AI replacing actors, the project suggests that the technology could expand what performers are able to do.
The video itself also serves as a technical experiment. The song Take the Lead was generated using the AI music platform Suno. The video was then produced using a combination of widely available AI tools and Particle6’s own creative process.
One of the newer techniques used in the project is performance capture. Van der Velden physically acted out Tilly’s movements and expressions so the digital character could mirror a human performance. But the production was far from automated. According to Particle6, a team of 18 people worked on the video. The group included a director, editor, production designer, costume designer, comedy writer and creative technologist. In other words, the project still relied heavily on human creativity.
“Tilly has always been a vehicle to test the creative capabilities and boundaries of AI,” van der Velden said. “It’s not about taking anyone’s job.” She added that even with powerful tools, good AI content still takes time, taste and creative direction.
The project also reflects how quickly production technology is evolving. Tools that once required large studios are now accessible to smaller creative teams experimenting with AI-driven storytelling.
For Particle6, the character of Tilly Norwood acts as a testing ground. Each project explores how AI performers might be developed, directed and integrated into entertainment. Whether audiences embrace digital actors remains an open question. Many in the industry are still wary of how AI could reshape creative work.
But projects like Take the Lead show another possibility. Instead of replacing performers, artificial intelligence could become part of the creative process itself. In that sense, Tilly Norwood may represent something more than a virtual performer. She is also an experiment in how humans and machines might collaborate in the future of entertainment.