Hong Kong

Mitico Pilots Community-Based Livestreaming at World of Dance Hong Kong

A Hong Kong pilot explores how creator-led distribution could reshape livestreaming for global competitions

Updated

April 8, 2026 5:28 PM

A dance crew performs in sync on stage at World of Dance under spotlights. PHOTO: WORLD OF DANCE HONG KONG

On January 22, 2026, World of Dance Hong Kong became the first global event to pilot Mitico’s community-based livestreaming model. The idea is simple: rethink how live competitions are shared in a digital-first world.

Instead of relying on a single official broadcast channel, the event was produced as one centralised live feed, which was then distributed across multiple creators and influencers, each hosting the stream for their own audience.

This gave creators room to add their own commentary, adapt the language and bring in cultural context that suited their communities, while the production remained consistent behind the scenes.  

“Dance is a universal language”, said David Gonzalez, President of World of Dance. “Our collaboration with Mitico to produce an international, creator-led livestream in Hong Kong allowed a regional competition to reach a global audience. With personalised commentary from hosts in different languages, we can begin to see how regional events may connect through global communities”. This approach points to a shift away from traditional broadcaster-led distribution and toward creator-led amplification.

A dance crew performs on stage as the audience watches. PHOTO: WORLD OF DANCE HONG KONG

Mitico’s approach begins with a familiar industry challenge: the high cost of production and licensing, which often makes it difficult to livestream cultural and sports events at scale.  

“Many cultural and sports competitions are never livestreamed because traditional broadcasting is too costly and complex”, said Chengcheng Li, Founder of Mitico. “By distributing a centralised production feed through creators and community hosts, regional events can reach global audiences while maintaining a unified production workflow”.

World of Dance (WOD) offered a natural test environment. It started as a global dance competition platform before entering a television partnership with NBC, which produced four seasons of the World of Dance reality series. While the television programme concluded in 2021, the competition business has continued to expand through an international network of partners. Today, World of Dance competitions are represented in more than 72 countries, producing nearly 100 events each year, with a digital audience of more than 34 million followers across platforms.

Despite that scale, many competitions are not livestreamed due to the high production costs and technical demands associated with traditional broadcasting. The Hong Kong event was selected to assess whether a community-led distribution model could offer a more scalable alternative for live coverage.

While no changes to World of Dance’s broader distribution strategy have been announced, the Hong Kong pilot offers an early indication of how global competitions may rethink livestreaming in an increasingly creator-driven media environment.

Artificial Intelligence

HTC VIVERSE and World Labs Partner to Turn AI-Generated 3D Worlds Into Interactive Experiences

The focus is no longer just AI-generated worlds, but how those worlds become structured digital products

Updated

March 17, 2026 1:01 AM

The inside of a pair of HTC VR goggles. PHOTO: UNSPLASH

As AI tools improve, creating 3D content is becoming faster and easier. However, building that content into interactive experiences still requires time, structure and technical work. That gap between generation and execution is where HTC VIVERSE and World Labs are focusing their new collaboration.

HTC VIVERSE is a 3D content platform developed by HTC. It provides creators with tools to build, refine and publish interactive virtual environments. Meanwhile, World Labs is an AI startup founded by researcher Fei-Fei Li and a team of machine learning specialists. The company recently introduced Marble, a tool that generates full 3D environments from simple text, image or video prompts.

While Marble can quickly create a digital world, that world on its own is not yet a finished experience. It still needs structure, navigation and interaction. This is where VIVERSE fits in. By combining Marble’s world generation with VIVERSE’s building tools, creators can move from an AI-generated scene to a usable, interactive product.

In practice, the workflow has two steps. First, Marble produces the base 3D environment. Then, creators bring that environment into VIVERSE, where they add game mechanics, scenes and interactive elements. In this model, AI handles the early visual creation, while the human creator defines how users explore and interact with the world.

To demonstrate this process, the companies developed three example projects. Whiskerhill turns a Marble-generated world into a simple quest-based experience. Whiskerport connects multiple AI-generated scenes into a multi-level environment that users navigate through portals. Clockwork Conspiracy, built by VIVERSE, uses Marble’s generation system to create a more structured, multi-scene game. These projects are not just demos. They serve as proof that AI-generated worlds can evolve beyond static visuals and become interactive environments.

This matters because generative AI is often judged by how quickly it produces content. However, speed alone does not create usable products. Digital experiences still require sequencing, design decisions and user interaction. As a result, the real challenge is not generation but integration: connecting AI output to tools that make it functional.

Seen in this context, the collaboration is less about a single product and more about workflow. VIVERSE provides a system that allows AI-generated environments to be edited and structured. World Labs provides the engine that creates those environments in the first place. Together, they are testing whether AI can fit directly into a full production pipeline rather than remain a standalone tool.

Ultimately, the collaboration reflects a broader change in creative technology. AI is no longer only producing isolated assets. It is beginning to plug into the larger process of building complete experiences. The key question is no longer how quickly a world can be generated, but how easily that world can be turned into something people can actually use and explore.