MIT AI for Filmmaking Hackathon 2024: A Leap Forward in Creative Innovation

Jade Chongsathapornpong

What kind of space do you call home? Is it defined by the décor and furniture that make living comfortable, the cherished memories with family, the environmental and cultural symbols, or simply a sense of belonging and peace? The short films created during the second MIT AI for Filmmaking Hackathon offer insights into this question. This hackathon, held at the MIT Media Lab on February 17–18, showcased and celebrated the latest advancements in generative AI tools, including image, video, 3D content, music, and voiceover generation.

Compared to the eight films produced in its first iteration in 2023, the MIT AI for Filmmaking Hackathon 2024 saw a substantial increase in both scale and impact, producing 66 films. The event attracted approximately 400 participants and volunteers from local institutions such as MIT, Harvard University, and Berklee College of Music, in addition to those traveling from New York City, Los Angeles, and international locations like the UK, Canada, and China. Over the course of two days, the hackathon featured a filmmaking competition, a series of software tutorials, and panel discussions with around 40 panelists, including tech industry leaders, film directors, AI researchers, and legal experts.

The hackathon also granted participants complimentary access to the technologies of almost every leading startup in the field, such as Pika, PixVerse, Meshy, Luma, Nolibox, Air3, MyShell, and Playbook.

Furthermore, additional sponsors like Tripo AI, Runway ML, DeepMotion, ElevenLabs, and Artflow contributed to the event by providing free access to their products, which allowed participants to explore their ideas in diverse ways.

“Coming from a background in the traditional film industry, I was thrilled to explore how the latest AI tools push the boundaries of filmmaking,” remarked Nicolas Lin Li, a fellow in East Asian Languages and Civilizations at Harvard University, whose work was featured at the 68th Cannes Film Festival. He noted, “This project unveiled the immense possibilities that AI holds to revolutionize our creative processes.” During the hackathon, 66 short films, each lasting between two and five minutes, were produced. The audience was struck by the breathtaking visuals, compelling music, and convincing voiceovers, all created with just a few keystrokes. "As the lights dimmed and the films illuminated the screen,” a participant noted, “I was deeply moved by how AI enabled us to transform the stories in our minds into reality.” Thanks to the diligent efforts of 20 judges, prizes were awarded across both traditional film festival categories and those specific to generative AI technology.

The Most Creative Use of AI award was bestowed upon "FORMER GARDEN," which leveraged 3D scanning technologies to recreate the protagonist's childhood hometown. The imperfections of the AI reconstruction were cleverly turned to its advantage: the rough, incomplete depictions with gaps and voids mirrored the elusive and fragmented memories of the elderly protagonist. Meanwhile, the Best Film, "Space I Call Home: A Nostalgic Astronaut," captured everyone's hearts with its poignant narrative; it narrates the story of an astronaut facing the perils of Mars, who finds solace and courage in recollections of his mother's love and guidance during his childhood. This faith and memory of home provide him with the strength to persevere.

The award for Best Video Generation was presented to "Tale of Lipu Village," set in a village populated by extraordinary animals and plants, such as flowers shaped like fried eggs, raspberries containing diamonds, and sheep that grow broccoli. The protagonist, Lipu Bear, discovers that being "abnormal" is actually a form of uniqueness. For example, there's a sheep that prefers to produce broccoli instead of wool, and an octopus that opts to become a chef due to its many tentacles. What was once considered a weakness in AI-generated visuals—the blending of seemingly unrelated concepts—was transformed into a strength by the filmmakers, showcasing creativity and originality.

"Nothing But Emptiness” received a nomination for Best Video Generation. This ghost story is infused with humanity's obsession with life and death, seeking ultimate transcendence through memories on the verge of fading. Presented in the form of a memory record, the film adopts a minimalist structural aesthetic influenced by Margiela, Bauhaus, and, surprisingly, Kanye West. In almost pure whiteness, a female scientist’s life is narrated in the space of a ghost town, illustrating AI tools’ boundless potential for symbolization.

The Best Voiceover Award was given to "Earth, Home, Turkey Farm," which uses satire to depict a scenario where alien tourists visit an Earth farm where humans are raised. The film serves as a cautionary tale, suggesting that if we fail to protect our environment and become too engrossed in entertainment, we may find ourselves overtaken by aliens, with our perceived happiness being merely an illusion. Remarkably, the AI-generated voiceover impressively mimics a tour guide's tone, complete with a touch of sarcasm. Additionally, "HOME AGAIN" presents a moving narrative in which several robots endeavor to assist an elderly man with Alzheimer's in recalling his home, with the rich and expressive voiceover distinctly characterizing each robot.

The Best 3D Generation Award went to "Fish Tank," a film that uses a continuous shot to meticulously pan across a collection of furniture, toys, and decorations reminiscent of childhood, effectively capturing the true essence of home. It highlights the advantage of employing generated 3D assets to expedite the 3D modeling process. Meanwhile, "Secret Garden" clinched the Best XR Film Award by employing 3D generation technology to meticulously create a virtual garden. Set within this captivating VR environment, the narrative unfolds as three characters come forward to share their unique stories, exemplifying the creative integration of extended reality in the realm of storytelling.

The Best Music Award was given to "NESTLED UNIVERSES," which also received a nomination for Best Video Generation. This film weaves together the diverse stories of individuals across the cosmos who find belonging and solace in the most unexpected places—from a verdant microgravity garden aboard a space station to the warm, welcoming glow of a cozy house on Earth, where the unconditional love of a loyal dog turns any space into a sanctuary. The AI-generated music pieces significantly enhance the film's atmosphere, enriching the emotional depth of each story.

The hackathon also invested significant effort in educating participants on the use of generative AI tools and their integration into the production pipeline. On February 17, a series of workshops and tutorials were conducted to facilitate learning among participants. Yetong Xin and Candice Wu of Harvard University led tutorials on Blender and Unreal Engine, respectively. A series of workshops were presented by the sponsor companies: Playbook co-founder Jean-Daniel LeRoy, Meshy founder and MIT alumnus Ethan Yuanming Hu, and Jessie Ma from Pika Labs. AI artist Dave Clark discussed his unique journey in AI filmmaking. Additionally, Nix Liu Xin, Jiajian Min, and Anna Borou Yu, winners of the first AI for Filmmaking Hackathon, shared their invaluable experiences and the story behind the 2023 Best Overall Film, “DOG: Dream of Galaxy.”

The hackathon was distinguished by a diverse lineup of exceptional speakers from various fields, emphasizing the interdisciplinary nature of this domain. The event launched with insights from Pat Pataranutaporn, a PhD candidate at the MIT Media Lab, and Prof. Qian Liu, Executive Dean of the Digital Media School at the Beijing Film Academy. They, along with a host of other esteemed speakers, delved into how AI technology can inspire, challenge, and expand people's thinking and creativity.

The academic panel, moderated by MIT Media Lab researcher Ruihan Zhang, included a distinguished lineup: MIT Assistant Professor Vincent Sitzmann, Lu Jiang from ByteDance (author of VideoPoet), Meta researcher Rohit Girdhar (author of Emu Video), Assistant Professor James Tompkin from Brown University, and Kshitiz Garg, Senior Manager at Adobe. The discussion kicked off with their insights on Sora, OpenAI's latest video generation tool. They dove into the challenge academia faces in carving out a unique niche amidst the powerful computing resources and extensive data available in the industry. The panelists reached a consensus that many pivotal algorithms and models have their roots in academic research. The conversation culminated in a discussion on the types of research academia should pursue, highlighting the sector's unique position to undertake risky yet fundamentally important projects that promise long-term benefits.

The industry panel, also moderated by MIT Media Lab researcher Ruihan Zhang, featured notable figures: Hossein Taghavi, Senior Manager at Netflix; Dominic Laflamme, VP at Unity; and Deepti Ghadiyaram, researcher at Runway ML. They began by addressing a common concern among hackathon participants regarding the current generation of AI tools: the lack of consistency. This encompasses issues like character identity, physical object stability, and scene coherence—elements crucial to storytelling and narration. The panelists underscored resolving these consistency issues as a critical step towards the successful productization of genAI. The conversation then explored how AI technologies, particularly large language models, could revolutionize user interfaces in the near future. They envisioned a scenario where users employ text prompts to pinpoint concepts of interest (e.g., hair), subsequently utilizing a scale bar to modify attributes like hair length—a glimpse into the potential user-centric advancements driven by AI. Finally, they tackled the legal intricacies related to training data and intellectual property, discussing the implications for the industry.

The startup panel, also moderated by MIT Media Lab researcher Ruihan Zhang, featured Jessie Ma, Director of Artist Relations at Pika; Yuanming Hu, Founder and CEO of Meshy; Yachen Song, Founder and CEO of Vast; Zengyi Qin, Co-founder and AI Tech Lead at MyShell; and Matt Tancik, Researcher at Luma AI. With the recent unveiling of Sora and its impressive performance, audience interest was piqued regarding its implications for the startup landscape. The founders viewed the emergence of Sora not as a threat but as a positive indicator of the sector's growth, highlighting the increased attention and investment it brings to the industry. Discussions also covered strategies for navigating the future of this field. Some panelists advocated for building user-sharing communities as a key approach for user retention. Others pointed to the development of advanced, standalone user interfaces, suggesting that such tools, beyond mere text-based platforms like Discord, could greatly streamline the content creation process.

The legal panel, moderated by MIT undergrad Isabella Yu, featured Dr. Anna Gibson from MIT’s department of Comparative Media Studies, Professor Ari Lipsitz from Boston University School of Law, and Dr. Tom Zick from Harvard’s Berkman Klein Center. The panelists discussed the most pertinent topics involving AI, ethics, and the law today, including the burgeoning energy consumption of AI models, whether creators could be compensated for work used in training generative AI models, and the New York Times lawsuit against OpenAI. Although the panelists saw a future where AI technology helps artists reach new creative heights, they emphasized the importance of ethical protection of, and empathy towards, the human creators whose work generative AI models are trained on.

The Extended Reality (XR) panel showcased esteemed experts including Maya Georgieva, Senior Director of The New School’s Innovation Center, XR, AI, and Quantum Labs; Rashin Fahandej, Assistant Professor of Immersive and Interactive Media at Emerson College; Nix Liu Xin, Founder of Play.Work; and Baifan Tao, Founder of Air3. They recounted their personal journeys within the virtual reality sphere. The panelists highlighted how genAI could significantly streamline the development of XR content, making it more accessible and convenient for creators to bring their visions to life in immersive environments.

The film director panel, moderated by Jiajian Min (co-founder of MYStudio), included distinguished members such as Rob Minkoff, renowned for directing Disney’s The Lion King; Ben Relles, the former Head of Comedy at YouTube; Dan Sickles, the founder of dPop Studios; and Souki Mansoor, who established the consultancy Bell & Whistle. The panelists shared their visions of how AI might revolutionize the film industry, facilitating directors in realizing their creative visions more effectively. Additionally, they highlighted pressing concerns such as copyright that warrant careful consideration.

The success of this hackathon is greatly attributed to the dedication and efforts of the organizing team, with volunteers from both MIT and Harvard playing a crucial role. This event was a collaborative effort involving the MIT Filmmakers Association, MIT GSC Fund, MIT Art Center, MIT CSSA, and the MIT Media Lab. Special thanks to Pattie Maes, professor at the MIT Media Lab, Jiajian Min, co-Founder and director of MYStudio, and Jean-Peic Chou, a computer science student from Stanford University, for their tremendous efforts.

The MIT AI for Filmmaking Hackathon has garnered significant attention by facilitating connections among computer science researchers, artists, tool development startups, and venture capitalists. "We are delighted to witness participants from diverse fields building connections," remarked Ruihan Zhang, the hackathon's chief organizer and a PhD candidate at the MIT Media Lab. "With Sora on the horizon, I see it more as an opportunity than a challenge for the traditional content creation industry. It's the creativity behind the script and film that continues to be the most compelling aspect, far outweighing the technology itself."

All films made during the hackathon are available for viewing on the YouTube channel of the MIT Film Makers Association.