Webinar Recap: Unleashing the Power of AI Metadata: Transforming Media Management for Ultimate Visibility

August 5, 2024 · 7 min read

By Sarah L. Cook

Summary

In a recent webinar, Nick Kind and Jason Perr from Perifery were joined by Benjamin Kamp of GOD TV to discuss how AI is revolutionizing media management by enhancing metadata generation and visibility. Key points included:

  • AI Impact: AI's role in media asset management is growing, focusing on improving metadata accuracy and creating robust, searchable content.
  • Metadata Needs: Effective metadata is crucial for finding content. The challenge lies in consistently entering data and addressing tagging inaccuracies.
  • AI Prompt Engineering: Proper AI prompt design is vital for generating high-quality metadata. Detailed, well-constructed prompts are essential, and the field of AI Prompt Engineering is increasingly important.
  • Training vs. Prompting: Training AI models is necessary for specific tasks, but combining models and lightweight fine-tuning can yield powerful results even with small data sets.
  • MAM Integration: Perifery’s AI tools can integrate with MAM systems to enhance metadata visualization. However, intelligent object-based storage can sometimes remove the need for a MAM by directly managing and searching content.
  • Future Outlook: AI will increasingly transform storage solutions into comprehensive management systems, simplifying content search and visibility.

For more information, check out Perifery’s AI+ data sheet or watch the webinar on-demand.


At the end of July, Nick Kind, Perifery Sales Manager for AI, hosted a webinar featuring Jason Perr, Perifery CTO of M&E Solutions, and Benjamin Kamp, Strategic Advisor and CTO of GOD TV. The panel provided an insightful look into the future of media management, exploring how AI can supercharge metadata and provide ultimate visibility into an organization’s assets.

AI is Changing the World

Setting the stage, Nick discussed how AI is changing the world, with every industry testing and applying it in a variety of ways. Shifting focus to M&E, Nick and his guests examined the role of AI in media asset management, specifically metadata generation, and how AI can be used to create more robust and accurate metadata.

When asked about the critical need for metadata in today's M&E landscape, Jason replied that, “It basically comes down to enough metadata to be able to actually find your content. And I think the challenge that we've seen over the years is that there's been a lot of talk of why that metadata isn't there.” He added, “The biggest challenge is just getting people to actually enter the data first.” 

Benjamin explained that there's never enough metadata, and it's often wrong. He highlighted the challenge of tagging, noting that inconsistency makes human tagging less than helpful. The panel agreed on the need for consistency, accuracy, and a complete taxonomy in metadata. Discussing how to address these challenges, Jason suggested shifting the tagging process from manual entry to AI-powered automation, along the lines of the sketch below.
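As a concrete illustration of that shift (a minimal sketch, not Perifery's actual pipeline), here is what AI-assisted tagging can look like in Python using a generic Hugging Face image-classification model. The model name and confidence threshold are assumptions chosen for the example:

```python
# Minimal auto-tagging sketch (illustrative only, not Perifery's pipeline).
# The model and confidence threshold below are assumptions for the example.
from transformers import pipeline

classifier = pipeline("image-classification", model="google/vit-base-patch16-224")

def auto_tag(image_path: str, min_score: float = 0.3) -> list[str]:
    """Generate consistent, machine-made tags for a single frame."""
    predictions = classifier(image_path)  # [{"label": ..., "score": ...}, ...]
    # Keep only confident labels so the resulting tag set stays consistent.
    return [p["label"] for p in predictions if p["score"] >= min_score]

print(auto_tag("frame_0042.jpg"))  # e.g. ["sports car", "racer"]
```

Because the same model applies the same labels every time, this kind of automation addresses the consistency problem the panel raised, even before accuracy is tuned.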

The Importance of AI Prompt Engineers and Detailed Prompts

While concerns about job displacement due to AI are valid, Jason noted that the human role in designing prompts and configuring systems remains crucial. Organizations must work with the AI, choosing the right models and systems, to ensure they get high-quality metadata every time.

Nick pointed out the growing demand for AI Prompt Engineers and the challenges they face. Jason noted that many different prompts are needed to generate the necessary metadata, and that it is far more complex than flipping a switch. The AI Prompt Engineer needs to develop the right questions to help the AI understand what is relevant for the task and the organization’s needs.

Benjamin agreed, emphasizing that poorly constructed prompts can reproduce the same issues humans create. The key is to use detailed prompts that give the AI specific instructions and define exactly what to focus on. He added that the value of Prompt Engineering is underrated: understanding your goals and purpose is crucial to guiding AI correctly, even if that means asking the AI to propose categorizations or a tag taxonomy. Although AI can help with that, you need to push it into the right space.

Surprisingly, an effective AI prompt can run to thousands of words. Writing AI prompts requires extensive knowledge, a command of the language, and careful thought to be successful. This is why many generic AI products don’t meet users’ needs right off the shelf: the AI must have context about the organization, and that context has to be provided in a clear and understandable way.
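To make that concrete, here is a hedged sketch of what a detailed metadata prompt can look like. The taxonomy, rules, and output fields are invented for illustration; a production prompt would embed an organization's full taxonomy, definitions, and worked examples, which is how prompts reach thousands of words:

```python
# Illustrative prompt builder. The taxonomy, rules, and output fields are
# invented examples, not a real Perifery prompt.
TAXONOMY = ["interview", "b-roll", "live event", "promo", "archive"]

def build_metadata_prompt(transcript_excerpt: str) -> str:
    return (
        "You are tagging assets for a broadcast media library.\n"
        f"Allowed categories (use ONLY these): {', '.join(TAXONOMY)}.\n"
        "Rules:\n"
        "- Name people only when they are identified on screen or in audio.\n"
        "- Prefer specific locations (a city) over generic ones (outdoors).\n"
        "- Return JSON with keys: category, people, locations, summary.\n\n"
        f"Transcript excerpt:\n{transcript_excerpt}"
    )

print(build_metadata_prompt("...and joining me now from Nairobi is Dr. Jane Doe..."))
```

Constraining the AI to a fixed taxonomy and output format is exactly the kind of specificity the panel argued separates useful metadata from noise.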

Training AI vs. Prompting AI

Jason explained that the amount of training needed depends on the type of model. For tasks like media library management, an out-of-the-box model can provide a lot of value. For more specialized domains that demand tighter control over behavior and language, you need to train the model to understand your industry segment and terminology.

Benjamin added that GOD TV has moved toward training and fine-tuning models to achieve the level of content understanding and categorization needed to guide users through different stages of their journey.

Jason highlighted growing interest in mixing models or experts: multiple AI models and agents answer the same question, and a commanding model or agent assesses the responses and determines the final answer. This approach can vary with the complexity of the content. Tasks such as object and facial recognition require some level of training, but powerful results can still be achieved with small data sets (e.g., 20 images). He noted that even a short fine-tuning exercise (often under 20 minutes) can reduce the need for GPU power and extensive data sets.
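A rough sketch of that commanding-model pattern might look like the following. The ask_* functions are hypothetical stand-ins for real model calls, and a simple majority vote stands in for the arbiter Jason describes:

```python
# Mixture-of-models sketch: several models answer the same question and a
# commanding step picks the final answer. All functions are stand-ins.
from collections import Counter

def ask_object_model(frame: str) -> str:
    return "car"    # stand-in for an object-detection result

def ask_vision_model(frame: str) -> str:
    return "car"    # stand-in for a vision-language model's answer

def ask_caption_model(frame: str) -> str:
    return "truck"  # stand-in for a captioning model's answer

def commanding_model(frame: str) -> str:
    answers = [ask_object_model(frame), ask_vision_model(frame), ask_caption_model(frame)]
    # Simple arbiter: majority vote. In practice the commanding model would
    # itself be an LLM that weighs and reconciles the candidate answers.
    return Counter(answers).most_common(1)[0][0]

print(commanding_model("frame_0042.jpg"))  # -> "car"
```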

He added that mixing object recognition with a vision model can achieve impressive results. Rather than training on every car type, an object model can detect any car in a given stretch of footage, with a vision model determining whether the recognized cars are relevant. Using models like GLM (an open-source, 9-billion-parameter vision model that you can run locally), organizations can answer such questions accurately. Ultimately, a mixture of models can revolutionize the process.
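That division of labor could be sketched like this. detect_cars() and vlm_is_relevant() are hypothetical wrappers, with a locally run open vision model (such as the 9-billion-parameter GLM model Jason mentions) assumed to sit behind the second:

```python
# Object model + vision model mix (illustrative stubs, not a real integration).
def detect_cars(frame: str) -> list[tuple[int, int, int, int]]:
    """Generic detector: flag every car as a bounding box."""
    return [(40, 60, 200, 180)]  # stand-in detection

def vlm_is_relevant(frame: str, box: tuple, question: str) -> bool:
    """A vision-language model judges whether a detection matters."""
    return True  # stand-in for a locally run vision model's judgment

def relevant_cars(frame: str, question: str = "Is this the featured car?"):
    return [box for box in detect_cars(frame) if vlm_is_relevant(frame, box, question)]

print(relevant_cars("frame_0042.jpg"))
```

The design choice here is that the cheap, generic detector does the broad pass, while the heavier vision model is only consulted on candidate detections.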

How Do AI Tools Integrate with Existing MAMs?

The conversation shifted to integrations with MAM systems. Jason explained that today’s MAMs are a great place to visualize the data, while Perifery’s AI-enhanced tools generate rich metadata for ingested assets. Using a notification API, third-party systems can be automatically notified about any metadata that has been generated. When used with MAM systems, Perifery’s AI tools can create powerful user experiences, with metadata displayed from transcriptions, object recognition, and facial recognition. These tools are MAM-agnostic, with many integrations either underway or complete.
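As an illustration of how a downstream system could consume that notification flow, here is a minimal Flask receiver. The endpoint path and payload fields are assumptions, since the webinar does not describe the API's actual schema:

```python
# Hypothetical receiver for a metadata notification webhook. The route and
# payload fields are assumptions for illustration, not Perifery's schema.
from flask import Flask, request

app = Flask(__name__)

@app.post("/metadata-notifications")
def on_metadata():
    event = request.get_json()
    asset_id = event.get("asset_id")
    metadata = event.get("metadata", {})
    # e.g. push new transcription / recognition metadata into the MAM's index
    print(f"Asset {asset_id}: {len(metadata)} new metadata fields")
    return {"status": "ok"}

if __name__ == "__main__":
    app.run(port=8080)
```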

Do You Really Need a MAM?

Jason acknowledged that while some organizations require a MAM, intelligent object-based storage can often remove the need for one. Perifery’s intelligent media-focused object storage is designed to understand your content, storing any metadata on the content itself. 
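To illustrate what "metadata stored on the content itself" can look like, here is a sketch using the S3-compatible API that media object stores commonly expose. The endpoint, bucket, keys, and fields are invented examples, not Perifery-specific calls:

```python
# "Metadata on the content itself," sketched with boto3 against a generic
# S3-compatible endpoint. Endpoint, bucket, keys, and fields are examples.
import boto3

s3 = boto3.client("s3", endpoint_url="https://storage.example.com")

with open("interview_0042.mov", "rb") as body:
    s3.put_object(
        Bucket="media-library",
        Key="raw/interview_0042.mov",
        Body=body,
        Metadata={  # travels with the object; searchable without a separate MAM
            "category": "interview",
            "people": "jane-doe,john-smith",
            "transcription-status": "complete",
        },
    )
```

Because the tags live on the object rather than in a separate database, search and visibility come from the storage layer itself, which is the point Jason makes next.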

Jason said, “People get these big MAM systems, they get it all deployed and all configured, they spend all this money and all this time. And when you talk to them, it's like, ‘So, why do you have this big, complicated, expensive MAM system in your small or medium organization?’ And so, ‘Well, it's all about finding content. We don't know what content we have. We needed a way to search for content.’ And to me, my response has always been like, it's like selling someone or giving someone this gigantic tool set, but all they need is the wrench.”

The capabilities in Perifery’s object storage are why its AI team chose to integrate into the storage layer itself. This approach provides organizations with complete visibility and the ability to search across all content and metadata without the complexity of a MAM. 

What Does the Future Hold?

According to Benjamin, AI’s ability to extract information from stored assets will transform storage solutions. He predicts that your storage will become your MAM, making it more workflow-centric. Jason added that while MAM systems can add value in production and distribution automation, many organizations are more concerned about managing large content libraries. He noted that the focus will be on having relevant tools to find content, opining that AI-integrated storage will provide a simple interface to search and view content.

Referencing the Perifery Intelligent Content Engine (ICE), Jason said he would share more about ICE at a later date, but that it would present an interface where you can simply ask for what you need. Nick asked if it was akin to “an advanced file system search that actually talks back to you,” and Jason replied affirmatively: “Yeah, exactly, exactly.”

Want to Learn More? To find out how Perifery is supporting the media industry with AI-powered tools, download our AI+ data sheet. If you missed the live webinar, you can watch it on-demand now.
