
StreamingMedia.com Industry Announcements


StreamingMedia.com provides this section as a service to its readers and customers.


Press releases are subject to approval by the editorial staff of StreamingMedia.com and may be edited or altered for length and clarity, or to remove unsubstantiated and unverifiable claims.

All content presented within the press release section is that of the submitter. StreamingMedia.com does not necessarily endorse such content and bears no responsibility or liability for its accuracy.

Gcore Launches Advanced AI Solution for Real-Time Online Content Moderation and Compliance

Gcore launches AI-based video content moderation solution that combines computer vision, optical character recognition, and speech recognition to protect viewers and brands.

London (28 June 2024)

Gcore, the global edge AI, cloud, network, and security solutions provider, today launched Gcore AI Content Moderation, a real-time solution that enables online service providers to automate the moderation of audio, text, and user-generated video content without needing prior artificial intelligence (AI) or machine learning (ML) experience. The solution helps organisations improve user safety and comply with regulations such as the EU’s Digital Services Act (DSA) and the UK’s Online Safety Act (OSA).

Any platform that hosts user-generated content (UGC), from comments to long-form video, or content that can be accessed by children must moderate UGC to ensure its viewers are protected from offensive, violent, illegal, or age-inappropriate content. Social media, gaming, ecommerce, education, and digital advertising are just some of the industries in which organisations have legal obligations to do so. Exposing users to harmful content can lead to reputational damage, legal investigation, service suspension, an operating ban, and significant fines (up to 6% of global revenue under the DSA).

Real-time AI content moderation

The dramatic growth in UGC means human moderation cannot keep pace with the challenge of identifying harmful and illegal content. High volumes overwhelm moderators, costs become prohibitive, or operational inefficiencies result in missed violations and delayed publishing of legitimate content.

 

Gcore AI Content Moderation automates the review of video content streams, flagging inappropriate content and alerting human moderators where necessary. The solution integrates cutting-edge technologies to begin reviewing videos or live streams within seconds of their publication:

  • State-of-the-art computer vision: Advanced computer vision models are used for object detection, segmentation, and classification to identify and flag inappropriate visual content accurately.
  • Optical character recognition (OCR): Text visible in videos is converted into machine-readable format, enabling the moderation of inappropriate or sensitive textual information displayed within video content.
  • Speech recognition: Sophisticated algorithms analyse audio tracks to detect and flag foul language or hateful speech, thereby moderating audio as effectively as visual content.
  • Multiple-model output aggregation: Complex decisions, such as identifying content involving child exploitation, require the integration of multiple data points and outputs from different models. Gcore AI Content Moderation aggregates these outputs to make precise and reliable moderation decisions, as sketched below.
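
To make the aggregation step concrete, the following is a minimal, purely illustrative sketch of how outputs from several models might be combined into a single flag-or-pass decision. The model names, categories, thresholds, and data structures are hypothetical and do not reflect Gcore’s actual implementation.

```python
# Hypothetical sketch of multiple-model output aggregation.
# Model names, categories, and thresholds are illustrative only.

from dataclasses import dataclass


@dataclass
class ModelOutput:
    model: str          # e.g. "computer_vision", "ocr", "speech_recognition"
    category: str       # e.g. "violence", "hate_speech"
    confidence: float   # score in [0.0, 1.0]


def aggregate(outputs: list[ModelOutput], threshold: float = 0.8) -> dict:
    """Combine per-model scores into a single flag-or-pass decision."""
    # Keep the highest confidence seen for each category across all models.
    per_category: dict[str, float] = {}
    for out in outputs:
        per_category[out.category] = max(per_category.get(out.category, 0.0), out.confidence)

    flagged = {cat: score for cat, score in per_category.items() if score >= threshold}
    return {
        "flagged": bool(flagged),   # True -> route to a human moderator
        "categories": flagged,      # which categories triggered the flag
    }


# Example: vision, speech, and OCR models each report possible violations.
decision = aggregate([
    ModelOutput("computer_vision", "violence", 0.91),
    ModelOutput("speech_recognition", "hate_speech", 0.65),
    ModelOutput("ocr", "hate_speech", 0.88),
])
print(decision)  # {'flagged': True, 'categories': {'violence': 0.91, 'hate_speech': 0.88}}
```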

Organisations can quickly integrate Gcore AI Content Moderation into their existing infrastructure through an API without the need for prior AI or ML experience. AI Content Moderation runs on Gcore’s global network of 180+ edge points of presence with a capacity exceeding 200 Tbps, ensuring low latency, speed, and resilience for customers.
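
As an illustration of what API-based integration can look like, the sketch below submits a stream for moderation over HTTP and registers a webhook for flagged content. The endpoint path, request fields, and category names are hypothetical placeholders, not Gcore’s published API; refer to Gcore’s documentation for the actual interface.

```python
# Hypothetical example of submitting a live stream for AI moderation over HTTP.
# The base URL, endpoint path, payload fields, and categories are placeholders.

import requests

API_TOKEN = "YOUR_API_TOKEN"          # placeholder credential
BASE_URL = "https://api.example.com"  # placeholder base URL, not Gcore's real endpoint


def submit_stream_for_moderation(stream_url: str, callback_url: str) -> dict:
    """Ask the moderation service to watch a stream and POST flags to a webhook."""
    response = requests.post(
        f"{BASE_URL}/ai/content-moderation/tasks",   # hypothetical path
        headers={"Authorization": f"Bearer {API_TOKEN}"},
        json={
            "stream_url": stream_url,                      # the live stream or video to review
            "checks": ["nsfw", "violence", "profanity"],   # illustrative category names
            "callback_url": callback_url,                  # where flagged segments are reported
        },
        timeout=30,
    )
    response.raise_for_status()
    return response.json()


# Example usage: moderation results would arrive at the webhook as content is analysed.
task = submit_stream_for_moderation(
    "https://streams.example.com/live/12345.m3u8",
    "https://yourservice.example.com/moderation-webhook",
)
print(task)
```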

“Human-only content moderation has become an unmanageable task — it is simply impossible to assign enough human resources and keep costs in check,” commented Alexey Petrovskikh, Head of Video Streaming at Gcore. “Gcore AI Content Moderation gives organisations the power to moderate content at a global scale, while keeping humans in the loop to review flagged content. This new service plays an essential role in enabling companies to protect their users, communities, and their own reputation — all while ensuring regulatory compliance.”