The internet now connects over 4 billion of us, which in theory gives each of us the ability to stream live footage instantaneously to anywhere in the world. The barrier to entry for streaming video online is now so low that the problem we face is no longer one of connectivity, but the colossal task of monitoring this ever-growing mass of new video content. Whether it is online streaming or video surveillance footage, how do we police everything that gets uploaded to ensure that anything unlawful or inappropriate is detected?
Cogisen wants to detect actions in video automatically. Rather than having humans manually police hours and hours of video content on the lookout for inappropriate actions, it would be far more efficient to be alerted the instant something sinister occurs.
To make this a reality, Cogisen's technology first needs to know what types of actions should be monitored. It is important to be able to recognise unacceptable behaviour, like kicks and punches, and distinguish it from innocuous actions such as handshakes and hugs. Microwork was given the job of helping Cogisen collect and annotate this action data. First, we collected videos of people performing various actions on camera and fed these videos into our annotation app. Next, our team of full-time expert annotators got to work, carefully annotating each frame with the required attributes. For action detection, this meant finding the precise moment an action sequence began and annotating exactly what the AI should track.
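To give a concrete sense of what a frame-level action annotation can look like, here is a minimal sketch in Python. The schema is purely illustrative: the field names, labels, and bounding-box format are assumptions for the example, not Cogisen's or Microwork's actual data format.

```python
import json
from dataclasses import dataclass, asdict
from typing import Tuple

# Hypothetical annotation record: every field name here is illustrative,
# not the real Cogisen/Microwork schema.
@dataclass
class ActionAnnotation:
    video_id: str                    # which clip the annotation belongs to
    start_frame: int                 # the precise frame the action begins
    end_frame: int                   # the frame the action ends
    action_label: str                # e.g. "punch", "kick", "handshake", "hug"
    bbox: Tuple[int, int, int, int]  # (x, y, width, height) region to track

def to_json(annotation: ActionAnnotation) -> str:
    """Serialise one annotation for export to a training dataset."""
    return json.dumps(asdict(annotation))

# Example: an annotator marks a punch starting at frame 120 of a clip.
ann = ActionAnnotation(
    video_id="clip_0042",
    start_frame=120,
    end_frame=148,
    action_label="punch",
    bbox=(64, 32, 80, 110),
)
print(to_json(ann))
```

A record like this captures the two things the paragraph above highlights: the exact moment an action sequence starts and what the model should track from that point on.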
After repeating this process for all the action videos we collected, we had our quality assurance team check the accuracy of every annotation before finally sending the data to Cogisen to feed into their machine learning models.