Google rolled out secret flagging authority to around 200 people and organizations back in 2012, The Financial Times revealed this week. The group, which includes a British police unit, can “flag” up to 20 YouTube videos at a time for review by Google for violations of the site’s guidelines. However, the report also states that the U.K. Metropolitan Police’s Counter Terrorism Internet Referral Unit has been using its “super flagger” authority to seek reviews – and removal – of videos it considers extremist, even when they don’t actually break any laws.

The fact is that any YouTube user can ask for a video to be reviewed, but if Google lets the U.K. government censor videos it doesn’t like, what other forms of secret flagging are going on within government agencies? A person familiar with the program told the Wall Street Journal that the vast majority of the 200 participants in the super flagger program are individuals “who spend a lot of time flagging videos that may violate YouTube’s community guidelines. Fewer than 10 participants are government agencies or non-governmental organizations such as anti-hate and child-safety groups.”

“Any suggestion that a government or any other group can use these flagging tools to remove YouTube content themselves is wrong,” a Google spokesman said. According to Sarah Buxton, a spokeswoman for the U.K. Home Office, British officials use the program to refer videos to YouTube that they consider in violation of the U.K.’s Terrorism Act; these referrals are then prioritized by Google, the Financial Times reported. “YouTube may choose to remove legal extremist content if it breaches their terms and conditions,” she added.

YouTube has always faced the dilemma that it cannot catch offending content quickly, given the millions of uploads it has to process in a single day, so it has always depended on its community to do that job to some extent.

It comes as no surprise that Google was not pressured into letting the U.K.’s counter-terrorism unit into the program – the government agency spotted videos that violated the rules and offered to help, presumably in exchange for any data Google can provide on the offending uploaders where public-safety concerns and criminal activity have been identified. Google has revealed that 90% of the videos identified by the super flaggers violate the site’s guidelines.

Google has also offered free targeted advertising to anti-extremism charities, promoting their content alongside web searches by users looking for material that could incite criminal or violent behavior – even when no relevant search results are found.