My photos were stolen to promote a bogus scar cream

As I clicked on an ad using my face without my consent, I felt horrified.

My sister-in-law had sent me the link to it on Facebook Messenger one day about five years ago, but I’d never used or even heard of the product before.

As someone with facial scars from a car accident back in 2015, I have been recommended all sorts of useless lotions and potions to reduce my scars, and have wasted energy wading through them all with a false sense of hope. Seeing this ad was brutal.

Firstly, how dare they take my photo without asking? But also, they had the audacity to make false claims that might mislead people into buying something that probably doesn’t even work.

I wrote to the page on Facebook that was using my image – no response.

Unfortunately, there are two terrifying trends that I’m seeing all too often on social media right now: censorship and stolen photos. Neither seems to have been fully recognised by the policies and community guidelines that social platforms have in place to keep us safe.

My photos being used without my consent is not an isolated case – it recently happened to Canadian facial difference advocate Chelsey Peat too, after she took part in a positive video series talking about her facial birthmark.

Footage was stolen and edited alongside an entirely different person to claim that some magic serum removed her birthmark. The awful sensationalised claims were accompanied by a classic emotional soundtrack, spinning tales about her being isolated and unable to find a partner, despite being a happily married mother.

https://www.instagram.com/p/CWWW_hpAbJa/

Advice has been sought over the false information and advertising, but the video remains on Facebook, gathering millions of views.

I sometimes feel like I’ve surrendered my privacy and dignity to the social media powers that be. It’s as though this is something I’ve got to suck up as a consequence of me being online.

It pains me that people like me might have become resigned to the fact this is inevitable, or an unavoidable part of being out there in public online spaces. But surely there’s something that can be done?

It’s no wonder so many people with facial differences choose not to share photos, or simply aren’t on social media at all. In 2017, leading charity Changing Faces UK found that 96% of respondents had seen a photo, meme or other content on social media that mocked someone’s appearance.

In 2021, we at Face Equality International (FEI) – an alliance of NGOs devoted to ending discrimination against people with disfigurements – backed an open call to TikTok from our ambassador, Jono Lancaster, to remove a disgusting, derogatory trend called the ‘attractiveness’ scale.

It was an effect that ranked TikTok users’ faces against a series of real faces numbered one to 10. Among a list of celebrities and familiar faces, number one for both the male and female versions was someone with a facial difference. It reached millions of people.

Thanks to Jono’s courage, together we’re currently engaging with TikTok on how to better address disfigurement discrimination on their platform.

That this trend was able to go unchecked is incredibly disheartening, and a stark reminder of how policies and protections continue to fail the facial difference community, whose issues are not yet treated as a significant equality concern.

It’s not just social media that’s the problem – it’s important to recognise just how dark the dark web can get.

There was a time in 2018 when I rather tragically googled my own name and came across a website that turned out to be a racist hate forum. On it, someone had stolen a photo of me and a Black friend from my social media and said that I deserved to be in my accident and end up scarred because I was a ‘race traitor’.

Feeling physically ill after reading it, I reported it to the police, and they got back to me within a matter of hours. Sadly, as the site in question hadn’t been created in the UK, it was beyond their jurisdiction.

It’s a very specific feeling of violation when your photo is stolen from your profile without your permission and used in such a harmful way. Not least the guilt when the hate involves someone else. I can’t find the site in question anymore and can only hope it’s been shut down.

Promisingly, it appears Twitter has recently expanded its privacy policy to cover the sharing of private photos and videos without consent.

But there’s still so much work to do to ensure equality for people with facial differences.
In late 2021, facial difference advocate Belinda Downes went to post a selfie on Twitter when a grey ‘sensitive content’ warning blurred out the image. How a face could be marked as sensitive content is beyond me, but sadly this is not an isolated incident.

Similarly, mum and activist Charlie Beswick spoke out after her son – who has a craniofacial condition and a prosthetic eye – featured in photos that were then removed from Instagram for ‘breaching guidelines’.

Whether this was a complaint, an algorithm built to detect ‘graphic’ content, or a human moderator, what remains clear is that there’s a major glitch in the social system that remains a huge concern for the facial difference community.

Off the back of the incident with Belinda, we at FEI are now working with Twitter to ensure the facial difference community is both heard, and protected on their platform.

What remains unclear is whether the problem lies in human moderation or in algorithms. Artificial intelligence is another source of fear: technologies built on human bias run the risk of screening out our ‘irregular’ faces for identification, security, employment and beyond – something that research shows will disproportionately affect people of colour too.

You can see our submission to a UN report on the issue here, which details research finding that people with Down’s syndrome and albinism have been denied photo ID cards because AI facial recognition technology failed to recognise ‘non-standard’ faces.

It feels like there’s a lot of work for face equality activists to do to ensure our voices are heard and that, fundamentally, our faces are seen in online spaces without fear of hate, misuse or censorship.

I can only hope that working closely with social platforms will help us to ensure this community is seen, understood and respected.

Do you have a story you’d like to share? Get in touch by emailing [email protected] 