On April 9, Sharesome announced the launch of the facepalm-inducing #FindThePornstar facial recognition tool.
Like many others, I was quick to point out why I thought this tool was shortsighted at best and potentially shit-storm-inducing at worst.
Interestingly, Sharesome took #FindThePornstar down, like, within a day of launching it. They wrote a post on Medium about the whole incident, discussing their actions and thought processes. Here’s a snippet:
We’re going to temporarily shut down #FindThePornstar and keep working on it in order to make sure that this tool will be just this: a tool that can help content creators who want to promote their content get credit, make more from their work and gain more engagement.
This will, of course, involve the option to completely opt out of appearing in the search engine. We believe that it is your right to decide whether you want to be found or not.
They also wrote that, “In the wrong hands, this technology could be used to expose people and do more damage than good… Keep in mind, this [technology] isn’t anything new that hasn’t been done before, there are literally dozens of similar sites and even big platforms (not naming any names, you already know them) that already use this type of technology for their own benefit.”
I found this very compelling: a group that got called out after a misfire, listened to feedback, and is now trying to do better. What world is this?!

It’s starting to become a trend, actually. The model community has a strong voice, and it’s becoming professional suicide for those in the space to ignore the community chatter and sentiment.
There's one thing we've agreed to do ever since we've started Sharesome: to listen. As a result, we've decided to shut down @FindThePornstar until we can turn it into something that can help the industry. Your opinion means everything, voice it! https://t.co/H6jWrYUtJa
— Sharesome (@SharesomeCom) April 10, 2019
When I received an email from Sharesome CEO Tudor Bold stating that they were “open to hearing from the community and learning what would be the best way forward for a service like this,” I decided to jump at the opportunity to open a dialogue.
I asked Bold a few questions, which follow below, along with his responses.
YNOT Cam: One point I raised was about how launching this tool from within the industry community with the specific campaign that was utilized was a big misstep. Do you have any thoughts on that?
Tudor Bold: I agree, launching the website without more preparation was a misstep. We should have considered all the implications and communicated better. On the other hand, we are really glad that we did this one-day experiment. We learned a lot about what’s possible in terms of AI-powered face-recognition. Also, this generated a really healthy debate around privacy, especially in the adult industry. And although the initial hype and people’s reactions will fade, the conclusion is: the technology is there and it will be used. We need to understand it, prepare for it, and, ideally, work with it.
Sharesome has now stated they are committed to creating a tool that allows models and performers to “opt out” if they so choose. Any idea how an “opt out” would work? Have you considered an “opt in”-type tool instead?
If we re-launch #FindThePornstar, it will have opt-out. We’re looking into how an opt-in would work.
What I would like to highlight, though, is the fact that if you are (or have been) a pornstar or cam model, it is almost impossible to hide your adult work. If somebody in your private life suspects you do or did porn, it’s already possible for them to check if that’s true or not. This is not me endorsing the situation; it’s just stating a fact.
Models should be aware that, despite potential reassurances from studios and sites, as long as they show their face, their privacy is at risk. This is very similar to what happens to mainstream actors. The difference is — and we know it — that mainstream actors do not suffer the stigma and discrimination of sex workers. This is why we have to keep working to change this mentality, until everyone realises sex work is work.
Regarding piracy, many performers, including cam models, do not own the content they appear in. So if a model were to use your tool to identify their work around the internet and found something that appeared to be pirated, how would they report and remove it?
As for piracy, we fully understand your point, and we’re working to address it. What we want to offer is a tool for models to find photos in which they appear. Once the content is identified, it’s for them to know if it belongs to them or to a studio they worked for.
Anything else you’d like to add?
Adult workers’ privacy is a topic that will only get bigger and generate ever more heated discussions. We do not hold all the answers. And if we mess up in communicating our initiative, we will go back to the drawing board and make things right – which we’re doing. We ourselves want to learn more and contribute to the debate by informing everyone what is technologically possible. And face-detection powered by Artificial Intelligence is here to stay.
— fin —
I very much appreciate Bold’s willingness to engage with the topic and answer these additional questions. Though #FindThePornstar was more than a little bit of a facepalm WTF moment, the organization’s efforts to hear, learn and adjust are commendable.
A note to companies looking to be clever or just plain ol’ helpful with their innovations: Ask your community for feedback and insights before you stage a splashy launch. You can learn a lot simply by reaching out and exploring angles you might not be able to see yourself.
—
Erika is a sex positive people watcher (and writer). Email her at erika@ynotcam.com.