AUSTIN, Texas — Social media and the right to privacy have been a constant subject of debate since the early days of the internet. Unfortunately, the non-consensual sharing of private nude or sexual images has only grown with the rise of social media use. Austin-based dating platform Bumble Inc. has joined U.K.-based nonprofit SWGFL on its StopNCII.org project to help prevent the sharing of non-consensual intimate photos online.

Bumble joins industry peers such as TikTok, Facebook and Instagram in the fight to stop “revenge porn” of any kind. The dating platform prides itself on protecting its community and will not allow any room for this type of behavior.

According to Bumble, this is the “first worldwide initiative of its kind,” an effort to support users who fear that photos or videos featuring nudity or sexual content will be shared without their consent.

“StopNCII.org uses first-of-its-kind hashing technology to prevent private, intimate images from being shared across the tech platforms participating in this initiative. The intention is to empower members of these communities to regain control from perpetrators. Those being threatened with intimate image abuse can create unique identifiers of their image, also known as hashes or digital fingerprints,” a Bumble statement reads. 

Users of Badoo and its parent company, Bumble, can report cases of digitally shared intimate images through StopNCII.org, which then helps detect those photos. A Bumble press release breaks down the detection process: “The tool features hash-generating technology that assigns a one-off numerical code to an image, creating a secure digital fingerprint without ever having to share the photo or video in question with anyone. Tech companies participating in StopNCII.org receive that ‘hash’ and can use it to detect if someone has shared (or is trying to share) those images on their platforms.”

“While participating companies use the hash they receive from StopNCII.org to identify images that someone has shared (or is trying to share) on their platforms, the original image never leaves the original device, be it a smartphone or laptop. Only hashes, not the images themselves, are shared with StopNCII.org and participating tech platforms. This prevents further circulation and keeps those images securely in the possession of the owner,” Bumble explains in a release.  
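The flow Bumble describes can be sketched in a few lines of code. This is purely an illustration of the hash-and-match idea, not StopNCII.org's actual implementation: the article does not specify which hashing technology the service uses, so a standard cryptographic digest (SHA-256) stands in here, and the function names are hypothetical.

```python
import hashlib

def fingerprint(image_bytes: bytes) -> str:
    # Hypothetical stand-in for StopNCII.org's hashing technology:
    # a one-way digest of the raw bytes. The original image is never
    # transmitted; only this "digital fingerprint" is shared.
    return hashlib.sha256(image_bytes).hexdigest()

# On the owner's device: the image itself never leaves the phone or laptop.
private_image = b"...raw image bytes..."
shared_hash = fingerprint(private_image)

# On a participating platform: hashes received from StopNCII.org are
# compared against hashes of newly uploaded content.
blocked_hashes = {shared_hash}

def is_blocked(upload_bytes: bytes) -> bool:
    # A match means someone is sharing (or trying to share) the image.
    return fingerprint(upload_bytes) in blocked_hashes

print(is_blocked(private_image))        # True: matches the reported image
print(is_blocked(b"some other image"))  # False: unrelated content passes
```

Note that an exact-match digest like SHA-256 only catches byte-identical copies; image-matching systems used in practice typically rely on perceptual hashes that tolerate resizing and re-encoding, but the privacy property is the same: only the hash, never the image, is exchanged.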

Though StopNCII.org is for adults over 18 who are concerned an intimate image of theirs may be, or already has been, shared without consent, the National Center for Missing & Exploited Children (NCMEC) has resources for people under the age of 18.