this post was submitted on 30 Mar 2025
419 points (100.0% liked)

Technology

68244 readers
4070 users here now

This is a most excellent place for technology news and articles.


Our Rules


  1. Follow the lemmy.world rules.
  2. Only tech related news or articles.
  3. Be excellent to each other!
  4. Mod approved content bots can post up to 10 articles per day.
  5. Threads asking for personal tech support may be deleted.
  6. Politics threads may be removed.
  7. No memes allowed as posts, OK to post as comments.
  8. Only approved bots from the list below, this includes using AI responses and summaries. To ask if your bot can be added please contact a mod.
  9. Check for duplicates before posting, duplicates may be removed
  10. Accounts 7 days and younger will have their posts automatically removed.

Approved Bots


founded 2 years ago
MODERATORS
 

BDSM, LGBTQ+, and sugar dating apps have been found exposing users' private images, with some of them even leaking photos shared in private messages.

all 34 comments
[–] [email protected] 171 points 5 days ago* (last edited 5 days ago) (4 children)

Cybernews researchers have found that BDSM People, CHICA, TRANSLOVE, PINK, and BRISH apps had publicly accessible secrets published together with the apps’ code.

All of the affected apps are developed by M.A.D Mobile Apps Developers Limited. Their identical architecture explains why the same type of sensitive data was exposed.

What secrets were leaked?

  • API Key
  • Client ID
  • Google App ID
  • Project ID
  • Reversed Client ID
  • Storage Bucket
  • GAD Application Identifier
  • Database URL

[...] threat actors can easily abuse them to gain access to systems. In this case, the most dangerous of leaked secrets granted access to user photos located in Google Cloud Storage buckets, which had no passwords set up.

In total, nearly 1.5 million user-uploaded images, including profile photos, public posts, profile verification images, photos removed for rule violations, and private photos sent through direct messages, were left publicly accessible to anyone.

So the devs were inexperienced in secure architecture and put a bunch of stuff on the client that should probably have lived on the server side. That leaves anyone free to just use their API to access every picture on their servers. They then built multiple dating apps on this faulty infrastructure by copy-pasting it everywhere.
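To make the comment above concrete: a minimal sketch (with a hypothetical bucket name) of how researchers check whether a Cloud Storage bucket is exposed — request the object-listing endpoint with no credentials and see whether the server answers 200 instead of 401/403. The endpoint path follows the Firebase Storage REST convention; the helper names are mine, not from the article.

```python
# Sketch: probing a Firebase-style Storage bucket for public readability.
# Bucket name and function names are hypothetical, for illustration only.
import urllib.request
import urllib.error

def listing_url(bucket: str) -> str:
    # Firebase Storage buckets expose an object-listing REST endpoint.
    return f"https://firebasestorage.googleapis.com/v0/b/{bucket}/o"

def is_world_readable(status: int) -> bool:
    # A 200 response to an unauthenticated request means anyone on the
    # internet can enumerate (and then download) the stored files.
    return status == 200

def probe(bucket: str) -> bool:
    try:
        with urllib.request.urlopen(listing_url(bucket)) as resp:
            return is_world_readable(resp.status)
    except urllib.error.HTTPError as e:
        return is_world_readable(e.code)

# probe("example-dating-app.appspot.com")  # hypothetical; don't scan real apps
```

The point is that "no password on the bucket" reduces the whole leak to a single anonymous HTTP GET.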

I hope they are registered in a country with strong data privacy laws, so they have to feel the consequences of their mismanagement

[–] [email protected] 35 points 5 days ago (1 children)

Inexperienced? This is not-giving-a-fuck level.

[–] [email protected] 14 points 5 days ago (1 children)

No, it's lack of experience. When I was a junior dev, I had a hard enough time understanding how things worked, much less understanding how they could be compromised by an attacker.

Junior devs need senior devs to learn that kind of stuff.

[–] [email protected] 4 points 4 days ago

It does help when services that generate or store secrets and keys display a large warning, every time they're viewed, that they must be kept secret, no matter the viewer's experience level. But yeah, understanding the why and how isn't something that should be assumed for new devs.

[–] [email protected] 20 points 5 days ago

I've met the type who run businesses like that, and they likely do deserve punishment for it. My own experience was with someone running legally gray betting apps; the owner was a cheapskate who used unpaid interns and outsourced Filipino labor to build the app. The guy didn't even pay them sometimes.

Granted, a well-meaning founder with no financial backing might also hire inexperienced people, but I've mostly seen that with education apps and other low-profit endeavors. Sex apps are definitely someone trying to score cash.

[–] [email protected] 7 points 4 days ago

The illusion of choice

A lot of "normal" dating apps are also owned by the same companies

[–] [email protected] 2 points 4 days ago

Every single one of those “secrets” is publicly available information for every single Firebase project. The real issue is the developers didn’t have proper access control checks.
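Building on the comment above: in Firebase, those identifiers are shipped to every client by design, and security is supposed to come from server-enforced rules. A minimal sketch of the kind of per-user Storage rule that would have blocked anonymous access, assuming a hypothetical layout where each user's photos live under their own uid:

```
rules_version = '2';
service firebase.storage {
  match /b/{bucket}/o {
    // Hypothetical layout: each user's photos under their own uid.
    match /users/{userId}/{allPaths=**} {
      // Only the authenticated owner may read or write their files.
      allow read, write: if request.auth != null
                         && request.auth.uid == userId;
    }
  }
}
```

With rules like this, knowing the bucket name and API key gets an attacker nothing; without them, every object is one unauthenticated request away.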

[–] [email protected] 123 points 5 days ago (2 children)

Brace yourselves, because this is only going to get worse with the current “vibe coding” trend.

[–] [email protected] 28 points 5 days ago (1 children)
[–] [email protected] 85 points 5 days ago (3 children)

Vibe coding is the current trend of having an LLM build your codebase for you then shipping it without attempting to understand the codebase.

Most developers are using LLMs to some extent to speed up their coding, as Cursor and Claude are really good at removing toil. But vibe coders have the LLM build the entire thing and don't even know how it works.

[–] [email protected] 46 points 4 days ago (1 children)

In other words, vibe coders are today's technologically accelerated script kiddie.

That's arguably worse: the produced scripts may largely work, yet demand even less understanding than a script kiddie's cobbling-together of code ever did.

[–] [email protected] 5 points 4 days ago

100% accurate.

[–] [email protected] 10 points 5 days ago (4 children)
[–] [email protected] 25 points 5 days ago

Basically, think ChatGPT

[–] [email protected] 21 points 5 days ago (2 children)

Large language models (LLMs) are the product of neural networks, a relatively recent innovation in the field of computer intelligence.

Since these systems are surprisingly adept at producing natural-sounding language, and are good at creating answers that sound correct (and sometimes actually are), marketers have seized on this as an innovation, called it AI (a term with a complicated history), and started slapping it onto every product.

[–] [email protected] 9 points 5 days ago

...neural networks, a relatively recent innovation in the field of computer intelligence.

Neural networks have been around for quite some time. The simplest forms of them have actually existed since around 1795.

[–] [email protected] 6 points 5 days ago

Ahhhhhh... that's a really simple explanation thanks

[–] [email protected] 4 points 5 days ago* (last edited 5 days ago)

A machine learning model that can generate text.

It works by converting pieces of text to "tokens" which are mapped to numbers in a way that reflects their association with other pieces of text. The model is fed input tokens and predicts tokens based on that, which are then converted to text.
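The token idea from the comment above can be sketched in a few lines. This is a toy illustration, not a real LLM tokenizer (real ones use learned subword vocabularies, and the "prediction" step is a neural network, which is omitted here): text pieces map to integer ids, the model consumes and emits ids, and the ids map back to text.

```python
# Toy tokenizer sketch: a tiny fixed vocabulary mapping words to ids.
# Real tokenizers (e.g. BPE) work on learned subword pieces instead.
vocab = {"the": 0, "cat": 1, "sat": 2, "on": 3, "mat": 4}
inv_vocab = {i: t for t, i in vocab.items()}

def encode(words):
    # Text pieces -> integer token ids (the model's actual input).
    return [vocab[w] for w in words]

def decode(ids):
    # Integer token ids -> text pieces (the model's output, made readable).
    return [inv_vocab[i] for i in ids]

ids = encode(["the", "cat", "sat"])
print(ids)          # [0, 1, 2]
print(decode(ids))  # ['the', 'cat', 'sat']
```

The model itself only ever sees the numbers; "generating text" is predicting the next id in the sequence, then decoding.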

[–] [email protected] 4 points 5 days ago* (last edited 5 days ago) (1 children)

Large Language Model

As far as I understand, it's a slightly more sophisticated bot, i.e. an automated response algorithm, developed over a data set so that it "understands" the mechanics that make that set cohesive to us humans.

With that background, it's supposed to produce similar new outputs when given fresh raw data to run through the mechanics it acquired during development.

[–] [email protected] 1 points 5 days ago

Clever, thanks 😊

[–] [email protected] 3 points 4 days ago (1 children)

What is toil in this context?

[–] [email protected] 8 points 4 days ago

Boring/repetitive work. For example, I regularly use an AI coding assistant to block out basic loop templates with the variables filled in, or to quickly finish repetitive case statements or assign values to an object with a bunch of properties.

For little things like that, it's great. But once you get past a medium-sized function, it goes off the rails. I've had it make up parameters for stock library functions based on what I asked it for.

[–] [email protected] 4 points 4 days ago (2 children)

So we are moving away from >1GB node_modules finally? Or is it too soon?

[–] [email protected] 8 points 4 days ago (1 children)

It's going to be 1 GB of node_modules handled by garbage AI code.
AI is only good at smaller scripts; it loses the connections and the understanding in larger codebases. Combined with people who can't program well (I mean not only coding but debugging and so on as well), also called vibe programmers, it's going to be a mess.

If a product claims it has vibe coding: find an alternative!

[–] [email protected] 9 points 4 days ago (1 children)

I'm losing my will to live lately at an alarming rate.

I used to love IT, way back at the start of 00s.

Soon after the 10s started, I noticed bullshit trends replacing one another... like crypto or clouds or SaaS... but now with the AI I just feel alienated. Like we're just all going to hell, and I hate the first row seating.

[–] [email protected] 4 points 4 days ago (1 children)

At this point, I think it's necessary to keep a sort of alternate identity online and keep anything private, photos of yourself and other personal information, strictly offline. Except for government stuff, which requires your real identity.

[–] [email protected] 2 points 4 days ago

I mean, yeah, I self-host everything, but I hate that I have to learn and support the most useless shit ever just to earn a living.

It used to be fun being a dev, now I'm just repeating the same warning phrases about technologies.

[–] [email protected] 2 points 4 days ago

I love feeding my bloated node_modules

[–] [email protected] 70 points 5 days ago

This is devastating. The LGBT community are often hiding their true selves because of family, colleagues, culture etc. People will be destroyed.

[–] [email protected] 41 points 5 days ago

I wonder how many conservative politicians they’ll find.

[–] [email protected] 17 points 4 days ago* (last edited 4 days ago)

Use Signal or SimpleX for more private stuff like this 👀

[–] [email protected] 14 points 4 days ago

Anyone who uses Grindr, please be aware that any photos you send are cached and stored unencrypted in plain old folders on the receiver's phone, regardless of whether they were expiring or in an album that you later revoked. It's nearly trivial to grab any photo someone sends you, with no watermark or screenshot notification.

[–] [email protected] 3 points 4 days ago

Just don't send nudes... why do people think other people won't figure out how to screenshot or just keep photos forever? Even if you trust the person, that person could get hacked... even the "have i been pwned" guy got pwned, for Jehovah's sake. Just stop sending that ~~shit~~.