You’re the Guinea Pig

*This post was written by Camp Developer Relations Engineer Charlene Nicer and was first published May 1, 2025, on Substack.*

I’m finally doing it.

After sitting on this idea for weeks (and leaving way too many half-written drafts in my notes), I’m finally starting a newsletter! It’s a space where I’ll share things I actually care about—thoughts, links, rabbit holes, and random stuff I found interesting during the week.

Let’s kick things off with a topic that’s been on my mind a lot lately: ethical AI.

If you’ve seen me post about it recently, it’s because it’s showing up everywhere—and moving way faster than most people realize. And just this week, I came across something that really hit home:

Reddit users were unknowingly subjected to an AI experiment that involved training a model on their posts and interactions—without any form of consent. The AI was used to simulate engagement with real users, to see if bots could meaningfully influence discussions or sentiment online.

Let that sink in: they didn’t just observe Reddit—they manipulated it.

This kind of experimentation, without consent, feels invasive—but it’s far from new.

We’ve Always Been the Product

Long before generative AI, we were already part of countless experiments—whether we knew it or not.

Take insurance companies, for example. Many of them have access to more aggregated health data than hospitals do. Why? Because of years of collecting health records, pharmacy usage, wearable data, and behavioral trends through third-party brokers. These data sets allow them to run actuarial analyses to determine risk factors and calculate your premiums. And in many cases, people never gave explicit consent for their data to be used this way. Yet the industry profits off it.

Sound familiar?

It’s the same playbook social media companies have used for years. Meta, for example, collects enormous amounts of behavioral data—what you like, linger on, scroll past, engage with. That data doesn’t just sit there. It’s actively studied to learn how to influence you: what content you’ll click on, what ads you’ll fall for, what emotional state you’re in. Again, without your meaningful consent.

AI Just Turned the Volume Up

Here’s the difference now: AI changed the speed and scale.

AI is fueled by data. The more it consumes, the more it can simulate, predict, and influence. But these aren’t just passive insights anymore—AI agents can be trained to persuade, sell, manipulate, and even mimic real people. They’re not just observing your behavior; they’re actively participating in it.

So when we hear about Reddit experiments or AI-generated comments pretending to be real, we shouldn’t be surprised. We should be concerned.

Because the tools are getting smarter. But the ethical boundaries? They’re barely keeping up.

I’m not writing this to scare you—but to say: we should care.

And if you’re still reading, maybe you care too. Let’s talk about this more. In fact, that’s what this newsletter is for—making sense of the weird, wild, and often uncomfortable intersections between technology, power, and the human experience.

See you next week.
