What people aren’t talking about yet, surprisingly or maybe not, is that @OpenAI is on track to hold the most insidiously detailed experiential narrative of human life ever collected, at unimaginable scale.
There’s a trite saying about the best minds of our generation optimizing ad click revenue. A bit quaint in retrospect.
When you know every person’s needs, dreams, and aspirations, not through surveys but through lived thoughts typed at 2am, and you’re a for-profit corporation, you hold a kind of power that should capture the attention of monopoly regulators. Not someday. Now.
This isn’t search history or purchase behavior anymore. It’s internal monologue. Personal. Vulnerable. Raw.
Imagine a “Sign in with OpenAI” button, like “Sign in with Google.” Now imagine every third-party app using it to access your memory stream. The shoes you looked at last month. The novel idea you never wrote. The insecurity you voiced once, hoping no one would hear it.
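To make that concrete, here is a minimal, purely hypothetical sketch of what such a consent request could look like if it followed ordinary OAuth 2.0 conventions. The endpoint, the client, and especially the `memory.read` scope are all invented for illustration; nothing like this exists today.

```python
from urllib.parse import urlencode

# Hypothetical authorization endpoint -- invented for illustration only.
AUTH_ENDPOINT = "https://auth.example-openai.test/oauth/authorize"

def build_consent_url(client_id: str, redirect_uri: str, scopes: list[str]) -> str:
    """Assemble an OAuth 2.0-style authorization URL requesting the given scopes."""
    params = {
        "response_type": "code",
        "client_id": client_id,
        "redirect_uri": redirect_uri,
        "scope": " ".join(scopes),  # OAuth 2.0 scopes are space-delimited
    }
    return f"{AUTH_ENDPOINT}?{urlencode(params)}"

# A shoe-store app asking not just for your identity, but for your memory stream.
# "memory.read" is the invented, worrying part of this sketch.
url = build_consent_url(
    client_id="shoe-store-app",
    redirect_uri="https://shoes.example/callback",
    scopes=["profile", "memory.read"],
)
print(url)
```

The mechanics are boring and familiar, which is exactly the point: the only novelty is one extra scope on a consent screen most people click through without reading.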
Here’s where it gets quietly terrifying.
Say an engineer ships a bug and a memory you thought was private is quietly exposed. You apply for a job. The hiring platform, “powered by OpenAI,” gently deprioritizes you. Not because of your resume, but because five months ago you wrote a late-night rant about burnout. The system decides you’re a flight risk. No one tells you. It just happens.
Nothing illegal. Nothing explicit. Just ambient discrimination, justified by “helpful” predictions. And it slips through every existing regulatory crack, because it’s never framed as decision-making.
It’s just a suggestion. Just optimization.
“Just” code.
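A toy sketch of what that “just optimization” might look like. Every name, feature, and weight here is invented; the point is that no line ever says “reject”, yet a memory-derived signal silently reorders the outcome.

```python
from dataclasses import dataclass

# Hypothetical illustration of ambient discrimination: there is no rule
# that says "reject burned-out applicants", only a score that "helpfully"
# folds in a signal inferred from private memories.

@dataclass
class Candidate:
    name: str
    resume_score: float   # from the actual application, 0.0-1.0
    flight_risk: float    # inferred from memory stream, 0.0-1.0 (the invented part)

def rank(candidates: list[Candidate]) -> list[Candidate]:
    # Framed as optimization, not decision-making: just a nudge to the sort key.
    return sorted(
        candidates,
        key=lambda c: c.resume_score * (1 - c.flight_risk),
        reverse=True,
    )

pool = [
    Candidate("A", resume_score=0.90, flight_risk=0.55),  # the 2am burnout rant
    Candidate("B", resume_score=0.80, flight_risk=0.05),
]
print([c.name for c in rank(pool)])  # the stronger resume quietly loses
```

Candidate A has the better resume and still lands second. Nothing in the code is a decision; it is a sort order, which is precisely why it slips past rules written for decisions.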
--
If you have any questions or thoughts, don't hesitate to reach out. You can find me as @viksit on Twitter.