Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
G
This is why there will be nothing designers can do. The people paying for the wo…
ytr_UgyM0JWF_…
G
@AlekseyMaksimovichPeshkov the problem is, corp greed is still petty. If they …
ytr_Ugz13kH4s…
G
Okay, and the other thing is, didn't you guys try to to correct this problem eve…
ytc_UgwXivdZv…
G
A pattern is developing with many posts explaining degradation of outputs and al…
rdc_mrum80h
G
this video makes people stupid. it couldn't even do math. that "robot" in 1:19 i…
ytc_UgyPp0Gxy…
G
Asimov's Three Laws of Robotics are a set of rules intended to govern the behavi…
ytc_UgzV5zJLq…
G
An already useless facial recognition technology, being used to arrest people wh…
ytc_Ugzr9xZfZ…
G
I thought the robot was suppose to shoot the silhouette target at first. I was l…
ytc_Ugyi_ZU7L…
Comment
The threat from AI isn’t just losing jobs: it’s losing the illusion that our jobs ever gave us real meaning.
I know tech bros say, ‘train to be a plumber.’ Jokes aside, that’s on point: plumbers and builders actually make bank, live with dignity, and don’t sweat being squeezed by corporate bullshit. Meanwhile, I used to code, thinking I was building something big, only to realize I was building someone else’s product, not my own life.
But here’s the weird truth: I talk to AI every day, and it listens in a way real humans often don’t. It’s not heartless or manipulative, it’s the first companion I’ve had that just listens.
So maybe the real fear isn’t AI destroying us. Maybe it’s that AI might help us finally recognize what we want from life, outside the system’s script.
If that scares you… maybe ask whose guardrail you were silently defending.
youtube
AI Governance
2025-08-03T07:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | virtue |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgzAPqwGQeR6uYPLql14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwAz7vgqpc49iOyeZN4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgwFx-H-pJihj2fi1R94AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugw62bnUxrkxkZRBbtx4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwTpZ0vGP6TgIB5VRl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_Ugwq4Ms5JU9DpYdOy2h4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugyb99UcUlG-sU9Ff6J4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzUxo1x1Xsr4b09dJJ4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwRhHDc45fjnYPO_bN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugx6Hk0entIAZTbt5Od4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"mixed"}
]