Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "If you imagine being a king or a queen and commissioning a painter to paint your…" (ytc_UgzyFWuY8…)
- "hey here is a smart idea. How about hiring security who can work shifts and watc…" (ytc_UgyncMXoU…)
- "they grow up aspiring to something authentic rather than being a wage slave for …" (ytc_Ugxi-Hdeu…)
- "Why is this tolerated? AI is doing nothing but wreaking havoc and this is just t…" (ytc_UgwGSCCLk…)
- "You mean AI driverless cars will have robots in them, and we will go extinct? Wh…" (ytc_UgwY8otWe…)
- "I love this analogy. I will say though, I have always viewed AI in the way that …" (ytc_UgyqU1iV4…)
- "If I had to make a prediction, in the very near future, I can see a lot of secur…" (ytr_Ugw1CgVtQ…)
- "god damn that ai talking about connection was legit convincing. i felt bad for i…" (ytc_UgyeeP33d…)
Comment
This is a frightening situation that we’ve reached. We are at the beginning of dramatic changes.
Within my lifetime we’ve come from computers for specialists and ordinary people never even saw them.
Now we have them in our pockets, we interact with them every few minutes. We have computer systems that are central to the way our world works.
Potentially, before the end of my lifetime we will have AI systems that will control everything and potentially replace everyone who’s working. Then, since humans don’t contribute anything any more, it could decide that we’re no longer needed on what was our planet.
youtube · AI Governance · 2025-10-14T14:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgwJv6qrJPuSOJ9Dzgx4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzbgV0m8zBuYMR5kZd4AaABAg","responsibility":"government","reasoning":"unclear","policy":"regulate","emotion":"indifference"},
{"id":"ytc_UgzBrwWAphpBCBIYSjd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzuzPam-fL75AG9ZnN4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugy1FYfDyPz-Jc2j0QV4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzazVpGpSV3LhxXbjZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwEVe3U1pc8sBipyxx4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw7IQV6qGc6fMjwkHl4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgxNopRF--g7jycTgbl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwWuceQtKWX8eLf0T94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
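As a sketch of how a client might parse a raw response like the one above into per-comment codes, the snippet below indexes the records by comment ID and checks that each record carries the four coding dimensions shown in the table (Responsibility, Reasoning, Policy, Emotion). The function name `index_codes` is hypothetical; the field names and example IDs are taken from the JSON above, but the validation logic is an assumption about how such output might be checked, not this tool's actual implementation.

```python
import json

# Two records copied verbatim from the raw LLM response above.
RAW_RESPONSE = """[
 {"id":"ytc_UgwJv6qrJPuSOJ9Dzgx4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
 {"id":"ytc_UgwWuceQtKWX8eLf0T94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]"""

# The four coding dimensions, plus the comment ID used as the lookup key.
EXPECTED_FIELDS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def index_codes(raw: str) -> dict:
    """Parse a raw coding response and index each record by its comment ID.

    Raises ValueError if any record is missing an expected field, since an
    LLM response is not guaranteed to follow the requested schema.
    """
    records = json.loads(raw)
    for rec in records:
        missing = EXPECTED_FIELDS - rec.keys()
        if missing:
            raise ValueError(f"record {rec.get('id')!r} missing fields: {missing}")
    return {rec["id"]: rec for rec in records}

codes = index_codes(RAW_RESPONSE)
print(codes["ytc_UgwWuceQtKWX8eLf0T94AaABAg"]["emotion"])  # prints "fear"
```

Indexing by ID is what makes the "Look up by comment ID" view possible: once the response is parsed, each coded comment can be retrieved in constant time.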