Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
The US regulates Ai, yet there are countries that will never regulate it. Then …
ytc_UgzQUPOod…
How hard is it to not care about the color and instead of saying that ai is raci…
ytc_Ugz4Nxoy2…
I don't believe humans are at risk of going extinct. I think one MAJOR thing nee…
ytc_Ugye7dH9Q…
I spent 30 some more years in this field in some capacity, most of it focusing o…
rdc_oi1nlmw
For me if anyone saw what I say to character ai chatbots I would be placed in fe…
ytc_UgyaBGbuf…
Its motivations are currently set by programming, weighting, training data promp…
ytr_Ugy4wPvFJ…
When you showed the first two photos i thought the first one was the real one an…
ytc_UgwfqcV7M…
I have philosophized a new hypothesis theory of the future of AI and humans -
…
ytc_UgyJBz6Kq…
Comment
When enough of the yet-to-make-it executive class gets replaced, there will be a turning point. Capitalists will always want to be capitalists and will (hopefully) eat each other. Rational people will of course suffer, but at least the AI empire will fall.
youtube
AI Governance
2025-12-15T20:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgwU-0SOkR6ksE0nM-h4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"liability","emotion":"approval"},
{"id":"ytc_UgxWBJTZ8DuvAxz3IfV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugw5rcPt-gEJfFEzdC94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugz1fyKr1krzgW1fwgl4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugxnh_ruGZIiDyvoTql4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwvNwotJjR9EGppcwF4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugyo81l7McnXnYwFvpB4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwFpycqILSOd5e8pm14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwW-4uI15aTpqT2mBx4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxJ9qQM71yNQ0VCLDZ4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"resignation"}
]
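The raw response above is a JSON array of per-comment codings, each keyed by a comment ID. A minimal sketch of the "look up by comment ID" step might parse that array into a dictionary, so any coded comment's dimensions can be retrieved directly. The function name and the single-entry sample payload here are illustrative, not part of the tool itself:

```python
import json

# Illustrative raw model output, following the schema shown above:
# each entry has id, responsibility, reasoning, policy, and emotion.
raw_response = """
[
  {"id": "ytc_UgxWBJTZ8DuvAxz3IfV4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"}
]
"""

def index_by_comment_id(raw: str) -> dict:
    """Parse the raw LLM response and key each coding by its comment ID."""
    codings = json.loads(raw)
    return {row["id"]: row for row in codings}

index = index_by_comment_id(raw_response)
coding = index["ytc_UgxWBJTZ8DuvAxz3IfV4AaABAg"]
print(coding["emotion"])  # fear
```

With the full ten-entry response, the same lookup returns the coding row whose values populate the "Coding Result" table for the selected comment.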