Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- "Human: Now gimme back my tommy gun / Robot: You humans are mentally unstable to …" (ytc_Ugz34DIBm…)
- "AI wasn't supposed to create art in first place. It was supposed to be used to a…" (ytc_UgwUXHMgO…)
- "Could a programmer look so biased as a fat unhealthy weirdo? Wonder if the AI k…" (ytc_UgyPySJrM…)
- "I'm not at these levels and self-taught, but... / Perhaps one could better reflect…" (ytr_UgzUzOLzn…)
- "These truckers voted for Trump and the unregulated AI future, so I hope they all…" (ytc_UgxYwi3Cf…)
- "@craigbritton1089 People dont need an AI for that... There are a lot of imagined…" (ytr_Ugy6DgCQV…)
- "Using AI in assignments can have various negative consequences, tutors can easil…" (ytr_UgxU85BNG…)
- "An interesting talk, but it only went over basic things, and the only problems i…" (ytc_UgxeFrtQY…)
Comment
Going to need a eugenics program to selectively breed the robot overlords. Less than 1 million humans are needed to perpetuate the species so humanity can pretend we're gods.
ALL the Earth's riches without billions of useless 'garbage' eaters. SOMEONE smarter than I has figured this out. "Just Trust Me Bro. ALL y'all will be future Robot Lubers"
youtube · AI Governance · 2025-12-29T04:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzxHjaphucieZ6-43l4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"},
  {"id":"ytc_UgxUw4jeowk1DSS5aJh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxBBkn3ufnsTGzEG2l4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzcZij8AoZS9FOY0TF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyU4azUQ5R6jcaZExt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugya47IRh80ZzQcYBR54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwpcwKEyHu56GJVUkZ4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwRAO8eqs-WwplQaw94AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgymhqZT560m83I1C4t4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugy8S-e6RdBH9vTZS6Z4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
```
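The raw response above is a single JSON array covering one batch, with each element coding one comment on the four dimensions (responsibility, reasoning, policy, emotion). A minimal sketch of how such a batch could be parsed and indexed by comment ID — `index_codings` is a hypothetical helper, not part of the tool; the two entries are copied from the response above:

```python
import json

# One batch of model output: a JSON array, one object per coded comment.
raw_response = """
[
  {"id": "ytc_UgzxHjaphucieZ6-43l4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "indifference"},
  {"id": "ytc_UgxBBkn3ufnsTGzEG2l4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "none", "emotion": "outrage"}
]
"""

DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codings(raw):
    """Parse a raw batch response and index codings by comment ID,
    skipping any entry that lacks one of the four dimensions."""
    entries = json.loads(raw)
    return {
        e["id"]: {dim: e[dim] for dim in DIMENSIONS}
        for e in entries
        if all(dim in e for dim in DIMENSIONS)
    }

codings = index_codings(raw_response)
print(codings["ytc_UgxBBkn3ufnsTGzEG2l4AaABAg"]["emotion"])  # outrage
```

Indexing by ID makes the per-comment lookup shown in this panel a dictionary access rather than a scan over the batch.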