Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a comment by its ID, or inspect one of the random samples below.
- `ytr_UgwsKAZhm…`: A quantum computer would never fit a drone that small. A quantum computer is ver…
- `ytc_Ugx16Pe6a…`: This is literally the most insane shit I've ever seen, it's like a kid that has …
- `ytc_UgwKNdmAz…`: We’re just not going to get set up to be raped robbed or beaten because we want …
- `ytc_UgyX5mq2J…`: “If they’re smart they’ll stay…away from consciousness” you say. What if AI is …
- `ytc_UgznEggD_…`: As a teacher I've got to say, pretty easy to get around AI, such as a lab, prese…
- `rdc_jwv8iq8`: TLDR: So you guy's don't have to click "the copyright office affirmed tha…
- `rdc_mt8j3eg`: Programming has only been made easier over the decades and here we are with the …
- `ytc_Ugywa9Np5…`: I wanted to be an artist or a writer. Writing a successful book has always been …
Comment (youtube · AI Governance · 2025-12-04T09:5…):

> Glad we got 3 more years. One guy you had here said 2027.
> I am guessing in 2026, they will say 2035, then by 2030 everyone will accept AI as the new standard, and theses conspiracy theorist, againsht their own predictions, we will go back to do real work instead of monetizing fear.
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
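The four dimensions in the table above form a small coding schema. A minimal validation sketch, assuming the value sets are exactly those observed in this page's samples (the full codebook may define additional values; `validate_coding` is an illustrative helper, not part of the tool):

```python
# Allowed values per coding dimension, as observed in this sample.
# The real codebook may permit more values than appear here.
CODEBOOK = {
    "responsibility": {"none", "company", "developer", "ai_itself"},
    "reasoning": {"unclear", "consequentialist", "deontological"},
    "policy": {"none", "regulate", "liability", "ban"},
    "emotion": {"approval", "indifference", "mixed", "outrage", "fear"},
}

def validate_coding(record: dict) -> list[str]:
    """Return a list of problems with a coded record; empty means valid."""
    problems = []
    for dim, allowed in CODEBOOK.items():
        value = record.get(dim)  # missing dimensions also count as problems
        if value not in allowed:
            problems.append(f"{dim}={value!r} not in {sorted(allowed)}")
    return problems

print(validate_coding({"responsibility": "none", "reasoning": "unclear",
                       "policy": "none", "emotion": "indifference"}))  # → []
```

A check like this catches the usual LLM-coding failure mode of inventing labels outside the codebook before the record is stored.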
Raw LLM Response
```json
[
{"id":"ytc_UgxmPNSbOP3AtaMr0FZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgzQxDIWK44KJeHDL0l4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyuCA-bcovEc7SOvtN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyayqKGGemRU9RS2PJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxJJYCzVhJVZ5VuT8Z4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzZDfcUyGIJL9JbqHF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugz2aNc4lmSFSvKfVJd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzEDRp7FOgBt3z16I54AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugybyi6TT435y7SbBQ54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugz3jTUQ7lPoDbeMft54AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"}
]
```
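Because the raw response is a JSON array of per-comment records, "look up by comment ID" reduces to indexing the parsed array on the `id` field. A minimal sketch, assuming the model output parses cleanly as JSON (variable names are illustrative; the array here is shortened to two records from the batch above):

```python
import json

# A shortened copy of the raw LLM batch response shown above.
raw_response = '''[
{"id":"ytc_UgxmPNSbOP3AtaMr0FZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgzEDRp7FOgBt3z16I54AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"fear"}
]'''

# Parse once, then index the batch by comment ID for O(1) lookup.
records = json.loads(raw_response)
by_id = {r["id"]: r for r in records}

print(by_id["ytc_UgzEDRp7FOgBt3z16I54AaABAg"]["emotion"])  # → fear
```

In practice the parse step is where malformed model output (stray prose around the array, truncated JSON) would surface as a `json.JSONDecodeError`, so it is worth wrapping in error handling before the lookup.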