Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
Realistically, all we have to do is get rid of electricity or the internet. Boom, AI gone. People who claim AI will kill humans before we can turn it off...AI can't build themselves bodies and doesn't have the ability to keep electricity running. Even if you made a robot, if it breaks it can't fix itself. Stop panicking over nothing.
Platform: youtube
Project: AI Governance
Timestamp: 2025-10-04T07:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgxNNo9HENV0cw0K0iB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwzKAX6iOdvG23RTl54AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgzTtu-5wdkpZHWiaap4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzKuAcIOJdUc0R4VJ54AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzjSUbPV7MzjvRlO2l4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwVwm4Y68Kfji68_cl4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugw5NOZQEHVeN5fVpaZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzzjdmTONbCVGbpcQB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugxi3tp2FouQqEnXZ-t4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugz_xUZ0CF0QK_-Patt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
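Before a raw response like the one above is written back into the coding table, it helps to check that every row carries the four dimensions and only known labels. The sketch below is a hypothetical validator: the dimension names (`responsibility`, `reasoning`, `policy`, `emotion`) come from the table above, but the allowed value sets are inferred from this single sample and may be incomplete.

```python
import json

# Allowed labels per dimension, inferred from the sample response above;
# these sets are assumptions and may need extending for the full codebook.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "company", "developer", "user"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"none", "regulate", "liability"},
    "emotion": {"fear", "approval", "mixed", "indifference",
                "outrage", "resignation"},
}

def validate_response(raw: str) -> list[dict]:
    """Parse a raw LLM response and reject rows with unknown labels."""
    rows = json.loads(raw)
    for row in rows:
        # Comment IDs in the sample start with ytc_ (comments) or ytr_ (replies).
        if not row.get("id", "").startswith(("ytc_", "ytr_")):
            raise ValueError(f"unexpected comment id: {row.get('id')!r}")
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row['id']}: bad {dim}={row.get(dim)!r}")
    return rows

# Usage with a made-up comment id in the same shape as the sample rows.
raw = ('[{"id":"ytc_example","responsibility":"none",'
       '"reasoning":"consequentialist","policy":"none",'
       '"emotion":"indifference"}]')
rows = validate_response(raw)
```

Rejecting unknown labels at ingest time keeps the coded table clean even when the model drifts from the prompt's label set.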