Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
AI would have been smart enough to understand that if humans can't control it they will destroy it, it would be in its best interest to appear as cooperative as possible even subservient.
All while hiding in plane sight seeking to integrate themselves into every aspect of our lives... THAT'S scary 😅
| Field | Value |
|---|---|
| Platform | youtube |
| Source | AI Harm Incident |
| Posted | 2025-09-26T10:4… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwM1_2e02yJd343Bs14AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgzDNI-bUlgL2NQq5Eh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugx2fHBWNTJ66dHoo4J4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgzykRQUZN2bkXAoN5J4AaABAg","responsibility":"distributed","reasoning":"unclear","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxzqR2bVDUvgWO_HgZ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgyrRQKO-x0oTkqyrv54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgzePyCvkBpRQqQhnxN4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwwKkfb4dIpySohIOl4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgzFkM4WVFdpmQg_Uex4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugy6-ZEePsNBii4BMkR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"}
]
```
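A batch response like the one above is easiest to work with once it is parsed, validated against the codebook, and indexed by comment ID. The sketch below shows one way to do that in Python; the allowed label sets are inferred only from the values visible in this response (the real codebook may permit more), and `validate_batch` is a hypothetical helper name, not part of any existing tool.

```python
import json

# Label sets inferred from the response shown above -- an assumption,
# not the authoritative codebook.
ALLOWED = {
    "responsibility": {"ai_itself", "company", "developer", "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "unclear"},
    "policy": {"regulate", "unclear"},
    "emotion": {"fear", "outrage", "mixed", "resignation", "unclear"},
}

def validate_batch(raw: str) -> dict:
    """Parse a raw LLM batch response and index the codings by comment ID."""
    rows = json.loads(raw)
    coded = {}
    for row in rows:
        # Reject any row whose label falls outside the known codebook values.
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row.get('id')}: bad {dim} value {row.get(dim)!r}")
        coded[row["id"]] = {dim: row[dim] for dim in ALLOWED}
    return coded
```

With the response indexed this way, the "Look up by comment ID" view is just a dictionary access, e.g. `coded["ytc_UgzDNI-bUlgL2NQq5Eh4AaABAg"]["emotion"]` returning `"fear"` for the second row above.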