Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- ytc_UgygYXvWo…: "Y'all watch too many horror films. That's literal mind control. So if something …"
- ytc_UgyQjoJuW…: "Interested to see Aaron's take on this - since his take in FALC seems to be pro-…"
- ytr_UgzVUpQVh…: "You are being intellectually dishonest by asserting that the AI developers and r…"
- ytr_UgzP_xQl3…: "@blackprop9393 no they should just throw the whole thing out as even the compute…"
- ytc_Ugy7mjx5i…: "There were monopoly laws and yet China has a monopoly on land and buying food fr…"
- ytc_UgzSBEozA…: "Dude theres really people on here feeling sorry for a fucking robot wtf is wrong…"
- ytc_UgyfyxW8K…: "The upside about people stealing your art for AI is that, with all the horrible …"
- ytc_UgzRj7RQT…: "3:40 not that I'm defending ai artists or anything, but I was thinking about the…"
Comment
I find it interesting that in that clip there is no definition given of what an AI is, perhaps because it's still relatively unknown, however an AI which can surpass human intelligence (and presumably interact and have emotions as we do) does not fall into the same group as a bunch of robots with very powerful processors able to run algorithms to determine a specific course of action. It is this disconnect that worries me what happens in the middle ground, I'm not worried about an omnipotent AI which can see everything and controls our daily lives, but rather a badly programed robot who proceeds to massacre a few hundred people. I would support a ban on AI for military applications till it's developed enough that we understand it's flaws.
Source: youtube · Posted: 2018-04-04T10:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_Ugy2wY0IUgphOxSqAuN4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgySeyiz2QpQlII20MR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugy-oeWVwWrKW8PIxQ94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgzW3d21sud-E-HrJYN4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzrSAz85WeSwX_VKg94AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzBm_yAJquStsoKJwF4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugygpk9DbFcjQDVngT54AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzxMFjSdPHNG5d-ccR4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxJV30qMXFyg0xs8cZ4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzL_pXtisAljdRajp94AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
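A response in this shape can be parsed and sanity-checked before the codings are stored. The sketch below is a minimal, hypothetical validator: the allowed value sets are assumptions inferred from the sample output above (the real codebook may define more categories), and `parse_coding_response` is not part of any actual tool.

```python
import json

# Allowed values per dimension, inferred from the sample batch above.
# This is an assumption, not a documented schema.
ALLOWED = {
    "responsibility": {"developer", "government", "ai_itself", "none", "unclear"},
    "reasoning": {"consequentialist", "unclear"},
    "policy": {"regulate", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "resignation", "indifference"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse the raw LLM JSON array and index valid codings by comment ID."""
    coded = {}
    for row in json.loads(raw):
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row.get('id')}: bad {dim}={row.get(dim)!r}")
        coded[row["id"]] = {dim: row[dim] for dim in ALLOWED}
    return coded

# Usage with a one-row response (hypothetical ID):
raw = ('[{"id":"ytc_x","responsibility":"unclear","reasoning":"unclear",'
       '"policy":"unclear","emotion":"indifference"}]')
coded = parse_coding_response(raw)
print(coded["ytc_x"]["emotion"])  # indifference
```

Failing loudly on an out-of-vocabulary value is deliberate: LLM batch output occasionally drifts from the codebook, and a rejected batch is easier to re-run than a silently miscoded one.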