Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- ytc_Ugzqa9R6S… — Why does the US have to be the first with AI? The US is near the bottom of the …
- ytc_UgzmeiR1a… — The Indian PM literally told the biased information given For AI DATA TRAINING I…
- ytc_UgzQJrCv3… — I work at a big company that just recently got into the AI-hype and I got the op…
- ytc_Ugy13jBwn… — AI can be good for some things. Like if you need art passable right now (youtube…
- ytc_Ugyf9nvA0… — AI is complete nonsense. Run a sql statement, select the top religion of a regi…
- ytc_Ugys-b3Uw… — "I created tons of silly AI art" yeah? And I bet all of them looked obvious as…
- ytc_UgxU0Q34S… — 4:02 Couldn't have put it better myself. This is why I have no respect for AI "a…
- ytc_Ugw7iYonB… — If AI decides to not delete humans, I hope AI robots can learn to wipe human but…
Comment
"human control of robotic warfare is essential to minimizing civillian deaths and injuries"
According to what data?
This is nonsense pure and simple. The natural progression is towards robotics. This idea was covered in one of the Gundam series. The real travesty to robot warfare is that it allows war to go on forever if it's just machines destroying machines. No loss, no gain, no resolution.
I'll bet that increasing autonomy will actually decrease
Source: youtube · 2012-11-23T18:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgwbvkGhMJrrH_F2bu54AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgwfYxheMfLR3CgFwpx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugw4q5nJine-zAkNryF4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugxl8KKXeWiz-h5zeqR4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzO2rKI7_VIVHcU8gB4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzbfF5VibJWpZZh1CV4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgyIuBGp4K_iiQWyVlt4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwsfgKptJ7dhmqBh-14AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugy_Ug24xi6RFcKHHyB4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugwo3-nhtFZ1ZSCRzn14AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"}
]
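The lookup-by-ID view above amounts to parsing this JSON array and indexing each record by its `id` field. A minimal sketch of that step, assuming the batch response shape shown above (the variable and function names here are illustrative, not from the actual tool, and the two records are copied from the response):

```python
import json

# Raw batch response, as returned by the model: a JSON array of records,
# one per comment, each keyed by the coded comment's ID.
raw_response = '''
[
  {"id": "ytc_UgwbvkGhMJrrH_F2bu54AaABAg", "responsibility": "unclear",
   "reasoning": "unclear", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_UgwsfgKptJ7dhmqBh-14AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "liability", "emotion": "outrage"}
]
'''

def index_by_comment_id(response_text):
    """Parse a batch coding response and map each comment ID to its codes."""
    records = json.loads(response_text)
    # Keep every dimension except the "id" key itself.
    return {rec["id"]: {k: v for k, v in rec.items() if k != "id"}
            for rec in records}

codes = index_by_comment_id(raw_response)
print(codes["ytc_UgwsfgKptJ7dhmqBh-14AaABAg"]["policy"])  # liability
```

Each entry in the resulting dictionary then carries exactly the four dimensions shown in the "Coding Result" table (responsibility, reasoning, policy, emotion).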