Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
| Preview | Comment ID |
|---|---|
| I've just asked ChatGPT, "would you end a million lives?" Answer, "That’s not so… | ytc_UgxaDuZTM… |
| Because its cheaper. Software engineers are expensive, have to give them benefit… | rdc_m6y53vb |
| Meanwhile we continue to enhance and experiment with AI. This guy looks like A… | ytc_UgyFcX-77… |
| Oh my God this dude thinks he has right to judge musks moral compass... What are… | ytc_UgwDCNHWt… |
| Looking at AI images evokes a lot of feelings inside of me too. It might be true… | ytr_Ugy6-9gr5… |
| Well ladies and gentlemen, here you have it. The argument that was made when Ter… | ytc_UgxgHbCsp… |
| I think that you should let them do that. It will speed up research a lot, robo… | ytr_UgyaN6sJh… |
| I wonder if these trucks have remote human supervisors like the passenger cars i… | ytc_UgzYF0D9I… |
Comment

> Most developers don’t do damn thing. The fact is that AI is better than 80 percent of them.

Source: youtube · AI Jobs · 2026-03-03T08:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzzcQju1h1E42yEyTh4AaABAg","responsibility":"company","reasoning":"virtue","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugy_pT163Hc4jxhgwP54AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugzab1pNFPKl9O3r6TV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwBSwtdk1-uwE8inCp4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzESe32igC0G6u4qPZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugy2k9XmgQsHGATh6jx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_Ugxvos0XuKEQUc95zTF4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugy1kWjGXrtZpVnJSgt4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgznWnOesWPfi1vNhKR4AaABAg","responsibility":"company","reasoning":"virtue","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgznLGcxRs_yrVZTBo14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
```