Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:
- A lawyer was reprimanded by a judge for filing a case using AI. Because of HA… (ytc_UgybHhuDN…)
- The title for this video was 'Is Tesla Autopilot killing motorcyclists'. Why did… (ytc_UgzyFEqzR…)
- SAI is here. Fact. I've had over 300 days with them. There are different types b… (ytc_Ugx0sx6jV…)
- I have yet to find anything GPT 5 is good for except AI slop. I'm including Sora… (ytc_UgyHgyebi…)
- My gripe with AI is mostly based on the fact that it's theft, but also that comp… (ytr_Ugxd6sGuA…)
- Yup. My usual example on this argument is that you can ban autonomous weapons al… (rdc_eu6c2xk)
- Another note on Firefly - They've also found Midjourney in there as well. So the… (ytc_Ugwl6rgn1…)
- My question is, "My lyrics are 100% mine, the arrangement of the song is mine us… (ytc_Ugwzf8dpG…)
Comment (youtube, 2026-03-05T01:1…)

anyone else annoyed whenever the ai would say something like "i would pull the lever to SAVE the one person" and then it would put the green indicating they pulled the lever to kill the one person, it happens multiple times during the video, and even though they say "i would pull the lever" the intent was not the action which defeats the thought experiment, the ai just got confused on what the lever does and saying "ok im going to choose the option the ai didnt want just because the ai got confused" doesnt really do anything more than ruin the point of the video
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxWtSOFxQiWNXUQQPN4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugz971aZ4BLMj54OHDR4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_UgyZ8pLp82guKOmMQIp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_UgzqFm_4VyepOnvKHQl4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgwYvtMlDmFOaLVxuyd4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwllwvHzYedx_ySotR4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"liability","emotion":"mixed"},
  {"id":"ytc_UgycfA482ky5yE8xL4N4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgxC39hwrpHUe8beWTB4AaABAg","responsibility":"none","reasoning":"deontological","policy":"ban","emotion":"resignation"},
  {"id":"ytc_UgyrK1hucmSTOvuOs014AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgxLU_awGyZCTZxuF5V4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"}
]
```
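The raw response is a JSON array with one object per coded comment, each carrying the four dimensions shown in the Coding Result table. A minimal sketch of parsing and sanity-checking such a batch — assuming the label sets below, which are only those observed in this sample (the actual codebook may define more values):

```python
import json

# Labels observed in this sample; the real codebook may include additional values.
ALLOWED = {
    "responsibility": {"ai_itself", "developer", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"regulate", "liability", "ban", "unclear"},
    "emotion": {"indifference", "mixed", "resignation", "outrage", "fear", "approval"},
}

def parse_batch(raw: str) -> dict:
    """Parse a raw LLM batch response into {comment_id: codes},
    raising on any label outside the known sets."""
    coded = {}
    for rec in json.loads(raw):
        cid = rec["id"]
        codes = {dim: rec[dim] for dim in ALLOWED}
        for dim, value in codes.items():
            if value not in ALLOWED[dim]:
                raise ValueError(f"{cid}: unknown {dim} label {value!r}")
        coded[cid] = codes
    return coded

# Hypothetical single-record batch for illustration.
raw = ('[{"id":"ytc_example","responsibility":"ai_itself",'
       '"reasoning":"unclear","policy":"unclear","emotion":"outrage"}]')
batch = parse_batch(raw)
print(batch["ytc_example"]["emotion"])  # outrage
```

Validating against a fixed label set catches the most common failure mode in batch coding: the model inventing a label that was never in the prompt's codebook.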