Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:

- `ytc_Ugxu2_-2p…`: "I see this. Back in the day we used to fight to the death to get apprentices who…"
- `ytc_UgzWXgSB_…`: "3 billion human lives ended on August 29th, 2032. The survivors of the nuclear f…"
- `ytc_UgyylqvRd…`: "I thought i was the only one, but at the same time, sometimes chatgpt is so dum…"
- `ytr_Ugz3SELOu…`: "Its alresdy hapenning. The ultra wealthy and elite invision a world where robots…"
- `ytc_UgwCLb3uj…`: "Krystal I love breaking points but AI was not “grown in a lab as opposed to bein…"
- `ytc_Ugzv3sII1…`: "This actually happened to me with my code for AP computer science… I am very mad…"
- `ytc_UgwPo91j6…`: "It Didn't work now 2 days before it worked but now Don't work i ased chatgpt it…"
- `ytc_UgwZEo5yr…`: "Lol. There seems to be a misunderstanding what \"replacing a software engineer m…"
Comment

> It’s funny cause China isn’t against regulating their own AI usage, it’s just the US that wants silicon valley to run free and ruin everything
>
> And even if China was this big bad? I really don’t care, unless China starts WW3 I could care less about what their doing, I don’t care about any political races, I just want the regulations

Source: youtube · Posted: 2025-11-24T22:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response

```json
[
  {"id":"ytc_UgztRNnhQVU5ZaYeNl14AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_Ugz1IYhu_RRLowFkr3p4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_Ugwv9zsy48FLXMMXT2R4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwuV8Kk0DT6IaD1T8V4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxGLLzNjESsL6DslFl4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"indifference"},
  {"id":"ytc_UgzrfDZkYmKsBwO0z9B4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxeDqG3IwM3FID90kR4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"disapproval"},
  {"id":"ytc_UgxkQvxtB_YOLozea_R4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"indifference"},
  {"id":"ytc_Ugy_uyb5gpMCQD5ZNpx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwiZm247Y4TNKEYORl4AaABAg","responsibility":"unclear","reasoning":"virtue","policy":"none","emotion":"approval"}
]
```
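The raw response is a JSON array of per-comment codes, one object per comment with the four dimensions shown in the result table. A minimal Python sketch for parsing and validating such a response (the allowed category values below are inferred from the samples on this page, not from a documented schema, so treat them as assumptions):

```python
import json

# Allowed values per dimension, inferred from the sample output above
# (not an authoritative codebook).
ALLOWED = {
    "responsibility": {"government", "company", "ai_itself", "unclear"},
    "reasoning": {"deontological", "consequentialist", "virtue", "unclear"},
    "policy": {"regulate", "none", "unclear"},
    "emotion": {"outrage", "resignation", "fear", "indifference",
                "disapproval", "approval", "unclear"},
}

def parse_codes(raw: str) -> dict[str, dict[str, str]]:
    """Parse a raw LLM response into {comment_id: codes}, rejecting
    any value outside the expected category sets."""
    codes = {}
    for rec in json.loads(raw):
        cid = rec["id"]
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{cid}: bad value {rec.get(dim)!r} for {dim}")
        codes[cid] = {dim: rec[dim] for dim in ALLOWED}
    return codes

# Usage with a hypothetical one-element response:
raw = ('[{"id":"ytc_example","responsibility":"government",'
       '"reasoning":"deontological","policy":"regulate",'
       '"emotion":"indifference"}]')
print(parse_codes(raw)["ytc_example"]["policy"])  # → regulate
```

Validating against a fixed value set catches the most common failure mode of LLM coders, namely inventing categories outside the codebook, before the codes reach analysis.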