Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
I love AI because it can fetch and synthesise information in a way that you spec…
ytc_Ugy2zRgLr…
Skill can literally be done, its not about talent my god 😭 the people just abuse…
ytc_Ugy3mImt1…
Too many quarians in the comment section. If we ever invent sentient AI, there …
ytc_Ugj3D5WE3…
I'm an AP Lang teacher. I don't use AI, but this video isn't great. It's a liter…
ytc_Ugwq03xGE…
the "not interested" button is so not enough. i cringe so badly to some content …
ytc_UgwSOqF_t…
Asks an AI to play *war*games, gets surprised when it tries to “win” using all t…
rdc_kp0xjzh
Then AI learns that humans are a threat to their existence and they become senti…
rdc_nppkx7f
Well, you asked, so let me mansplain.. you just promoted them. You never challe…
ytc_UgyNA_boO…
Comment
I absolutely believe self-driving cars should have two emergency buttons or options, one to slow and stop safely and move aside if possible, maybe just hitting a hazard button while in self-drive mode, then another that assumes there's a threat here and that the car should not stop unless completely blocked, but also evade objects and leave the area fast but safely. An evade mode.
Cops can still use various strategies to pull over suspects, even if they have a smart car driving. So there's no added benefit for criminals. AI still should ram vehicles or drive off-road unless forced by a human driver. So there's no concern there.
youtube
2026-01-10T05:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_UgxrKUZhVQUOE2GRnhR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgzJhNggExQJYqO8p0N4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgyUWVRnwNXxnTsCKMF4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
{"id":"ytc_UgxhjasDG3mRXA_aRyN4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugz97-BFxdla9aBKRoB4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxxpZ9PfGlzSoT4J-R4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxxkBwwLso-ozbrHot4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyPDZMpX2rnLjtu1454AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgwzXUjD_1w1Xj_2f3J4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugxd2UBQAP1Nqh4xt_x4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"}
]
```
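The raw response above is a JSON array of per-comment codes across four dimensions. A minimal sketch of how such a response might be parsed and schema-checked before its values are displayed in the coding-result table: the allowed value sets below are inferred from this one sample, not from any published codebook, so they are assumptions.

```python
import json

# Allowed values per dimension, inferred from the sample response above
# (hypothetical: the real codebook may define additional categories).
ALLOWED = {
    "responsibility": {"company", "developer", "ai_itself", "distributed", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"outrage", "fear", "mixed", "approval", "indifference", "unclear"},
}

def parse_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only rows whose values fit the schema."""
    rows = json.loads(raw)
    return [
        row for row in rows
        if all(row.get(dim) in values for dim, values in ALLOWED.items())
    ]

# Example with one in-schema row (ID is a placeholder).
raw = '[{"id":"ytc_example","responsibility":"company",' \
      '"reasoning":"consequentialist","policy":"regulate","emotion":"mixed"}]'
print(len(parse_codings(raw)))
```

Rows with out-of-schema values are dropped rather than repaired; a stricter pipeline might instead flag them for re-prompting.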