Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- ytc_UgwTEQ1Jk…: If this we're real, and a robot were to be bugged or shorts out. That'd be a who…
- ytc_UgzaFF-kX…: Well, I am pretty sure when a self learning level AI comes out. An AI that follo…
- ytc_UgzfUampe…: Someday parent's will have AI bot baby kids, no more real kids, and best part th…
- ytr_Ugze91BDx…: Nah. We’ve had 20 years to prepare for AI taking all human jobs. The average per…
- ytc_Ugx8353jO…: When the robot inside the truck came out, I was expecting he would say, "Yo! Wha…
- ytc_UgzlD0epQ…: AI and robots are not good for the human race . Stop it now !…
- ytc_UgzeemUKa…: Artists: spend years learning, go trough literal depression just to get their ar…
- ytr_UgwXe9gxN…: If it's in tech, I think there should be _some_ use and integration of AI, its m…
Comment
WTF is super intelligence, true AI would have our same problems. Focus it would be a glorified computer, or emotionally mess.
youtube · AI Governance · 2024-12-26T04:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgzSlLVJZfZdp74Pjs94AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugz23z9EJ4EE6O3_Yk94AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxcGM11kMsBRE6Chrt4AaABAg","responsibility":"company","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzLuZAHOdoSi9gSKBh4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzRXzYzOzW01_wp0g94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyuOE_wJbSXKTZ7dWx4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwgrHq7H0VdchpYuzV4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyaZHf7tfE0Tj9qrfp4AaABAg","responsibility":"none","reasoning":"contractualist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwYNh33QmeDY89xn254AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxTjQF9eFfy7lSBdq54AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"}
]
```
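A raw response like the one above is a JSON array with one coding object per comment, so looking a comment up by its ID is a matter of parsing the array and indexing it. The sketch below is a minimal illustration, assuming the field names shown in the response (`id`, `responsibility`, `reasoning`, `policy`, `emotion`); the `index_by_id` helper and the two sample rows are hypothetical, not part of the tool itself.

```python
import json

# Illustrative raw model output: a JSON array of coding objects,
# one per comment, in the same shape as the response shown above.
raw_response = """
[
  {"id": "ytc_UgwgrHq7H0VdchpYuzV4AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgxTjQF9eFfy7lSBdq54AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "ban", "emotion": "outrage"}
]
"""

def index_by_id(raw: str) -> dict:
    """Parse a raw LLM response and index the coding rows by comment id."""
    return {row["id"]: row for row in json.loads(raw)}

codings = index_by_id(raw_response)
# Look up one comment's coded dimensions by its id.
print(codings["ytc_UgwgrHq7H0VdchpYuzV4AaABAg"]["policy"])  # regulate
```

Indexing by `id` up front makes each subsequent lookup a constant-time dictionary access rather than a scan of the array.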