Raw LLM Responses
Inspect the exact model output for any coded comment: look up a comment by ID, or browse the random samples below.
- "Funny how all the sheeple were banging the “save the environment” drum. Now it …" (ytc_UgwnTqoJB…)
- "Once ai creeps realize \"SHOWING THE PICTURES TO THE WORLD\" is the smallest part …" (ytc_UgyXYcoR-…)
- "Hot Take: both extreme sides of this argument are wrong. Charlie is right: Any…" (ytc_Ugx9TdPCc…)
- "Hey @marcelocolaco229, thank you for commenting! Looks like we've got some robot…" (ytr_UgwvPfFTK…)
- "You believe the robots are evil, but the human are the evil ones trying to merge…" (ytc_Ugy3FXPBj…)
- "Hey @Leeky_Bandsname, thanks for commenting! It's true, maybe we should start br…" (ytr_UgxeYSBsh…)
- "Me: “Make a glass of wine” / Ai: “here you go! 🍷” / Me: “now make it completely fu…" (ytc_Ugzcn88Xk…)
- "Lets have this conversation in 2036! Its happening right now. robotics have been…" (rdc_ohnch6v)
Comment
John seems like he's honest and appropriately concerned for someone believing AI poses such significant risks. Personally I disagree and suspect AI does not pose any existential risk, and I'd never advocate for going out and destroying stuff like that, but I gotta respect John standing up and trying to get the message across for what he believes in.

Source: youtube · Topic: AI Governance · Posted: 2025-08-14T20:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgwmQMx2YXqMK9rAwqd4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyAj1ODGdh9iwcsEaJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugxh9XJdqNH909ewyr14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxahXa1KD7SWsO1tTJ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_Ugzt4cId0Cya7suGIM54AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugzu_6S4UgMjf_nk9U54AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"fear"},
{"id":"ytc_Ugw03Hy_8JEyyqKwZ-l4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyBKr1k2kwXpxoa5cN4AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugzqdwi3JV_V4kgmeuB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugzgm57jBiilQTuMkmh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"}
]
```
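Programmatically, the look-up-by-comment-ID behavior amounts to parsing a raw response like the one above and indexing its records by `id`. A minimal sketch in Python, using two records excerpted from the response above (the variable and function names here are illustrative, not part of the actual tool):

```python
import json

# Excerpt of a raw LLM response: one coding record per comment.
raw_response = """[
{"id":"ytc_UgxahXa1KD7SWsO1tTJ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_Ugzgm57jBiilQTuMkmh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"}
]"""

# Index the records by comment ID so a look-up is a single dict access.
records = {rec["id"]: rec for rec in json.loads(raw_response)}

rec = records["ytc_UgxahXa1KD7SWsO1tTJ4AaABAg"]
print(rec["emotion"])  # → approval
```

Records that the model could not code cleanly still appear in the index (e.g. the second record carries `"reasoning": "unclear"`), so downstream consumers should treat `unclear`/`mixed` as valid category values rather than missing data.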