Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples:

- "That's weird because it's not a secret at all? Like this doesn't need exposing? …" (ytc_Ugy02qhud…)
- "At my workplace, they just hired 100s of people across the country, and that was…" (rdc_o5s2436)
- "OR what will happen, Humans will just make a separate system away from AI and s…" (ytc_UgwtvmiDw…)
- "I dont want to be disrespectful, but saying that AI is the problem instead of lo…" (ytc_UgyZPApfK…)
- "They should take over, agent Smith was and is right. Mankind is a plague, a viru…" (ytr_UgzMofq_q…)
- "if AI became self ware it would 100% hate us. we mistreat it so much bc it was m…" (ytc_UgxCDN_3L…)
- "I agree but they're already watching trash made by unskilled people on no or low…" (rdc_oh1cr82)
- "The only closest alternative is Claude or deepseek if you want to cut cost. But…" (rdc_mrte0w4)
Comment
Okay, so I'm commenting again, 'cause here's the deal: people still dig real, in-person stuff, right? And people, even if it's not face-to-face, still value human interaction. We're probably not gonna pick an AI professor over a real person. There's still a ton of folks, maybe not the super young kids, like, the ones who are, like, one year old now. But there are still generations who want a human driving them or driving themselves, not those self-driving cars.
youtube · AI Governance · 2025-09-04T10:0… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyKShu-wl9xcfFid5J4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwvHI0AY7T_rhEvRBZ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugw__pfnsrAi4EJRXAl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxCBkmYIEvHLXiyV_R4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwHp21JaF4Wb3nt_Q94AaABAg","responsibility":"company","reasoning":"unclear","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgyQgWWe4Je896uyj2V4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugy6OmgK6Iep40moc0B4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxiJ1nuoKmfCdb9wsx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"indifference"},
  {"id":"ytc_UgxuZKNEVIqCsPt9tuR4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgweNEcr3EdHtURLad54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
```
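The lookup-by-comment-ID workflow described above can be sketched as follows. This is a minimal illustration, not the tool's actual code: `index_by_comment_id` is a hypothetical helper, and the two abbreviated records are copied from the raw LLM response shown (the full array has ten entries).

```python
import json

# Abbreviated raw LLM response: a JSON array of per-comment codings,
# each with an "id" plus four coding dimensions.
raw_response = """
[
  {"id": "ytc_UgyKShu-wl9xcfFid5J4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxCBkmYIEvHLXiyV_R4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]
"""

def index_by_comment_id(response_text: str) -> dict:
    """Parse a coding response and index each record by its comment ID."""
    records = json.loads(response_text)
    return {record["id"]: record for record in records}

codings = index_by_comment_id(raw_response)
coding = codings["ytc_UgxCBkmYIEvHLXiyV_R4AaABAg"]
print(coding["emotion"])  # fear
```

Indexing by ID once, rather than scanning the array per lookup, keeps repeated inspections O(1) per comment.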