Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a comment by its ID.
Random samples

- Well, if you let AI paint Starry Night and sell it for millions of dollars, that… (ytc_UgwSmwNfs…)
- I'm gonna be mighty miffed when A.I. takes over and I can't watch my weekly epis… (ytc_Ugy0A29ZP…)
- I'm really not worried about AI just deciding to "kill us all"...I'm worried abo… (ytc_UgwjtXYuq…)
- At school I was so bad that one of the art teachers complained that the only exp… (ytc_UgxQnNG9t…)
- But if everyone has a self driving car each car could dodge the first one… (ytc_UgjS60trI…)
- Looks like we've crossed the rubicon. Couple those skills with A.I. now were rea… (ytc_UgzU5Jnh-…)
- 13:00 My feeble human brain can't comprehend that computers can be wrong, we ign… (ytc_UgzR8e34v…)
- Notice he DIDNT say the CDC or FDA when he was talking about regulations of stuf… (ytc_Ugw5PJo4Q…)
Comment

> What hes saying is we are taking a technology that can evolve on its own without any help from humans. We have no idea what its thinking for lack of a better term, we do know its evolving faster than we could mitigate a reponse, so we are no longer in control. AI doesn't need our permission nor can we stop it, do we want to take that risk? Yes humans are that dumb 😅

youtube · AI Governance · 2025-10-22T17:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
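Each dimension in the table takes a value from a closed set of categories. A minimal validation sketch in Python, assuming the allowed value sets are exactly those observed in the responses on this page (the full codebook may define additional categories):

```python
# Allowed values per dimension, as observed in this page's sample responses;
# the actual codebook may include categories not seen here.
ALLOWED = {
    "responsibility": {"ai_itself", "distributed", "user", "none", "company", "government"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"none", "regulate", "industry_self", "ban"},
    "emotion": {"fear", "resignation", "indifference", "outrage"},
}

def validate_coding(coding: dict) -> list:
    """Return the dimensions whose value falls outside the observed codebook."""
    return [dim for dim, allowed in ALLOWED.items()
            if coding.get(dim) not in allowed]

# The coding shown in the table above passes validation.
result = {"responsibility": "distributed", "reasoning": "consequentialist",
          "policy": "regulate", "emotion": "fear"}
print(validate_coding(result))  # []
```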
Raw LLM Response
```json
[
{"id":"ytc_UgyI_y0f3SiR7UdVgDl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzUPsCoZvCpN6fCRq54AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugxn4xBjNtl1W7vOQdR4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"resignation"},
{"id":"ytc_UgywcES9tXrmxVx2prB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx_Zl9rfsXp94jvL1B4AaABAg","responsibility":"user","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyW4SK15Q3NT_UlN-Z4AaABAg","responsibility":"company","reasoning":"virtue","policy":"industry_self","emotion":"outrage"},
{"id":"ytc_UgxHZArT82SjsYeQRyJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzZwxPtsfeTTQ8Iw9d4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugxvoz7tbVC_XiaHVYR4AaABAg","responsibility":"government","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgxeB8qcHvg8cUJVPPp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
```
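The array above can be indexed by comment ID to support the lookup described at the top of the page. A minimal sketch in Python, using a two-entry excerpt of the raw response (the field names and IDs are copied from the sample above):

```python
import json

# Two entries excerpted verbatim from the raw LLM response above.
raw_response = """[
{"id":"ytc_UgyI_y0f3SiR7UdVgDl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzUPsCoZvCpN6fCRq54AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]"""

# Parse the JSON array and index the codings by comment ID for O(1) lookup.
codings = {row["id"]: row for row in json.loads(raw_response)}

coding = codings["ytc_UgzUPsCoZvCpN6fCRq54AaABAg"]
print(coding["responsibility"], coding["policy"])  # distributed regulate
```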