## Raw LLM Responses

Inspect the exact model output for any coded comment, or look up a comment by its ID.

### Random samples
- "In 2014, I worked for a company that over a couple years grew through acquisitio…" (ytc_UgxVWEc75…)
- "Giving an ai who is smarter than at least 60% of humans a weapon, What could go …" (ytc_Ugx3vGuBO…)
- "Corporations run by human DEMONS prosper gor the Few global elite families wile …" (ytc_UgwL8TcpB…)
- "The Greys are the ones who brought Artificial Intelligence and Israel is head of…" (ytc_Ugxq2Ala9…)
- "If you've ever tried to do anything with AI, all I can say is duh.…" (ytc_UgyiOosz-…)
- "What YC's Garry Tan is getting wrong is that in the end there is simply no job l…" (ytc_UgwoC1zJ2…)
- "AI can also FOOL YOU... AI and robots wont be here till 2050... People are SO Ea…" (ytc_UgwsCMZ-K…)
- "Its basically like this. Using AI is when you eat a food when your not feeling h…" (ytc_Ugws3lBNr…)
### Comment

> @8:44 I think if we use AI to build AI, then yes the risk is much higher

youtube · AI Governance · 2025-06-16T15:4…
### Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
### Raw LLM Response
```json
[
  {"id":"ytc_UgzQYp7LldfDQpXIB1N4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwxynsaLW9GwlkJ0cp4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgztTxpnv_AVYBeSdIN4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgzZxp3ME8MVulm45h14AaABAg","responsibility":"developer","reasoning":"virtue","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugwtikf5pA45GJyA_Kt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugx6O_NEdBP9RtI1fPN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyFOFt216HvwEWW4Ql4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxEksigG6DmJBrT9qh4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugwf4LGCN1zVsCM_oRV4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugwn72rigG97fCrylpB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
```
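The raw response is a JSON array with one object per comment, keyed by `id` and carrying the four coding dimensions shown in the table (responsibility, reasoning, policy, emotion). A minimal sketch of parsing it and indexing by comment ID, as the lookup at the top of the page does; the two rows are copied from the response above, and `codes_by_id` is an illustrative name, not part of the tool:

```python
import json

# Raw LLM response: a JSON array of coded comments (two rows copied from above).
raw = '''[
  {"id": "ytc_UgzQYp7LldfDQpXIB1N4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugwf4LGCN1zVsCM_oRV4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]'''

# Index the coded rows by comment ID for constant-time lookup.
codes_by_id = {row["id"]: row for row in json.loads(raw)}

# Look up the comment coded in the table above.
code = codes_by_id["ytc_Ugwf4LGCN1zVsCM_oRV4AaABAg"]
print(code["policy"], code["emotion"])  # regulate fear
```

The `policy` and `emotion` values returned here match the "Coding Result" table for this comment.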