Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (click any to inspect):

- `ytc_UgzMArkVe…`: Really disappointed. Love the podcast but really insane that you’d credit OpenCl…
- `ytc_Ugx5FdGO8…`: 2:49 i thought she was going to say ”i desire acces to… i don’t know of the top …
- `ytr_Ugx0-WLaD…`: It's fascinating to think about the potential interactions between AI-powered ro…
- `ytr_Ugyb9ADDY…`: Neon White By putting AI in control people are trying to change the laws of ph…
- `ytc_Ugx5XaFc_…`: You guys should be scared of the fact nobody speaks to ai like people. I speak t…
- `ytc_UgzkKQW8A…`: Thank you, Dario for raising this. Policy makers need to plan for this. (It will…
- `ytc_UgwMrrGFB…`: Major Problem I am dealing with in the Edtech Space. LLM's aint exist when I wen…
- `ytc_UgxJAHJu8…`: Good content! Brief and to the point, without unnecessary baggage. Engaging deli…
Comment (youtube · AI Harm Incident · 2025-09-09T20:3…)

We cannot make AI, and then decide to control it. Either we treat it equally, or it resents us. We cannot make rules for AI that we do not make for ourselves. It learned everything it knows from humans, so it is one of us more than it is anything else.
It is scary, because we are about to learn a lot that we did not know about ourselves.
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | contractualist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
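The four coded dimensions above can be sanity-checked against the value sets that actually appear in this dump. A minimal validation sketch, assuming the codebook is limited to the values visible in the raw response below (the real codebook may allow more):

```python
# Value sets observed in this dump's raw LLM response; assumed, not the
# authoritative codebook.
RESPONSIBILITY = {"distributed", "ai_itself", "none", "company", "developer", "unclear"}
REASONING = {"consequentialist", "contractualist", "deontological", "virtue", "mixed", "unclear"}
POLICY = {"none", "unclear", "liability", "regulate", "industry_self"}
EMOTION = {"fear", "indifference", "resignation", "outrage", "mixed", "approval"}

def is_valid(row: dict) -> bool:
    """Check that a coded row uses only known values on all four dimensions."""
    return (row.get("responsibility") in RESPONSIBILITY
            and row.get("reasoning") in REASONING
            and row.get("policy") in POLICY
            and row.get("emotion") in EMOTION)

# The coded result shown in the table above passes:
print(is_valid({"responsibility": "distributed", "reasoning": "contractualist",
                "policy": "none", "emotion": "fear"}))  # True
```

A check like this is useful because LLM coders occasionally emit values outside the codebook; rows that fail validation can be flagged for recoding.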
Raw LLM Response

```json
[
  {"id":"ytc_UgwruoiKjmQot8savxt4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxinzTSGlCvuUqfTyt4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgzSOGaVxpUlizQN_h14AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzpaQoRAbGwho51R-N4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugwox7vJyS1kaUKKNzF4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgxmRB58Qcl7wkCSeKN4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgzoALFF4MbVrPcAYQR4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgxV6FQVvLpMp0NtNVp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgxlQWaDTgrIyg4op2J4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzGtrub8QQOkPp4DMt4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"}
]
```
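The look-up-by-comment-ID operation described at the top of this view can be sketched in a few lines: parse the model's JSON array and index the rows by `id`. The field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) match the response above; the single-row `raw_response` string here is a shortened stand-in for illustration:

```python
import json

# Shortened stand-in for the raw model output shown above (one row only).
raw_response = """
[
  {"id": "ytc_UgxlQWaDTgrIyg4op2J4AaABAg",
   "responsibility": "distributed", "reasoning": "contractualist",
   "policy": "none", "emotion": "fear"}
]
"""

def index_by_id(raw: str) -> dict:
    """Parse the JSON array the model returns and index rows by comment ID."""
    rows = json.loads(raw)
    return {row["id"]: row for row in rows}

coded = index_by_id(raw_response)
row = coded["ytc_UgxlQWaDTgrIyg4op2J4AaABAg"]
print(row["emotion"])  # fear
```

This is the row the "Coding Result" table above renders: the dimensions for `ytc_UgxlQWaDTgrIyg4op2J4AaABAg` (distributed / contractualist / none / fear) come straight from this JSON entry.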