Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples — click to inspect:

- "Claude is a 50/50 tool. Half the time it makes you go twice as fast, half the ti…" (`rdc_oc0me5s`)
- "Establish legal liability of an artificial intelligence, and lawyers will take c…" (`ytc_UgylqqVes…`)
- "Surveilance belongs to law & order, not corporations. Also AI is too fucked to b…" (`ytc_Ugy1Q_n5Z…`)
- "An LLM is *always* just \"making stuff up as it goes\". Think of being on a game s…" (`ytc_UgzzK5jDY…`)
- "Some of these corporations are too cheap and too much in a hurry, to out-compete…" (`ytc_UgxOqjjj_…`)
- "It's interesting you mention that! The conversation with Sophia highlights the c…" (`ytr_UgxaaYbgN…`)
- "people can build robots to do the plumbing as well. I know we're enjoying AI now…" (`ytc_UgwKwY9q0…`)
- "I completely agree. The book sounds ridiculous due to the lack of meaningful evi…" (`ytr_Ugz-P8VsX…`)
Comment

> Am I the only one that thinks this could be a bad idea? Maybe it's because I don't live in a first world country and I'm already behind on technological advances, but the thought of having a self-driving car terrifies me. How is it that we've become so lazy in thinking and prefer looking at our phones rather than the road that cars have to be able to decide what to do themselves, instead of the person driving it? I don't know...I find it very scary. :(

Source: youtube · AI Harm Incident · 2015-12-08T18:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | mixed |
| Policy | ban |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UghZNZmPiTXBqHgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgiYd2aPdmFuwXgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UghjQp_qdloJpHgCoAEC","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugi9bU9RV8KAMngCoAEC","responsibility":"distributed","reasoning":"mixed","policy":"ban","emotion":"fear"},
{"id":"ytc_UgiWX5v86nep_ngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UghCffqvRi-dsngCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UghvGcEpMOllvXgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugg06jUn2zv-nHgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UghbRaH1SFgP8ngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UghnshjCvqPWxngCoAEC","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"}
]
```
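The raw response is a JSON array with one record per coded comment, keyed by the comment's platform ID. A minimal sketch of how such a batch could be parsed and indexed for the per-comment lookup shown above (the single record here mirrors the coded comment; the variable names are illustrative, not the tool's actual implementation):

```python
import json

# Raw batch response from the coding model: one record per comment.
raw_response = """
[
  {"id": "ytc_Ugi9bU9RV8KAMngCoAEC", "responsibility": "distributed",
   "reasoning": "mixed", "policy": "ban", "emotion": "fear"}
]
"""

records = json.loads(raw_response)

# Index records by comment ID so a single comment's coding
# can be retrieved in O(1), as an ID-lookup view would need.
by_id = {r["id"]: r for r in records}

coding = by_id["ytc_Ugi9bU9RV8KAMngCoAEC"]
print(coding["policy"], coding["emotion"])  # ban fear
```

Because the model may return malformed JSON or drop a record, a production version would typically wrap `json.loads` in error handling and verify that every requested comment ID appears in `by_id` before writing the coding result.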