Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- Why didn't Ilya get an AI to write a better speech for him? This was a sloppy an… (`ytc_Ugw6I8YYK…`)
- What AI stans don't get is art isn't a product, it's a spectator sport. The diff… (`ytc_Ugy_wQeYi…`)
- I know this is anecdotal but it is true that developers are being replaced, just… (`ytc_UgweEN34Z…`)
- This was a tricky interview. Yudkowsky had some interesting thoughts but failed … (`ytc_UgxMLKfL5…`)
- Uhmmmm, am i the only one who thinks arming a robot with a tommy gun might not b… (`ytc_Ugz4biW0h…`)
- @slacproductions9595 i Work in German Film Business...right now... Nobody cares … (`ytr_Ugyqu63p6…`)
- You guys need to realize just because you don't see value in AI images or AI art… (`ytc_UgzzBdSZF…`)
- I don't think so, people like to pay for a comission of an artist they like, wit… (`ytr_UgxJS0ntj…`)
Comment
In court you would probably have to prove that the car has made an equal or better decision than a similar human interaction. This is a situation where someone must lose if one agent makes a mistake. Imagine if your car had the self driving technology, and you personally were driving the car (auto-pilot off). You would be liable every single time you got into an accident. It is guaranteed to assume the car can assess the situation better than you. This is quite the paradox of liability. The car now has more faculty than the one driving it.
youtube
AI Harm Incident
2016-01-06T03:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugj_f2_hIfbFIngCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgjuSAOvpXKjoXgCoAEC","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"mixed"},
  {"id":"ytc_UgjXkfuodsaTaXgCoAEC","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"indifference"},
  {"id":"ytc_UgiH4bJgUd72t3gCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugis_iWcr_zaLHgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"},
  {"id":"ytc_Ugg1rrdyzbR2AXgCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UghCPalsjYnrLHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgjHJF2WYdJEkngCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgiXr1C50oWCgXgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugh_wlHO5sE7gngCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
```
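A batch response like the one above has to be parsed and checked against the coding scheme before the per-comment results (the "Coding Result" table) can be stored. Below is a minimal Python sketch of that step; the `SCHEMA` vocabularies are inferred from the values visible on this page, and the function name `validate_codings` is a hypothetical helper, not the tool's actual API.

```python
import json

# Controlled vocabularies for each coding dimension, inferred from the
# values that appear in the table and raw response above (illustrative only).
SCHEMA = {
    "responsibility": {"user", "developer", "ai_itself", "distributed", "none"},
    "reasoning": {"deontological", "consequentialist", "unclear"},
    "policy": {"liability", "regulate", "none"},
    "emotion": {"outrage", "mixed", "indifference", "approval", "resignation"},
}

def validate_codings(raw: str) -> list[dict]:
    """Parse a raw LLM batch response and keep only rows that have an id
    and whose values all fall inside the controlled vocabulary."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        in_vocab = all(row.get(dim) in allowed for dim, allowed in SCHEMA.items())
        if in_vocab and "id" in row:
            valid.append(row)
    return valid

# Example: the second row uses "robot", which is not a valid
# responsibility code, so only the first row survives.
raw = json.dumps([
    {"id": "ytc_X", "responsibility": "user", "reasoning": "consequentialist",
     "policy": "liability", "emotion": "indifference"},
    {"id": "ytc_Y", "responsibility": "robot", "reasoning": "unclear",
     "policy": "none", "emotion": "mixed"},
])
print(validate_codings(raw))
```

Dropping out-of-vocabulary rows (rather than coercing them) keeps the stored codings auditable: anything the model emitted outside the scheme is left for manual review instead of being silently remapped.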