Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "Aiart is so annoying, and it’s even more annoying that ai could be used for somt…" (ytc_UgzUplabB…)
- "@zdspider6778 Apologies for the confusion. As an AI language model, I do not hav…" (ytr_Ugx0iYuwP…)
- "If AI cannot achieve consciousness but becomes almost infinitely computationally…" (ytc_UgxcnNsft…)
- "i listen to a lot of podcasts and this one was the most eye opening out of any i…" (ytc_Ugyom0p61…)
- "AI literally can't make tutorials to draw because it doesn't draw. It fills in p…" (ytr_UgzSSgFSy…)
- "If autonomous is the future, the question doesn't become how to protect trucking…" (ytc_UgyAElhuk…)
- "My issue is that NYT is pushing for ALL conversations for EVERY user be submitte…" (ytc_UgwwSFbB9…)
- "You couldn't get me into a self driving car, Not in a million years oh no, no wa…" (ytc_UgzfwezMl…)
Comment

> I guess if you were to implement an automated driving system, you would need LIDAR on the poles or LIDAR poles that inform the central processor of the automobile, so that obstacles are acknowledged. Just the same, a man driving six hours a day to get to work and return greatly increases the chances of being involved in an accident. Is there any chance of reducing that probability? You have to wonder.

youtube · AI Harm Incident · 2024-12-15T00:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | consequentialist |
| Policy | industry_self |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgzqEV4yT0aVnKybUuV4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugy05XBUcoOZd3pAL1p4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugxbiph27soTmH5kRfR4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugy2slWTY2HWoO4mp9d4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzQ8D0k3UaIP4qECEd4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyU0Idc7l0q4HbBwch4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwBMko-9JWHDnXlS514AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzGlFU3Pw7VlBtSnCJ4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"industry_self","emotion":"mixed"},
{"id":"ytc_UgwQTz9powzUITbYhwh4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyMQ80iEicXIiufX6N4AaABAg","responsibility":"user","reasoning":"mixed","policy":"none","emotion":"mixed"}
]
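The raw response above is a JSON array with one object per coded comment, carrying an `id` plus the same four dimensions shown in the Coding Result table. A minimal sketch of parsing and shape-checking such a response (the key names are taken from the response above; the helper name and the strictness of the checks are assumptions, not part of the tool):

```python
import json

# The four coding dimensions seen in the responses above.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response and validate its shape.

    Raises ValueError if the payload is not a JSON array of objects
    each carrying an "id" plus all four coding dimensions.
    """
    records = json.loads(raw)
    if not isinstance(records, list):
        raise ValueError("expected a JSON array of coded comments")
    for rec in records:
        missing = [k for k in ("id", *DIMENSIONS) if k not in rec]
        if missing:
            raise ValueError(f"record {rec.get('id', '?')} missing {missing}")
    return records

# Example with one record in the same shape as the response above:
sample = ('[{"id":"ytc_x","responsibility":"user",'
          '"reasoning":"consequentialist","policy":"none","emotion":"fear"}]')
coded = parse_coding_response(sample)
print(coded[0]["emotion"])  # fear
```

Indexing the returned records by `id` is then enough to look up any coded comment directly.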