Raw LLM Responses
Inspect the exact model output for any coded comment, either by looking up a specific comment ID or by browsing the random samples below.
Random samples

- "10:57 who says that the robot did it themselves? it could be programmed to be tw…" (ytc_UgyYJ_pBN…)
- "On the other hand, I was just listening to Hank Green saying AI is just a huge s…" (ytc_UgxqWneRb…)
- "95% of AI pilots fail, AI is just as expensive to run as it is to pay employees,…" (ytc_UgwXtoEM8…)
- "ngl the OG image goes way harder then any of the parodies, which is ironic becau…" (ytc_UgwV5i0mV…)
- "I even noticed that little yellow car was going to do something because he was d…" (ytc_Ugz9GcfUp…)
- "@MayankPawar-h2k5u Yeah they just made a deal with the government. For the AI wh…" (ytr_UgxzSxj-Z…)
- "Your profile pic looks AI-generated. Under the eyes, there are lines, one fits g…" (ytr_UgwD5eCoE…)
- "Personally, I find Ai art fun to fiddle with. It's neat to throw prompt after pr…" (ytc_UgwDHvSTD…)
Comment

> Imagine lawsuits like these for Tesla owners when robotaxi update happens, the owners will be responsible for accidents

Platform: youtube · Category: AI Harm Incident · Posted: 2025-04-25T01:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response

```json
[
{"id":"ytc_Ugx0L1qHglX3pK9gPYd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugyz4uJIB8QA94SJXJh4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugy7UTlST1n9Yn0_Dnl4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwUgmr3H29k7l5eOFB4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxsMR8mTA-DjIx5ilJ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugyb7iz4khmOa1kskUZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxIUrI3kcZaz93ppHp4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugw8cUF7UZR4JVo9ry94AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgyjhfHKMc210FQq_yl4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugzil9Bn_RkffBN3O-d4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"}
]
```
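Responses like the batch above can be parsed and indexed by comment ID in a few lines. The sketch below is a hypothetical helper, not the tool's actual code, and the `CODEBOOK` sets are assumptions inferred only from the dimension values visible on this page (the real codebook may define additional categories).

```python
import json

# Allowed values per coding dimension — assumed from the values seen in the
# samples on this page; the real codebook may include more categories.
CODEBOOK = {
    "responsibility": {"user", "company", "ai_itself", "distributed"},
    "reasoning": {"consequentialist", "deontological"},
    "policy": {"none", "ban", "regulate", "liability"},
    "emotion": {"indifference", "outrage", "approval", "fear"},
}

def validate_batch(raw: str) -> dict:
    """Parse a raw LLM batch response and index rows by comment ID,
    rejecting any row whose dimension value falls outside the codebook."""
    rows = json.loads(raw)
    indexed = {}
    for row in rows:
        for dim, allowed in CODEBOOK.items():
            if row[dim] not in allowed:
                raise ValueError(f"{row['id']}: bad {dim}={row[dim]!r}")
        indexed[row["id"]] = {k: v for k, v in row.items() if k != "id"}
    return indexed

# Single-row example (hypothetical ID) matching the coding shown above.
raw = ('[{"id":"ytc_example","responsibility":"user",'
       '"reasoning":"consequentialist","policy":"liability","emotion":"fear"}]')
coded = validate_batch(raw)
print(coded["ytc_example"]["policy"])  # liability
```

Indexing by ID is what makes the comment-ID lookup on this page cheap: one parse, then constant-time retrieval per comment.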