Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "What disturbs me about AI that we already see is the programming of the base is …" (ytc_UgwelFs3b…)
- "I have seen the argument "use AI to generate a reference image for an artist" an…" (ytc_Ugw6la5_b…)
- "If they can successfully do that the wind farms might not be as vital. Nuclear e…" (rdc_eue1w9i)
- "Now that AI can reason, it is impossible to corrupt/poison it (the opposite will…" (ytc_Ugx812Tr_…)
- "Yes X-ray glasses but for real…. Like AI generates what they will likely look li…" (rdc_k7kp6zj)
- "If they want to make robots more human like but also have them made to do our jo…" (ytc_Ugy7OWDiV…)
- "I might have liked this as a teen (it wasn't a thing when I was a teen), but as …" (ytc_Ugwjw6N7I…)
- "It irritates me that videos like this spread panic under the premise of certain …" (ytc_UgzB_iftP…)
Comment
I think there is three simple ways to solve this dilemma.
1. Buyers of cars must choose options of car's behavior while they buying on their new car. This options must be documented with signatures and witness. So that responsibility will be on car owner.
2. Car companies should build cars which looks the same, but programmed in different ways. Manufacturers should build some cars with drivers life priority, and some other with other's lifes priority. So buyers can simply choose lovely car with preferred behavior. So that responsibility will be on car owner.
3. If all cars would be self-driving, than they can communicate with each other (but sadly can't communicate with pedestrian). And they could simply prevent deadly accidents
Moreover different countries and cities might create policies benefiting those, who drive cars with other's lifes priority
youtube · AI Harm Incident · 2022-06-08T12:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | contractualist |
| Policy | liability |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_Ugxi9edyRH6MBe-gmlR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgyHzAmobw_w11Mnb-d4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgxyiKB7SdSvnhxtM-N4AaABAg","responsibility":"user","reasoning":"contractualist","policy":"liability","emotion":"approval"},
{"id":"ytc_Ugz4tLk_cr4X5Hr_e5Z4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwDx5EZ27hVBDO-G3J4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_Ugy52cuaNxv0pCVCSnh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzbIlfqTdOKTNc9ph14AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugxh5MKYmtMPkv_l1S94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzlSt_4SUBX-NmdT0p4AaABAg","responsibility":"user","reasoning":"contractualist","policy":"liability","emotion":"approval"},
{"id":"ytc_Ugx2arFYsbTn-QoyeUF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
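A coding run like the one above can be consumed programmatically by parsing the raw JSON array and checking each record against the codebook. The sketch below is a minimal example; the allowed value sets are inferred only from the sample output shown here, so the real codebook may define additional categories, and the record ID used in the usage line is hypothetical.

```python
import json

# Allowed values inferred from the sample response above; the actual
# codebook may include more categories (illustrative assumption).
ALLOWED = {
    "responsibility": {"none", "user", "developer", "ai_itself"},
    "reasoning": {"unclear", "consequentialist", "contractualist"},
    "policy": {"unclear", "none", "liability"},
    "emotion": {"approval", "resignation", "indifference", "mixed"},
}


def parse_coding(raw: str) -> list[dict]:
    """Parse a raw LLM coding response and validate every record."""
    records = json.loads(raw)
    for rec in records:
        # Every record needs an id plus the four coding dimensions.
        missing = {"id", *ALLOWED} - rec.keys()
        if missing:
            raise ValueError(f"{rec.get('id', '?')}: missing fields {missing}")
        for dim, allowed in ALLOWED.items():
            if rec[dim] not in allowed:
                raise ValueError(f"{rec['id']}: unexpected {dim} value {rec[dim]!r}")
    return records


# Hypothetical single-record response for illustration.
raw = (
    '[{"id":"ytc_example","responsibility":"user",'
    '"reasoning":"contractualist","policy":"liability","emotion":"approval"}]'
)
coded = parse_coding(raw)
print(coded[0]["responsibility"])
```

Validating at parse time means a malformed or off-codebook LLM response fails loudly instead of silently contaminating downstream counts.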