Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
No chance..need a human brain to make that subjective decision whether to swerve or stop in front of the object and not cause further damage or a knock on affect accident from behind or from the side. Need a 360 view 🧐🧐in order to assess the variables in order to make that SPLIT decision..what about avoiding someone that’s drunk or road rage ramming of your car or a tree falling or extreme weather event...or a pothole barely visible...serious work on all the above algorithms🤦♂️ cars on trams will come first before autopilot
| Field | Value |
|---|---|
| Platform | youtube |
| Topic | AI Harm Incident |
| Posted | 2021-06-06T13:5… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugza7iydaj4dy-NF0T94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyVsXyECLUuYoXTco94AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxUQC6jqjortfT9us54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyjRd-ObvHtL-Nodh94AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxJP_4d4HPdENID9Np4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgyyybX9PrrkfE2ZHTB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxCRgJWC2ssZqiPT0l4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugzvlm-_fwD3TMiQHep4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgyHfwUiIdLnxRLAfmJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyU3WZEMu0FBLBaS6h4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"}
]
```
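A raw response like the one above can be parsed and sanity-checked before its codes are written into a result table. A minimal sketch in Python, assuming the allowed values per dimension are exactly those seen in this sample (the real codebook may define more):

```python
import json

# Allowed codes inferred from this sample response only; an assumption,
# not the authoritative codebook.
ALLOWED = {
    "responsibility": {"ai_itself", "user", "company", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue"},
    "policy": {"none", "regulate", "liability"},
    "emotion": {"indifference", "approval", "fear", "mixed", "outrage"},
}

def index_by_id(raw_response: str) -> dict:
    """Parse a raw LLM response (JSON array of coded comments) and
    index the rows by comment ID, validating every coded dimension."""
    rows = json.loads(raw_response)
    coded = {}
    for row in rows:
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row['id']}: bad {dim}={row.get(dim)!r}")
        coded[row["id"]] = row
    return coded

# Hypothetical one-row response for illustration.
raw = '''[
 {"id":"ytc_A","responsibility":"ai_itself","reasoning":"consequentialist",
  "policy":"none","emotion":"fear"}
]'''
print(index_by_id(raw)["ytc_A"]["emotion"])  # fear
```

Indexing by ID mirrors the page's "look up by comment ID" workflow: once indexed, any comment's coded dimensions can be fetched in constant time.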