Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
Millenial and always polite with AI chatbot ,one day maybe will remember that i'…
ytc_UgxniuFFI…
There’s a difference between AI making something and AI being a tool to help mak…
ytc_Ugz015P1d…
This convinces me that COMPAS doesn't just need to be revised or "fixed", it nee…
ytc_Ugz3bDTAP…
Jump,jump,and off we go!
All I need is a loan from that government you c…
ytc_UgyySUdX9…
I just watched a driverless taxi bypass flagged for a construction site and drov…
ytc_UgxeIK_tJ…
Does it matter? Well, that depends - was the seller honest about the source of …
ytr_UgxReaDcP…
At first, there will be 2 classes of people: The owners of AI and the rest of h…
ytc_UgxgRHOSL…
People trying to defend AI art is gonna tell is the same as the cameras because …
ytc_Ugz_nZBEK…
Comment
The same accident would have happened if the car was on cruise control. The driver is still supposed to pay attention, occasional mistakes in software will always occur. Your argument about Tesla deploying the software before it's ready also shows your lack of understanding about the learning process and amount of training data required to train these algorithms. By far the safest way to deploy the technology is by releasing beta versions and collecting as much data as possible, improving the software incrementally. Tesla will without a doubt be the first company to have safe full self driving cars, which will decrease the amount of traffic deaths by a lot.
youtube
AI Harm Incident
2023-01-06T16:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[{"id":"ytc_UgxIYtMKgczcqi80WP94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwgfGFA3gwkcGpjisZ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugwl_hPP_MFr2jRpgfV4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwTon5T35b9e_tRRTF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzKp3T8pvKbxYZ3EMF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwQg-kbAoVjtGZWPP54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzzH2uuodXCgwmz0kF4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugx3FfYVAZsd1eOF0u54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzWQxsLxtdvM1qnlx14AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxmGYRbmFAqcnQm6jd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"}]
```
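The raw response above is a JSON array of coded records, one per comment. A minimal sketch of how such output could be parsed and validated before being stored (the allowed value sets below are inferred only from the values visible above; the actual codebook may define more categories, so treat `SCHEMA` as an assumption):

```python
import json

# Allowed values per coding dimension. These sets are ASSUMED from the
# examples shown above, not taken from the tool's real codebook.
SCHEMA = {
    "responsibility": {"none", "user", "company", "ai_itself"},
    "reasoning": {"consequentialist", "deontological"},
    "policy": {"none", "liability", "regulate"},
    "emotion": {"indifference", "outrage", "fear"},
}

def validate_records(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed coded records."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        # Every record needs a comment ID to be looked up later.
        if not isinstance(rec, dict) or "id" not in rec:
            continue
        # Keep the record only if every dimension has an allowed value.
        if all(rec.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            valid.append(rec)
    return valid

raw = ('[{"id":"ytc_x","responsibility":"user","reasoning":"consequentialist",'
       '"policy":"none","emotion":"indifference"}]')
print(len(validate_records(raw)))  # → 1
```

Records that fail validation (missing ID, unknown category) are dropped rather than repaired, so a downstream recount of valid rows doubles as a sanity check on the model's adherence to the coding scheme.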