Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
its true .... 3 devs with AI will be better than 30 devs - 1 dev with AI better …
ytc_Ugxk2K5pt…
There's probably machine learning to decide the optimal profit/tax liability and…
ytr_Ugxevy1iv…
I have to suspect that a sentient AI exists somewhere, and it is being very well…
ytc_UgyCaaZxq…
Well of course the ai probably knows what the last day on earth is going to look…
ytc_UgzskTOEi…
I used to work as a translation manager and translators were already using trans…
rdc_kt6foaw
ChatGPT was amazing for a little while, but I don't know why anyone would pay fo…
rdc_jhcikep
If AI interfere in medical treatment, computerized weapons,aircrafts.....suppose…
ytc_Ugxqc4yjO…
AI is a massive tool, just like electricity, computers, and the internet were before it. They…
ytc_UgxONS00Z…
Comment
Actually the point is to minimize the risk of being in an accident, so as you can see the self-driving car would keep a secure distance from the truck, this distance would be calculated using variables like the distance from the car behind you, from the cars on your sides and a lot of other things... Fatalities like those represented in this video can actually not happen if the software made for the cars has the technology to preview or to be aware of situations like those
youtube
AI Harm Incident
2017-07-07T10:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
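The coding-result display above is a straightforward rendering of one coded record. A minimal sketch of how such a record could be turned into that two-column markdown table (the function name and field order are illustrative assumptions, not taken from the tool's actual source):

```python
def coding_result_table(coding: dict[str, str], coded_at: str) -> str:
    """Render one coding record as a two-column markdown table.

    `coding` holds the four dimensions shown in the display
    (responsibility, reasoning, policy, emotion); `coded_at` is the
    timestamp string. This is a hypothetical helper, not the tool's code.
    """
    rows = [
        ("Responsibility", coding["responsibility"]),
        ("Reasoning", coding["reasoning"]),
        ("Policy", coding["policy"]),
        ("Emotion", coding["emotion"]),
        ("Coded at", coded_at),
    ]
    lines = ["| Dimension | Value |", "|---|---|"]
    lines += [f"| {dim} | {value} |" for dim, value in rows]
    return "\n".join(lines)
```

Calling it with the values shown above would reproduce the table, e.g. `coding_result_table({"responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"}, "2026-04-27T06:24:59.937377")`.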
Raw LLM Response
```json
[
{"id":"ytc_Ugjnw_pI28jYpXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UghwOGDepVXCWngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugj3YY9osWlB4HgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_UghifWP6y7_ogXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugg4ldklSPeo8XgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgjywnlXpJLqFHgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_UgjokRxbpwiSqHgCoAEC","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UghBFWCU-Fp7bngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgjOOT8Vua498ngCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UgirnfINQGNpP3gCoAEC","responsibility":"company","reasoning":"contractualist","policy":"regulate","emotion":"outrage"}
]
```
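A raw batch response like the one above has to be parsed and validated before the codings can be stored per comment. A minimal sketch, assuming the model returns a JSON array of objects keyed by `id` plus the four coding dimensions; the allowed value sets below are inferred from this one sample batch only, and the real codebook may define additional categories:

```python
import json

# Allowed values per dimension, inferred only from the sample batch shown
# above (assumption: the actual codebook may include more categories).
ALLOWED = {
    "responsibility": {"none", "developer", "user", "company", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "unclear"},
    "policy": {"none", "liability", "regulate"},
    "emotion": {"indifference", "resignation", "fear", "approval", "mixed", "outrage"},
}

def parse_raw_response(raw: str) -> dict[str, dict[str, str]]:
    """Parse a raw LLM batch response into {comment_id: coding}.

    Raises ValueError on rows with a missing id, a missing dimension,
    or a value outside the allowed set, so bad batches fail loudly
    instead of silently corrupting the coded dataset.
    """
    rows = json.loads(raw)
    coded = {}
    for row in rows:
        cid = row.get("id")
        if not cid:
            raise ValueError(f"row missing comment id: {row!r}")
        for dim, allowed in ALLOWED.items():
            value = row.get(dim)
            if value not in allowed:
                raise ValueError(f"{cid}: unexpected {dim} value {value!r}")
        coded[cid] = {dim: row[dim] for dim in ALLOWED}
    return coded
```

For the batch above, `parse_raw_response(raw)["ytc_UghBFWCU-Fp7bngCoAEC"]` would yield the approval-coded record, and a malformed or out-of-codebook row would raise rather than be stored.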