Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
Until we have unity, meaning, the oneness of mankind, and cooperation between pe…
ytc_UgxojBBEH…
I honestly can't think of a job AI wont be able to replace in five years. 😢…
ytc_UgxTsba-N…
honestly, i wouldnt even mind if AI as digital life 2.0 would replace (kill all)…
ytc_UgzFYZ2sQ…
I don't think we will reached at that point. We are already reaching a significa…
ytr_UgwBxMlYM…
You're writing this comment now after the whole studio Ghibli meme tread? Where …
ytr_UgzZv_lSf…
As a transitional and digital AND 9 yr old artist , art is js art? No matter wha…
ytc_Ugy1OFYlz…
The comments throw up some good points. you should read them instead of classify…
ytc_Ugwvwu3r5…
_The funniest thing to me about the extreme 'AI' hypebros is that they are mostl…
ytr_UgzBp51Tq…
Comment
Seeing how many miles have been travelled by prototype autonomous vehicles, with just one fatality and very few other non-fatal accidents, it seems to me that they are already safer than human drivers.
The other thing one needs to ask is what would have happened if the car had been driven by a human in that exact same situation. I don't think there is a solid argument to be made that a human would have reacted better, though of course that is something yet to be determined by the enquiry.
An autonomous vehicle doesn't need to be perfectly safe; it only needs to be as safe as or safer than human drivers to be acceptable. Once that is achieved (which it probably already is), there will be steady progress towards being much safer than human drivers.
youtube
2018-03-21T18:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_Ugy0iRDurE5kbF4u47t4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxUw_ET2UI1DH2_rDl4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugzirg4tn7e-n962m7p4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugxl7LgqTG_8lxdhl_t4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgxV8ZzJyidy7LHYPH94AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"industry_self","emotion":"resignation"},
{"id":"ytc_UgxQ7IMYr_Toyf73iRN4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwFQTYdACGwhyoXqDd4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzexROj7pjdcfp_2Hd4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugw3XOT-yQgUw7lYYRd4AaABAg","responsibility":"company","reasoning":"mixed","policy":"liability","emotion":"mixed"},
{"id":"ytc_UgwHFJjq3Ja3_hjQjfJ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"}
]
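The "look up by comment ID" step above can be sketched as follows. This is a minimal illustration, not the tool's actual implementation: `index_by_id` is a hypothetical helper, and the two sample records are copied from the raw response shown above.

```python
import json

# Raw LLM response: a JSON array of coded records, one per comment,
# shaped like the output shown above (assumption: this shape is stable).
raw_response = """[
 {"id":"ytc_Ugy0iRDurE5kbF4u47t4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
 {"id":"ytc_UgxUw_ET2UI1DH2_rDl4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"}
]"""

def index_by_id(raw: str) -> dict:
    """Parse a raw LLM response and key each coded record by its comment ID."""
    records = json.loads(raw)
    return {rec["id"]: rec for rec in records}

codes = index_by_id(raw_response)
# Inspect the coding dimensions for one comment ID.
print(codes["ytc_Ugy0iRDurE5kbF4u47t4AaABAg"]["reasoning"])  # consequentialist
```

Keying the parsed array by `id` turns the raw response into the same dimension-by-dimension view the "Coding Result" table renders for a single comment.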