Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples:

- "Programme AI in such a way that it should make such grave mistakes that nobody w…" (`ytc_UgxQ-iyFE…`)
- "At the end of the day, you're not doing any of the work. I mean yeah, if you wan…" (`ytr_UgxmWy-4_…`)
- "Lmao, of course he would say this though. Who's gonna invest if people don't bel…" (`ytc_UgyFHKayr…`)
- "If you are reading this, please learn how to use AI to better do your job. You w…" (`ytc_UgySmsICL…`)
- "Why would you even allow AI to cross this line, never allow this to happen, as h…" (`ytc_UgwJpQI9H…`)
- "People will do other things like rioting, looting, murdering, and stealing. We w…" (`ytc_Ugixbnk4r…`)
- "This would be because women are suddenly killing themselves more frequently than…" (`rdc_gsoib09`)
- "This aint real cause the other two people were also ai or as i should say is tha…" (`ytc_UgwyB_uih…`)
Comment
Doing tests at level 4 catches headlines but in no way means we are close to widespread level 4 on the roads. The truth is there are complex legal and insurance issues, not to mention the safety issues arising from edge cases, that no one currently knows how to solve. Outside of sunny city streets where cameras perform well, it is doubtful that the current technologies would be able to operate at all in many common conditions and scenarios around the world. Factor in challenges in driver modelling and security vulnerabilities and you see the industry right at the beginning of where it needs to be, with many difficult challenges ahead. "Self driving cars are just around the corner"... people have been saying that for decades
Source: youtube · Posted: 2023-05-31T09:2… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
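Each coded comment carries the same four dimensions shown in the table above. A minimal validation sketch for a coded row is below; the allowed value sets are only inferred from the responses visible in this section, not an authoritative codebook:

```python
# Hypothetical codebook inferred from the values seen on this page;
# the real coding scheme may include additional categories.
CODEBOOK = {
    "responsibility": {"government", "company", "distributed", "none"},
    "reasoning": {"deontological", "consequentialist", "unclear"},
    "policy": {"regulate", "liability", "none"},
    "emotion": {"fear", "outrage", "lament", "resignation", "indifference"},
}

def validate(code: dict) -> list:
    """Return the dimension names whose value falls outside the codebook."""
    return [dim for dim, allowed in CODEBOOK.items()
            if code.get(dim) not in allowed]

# The coding result from the table above: fully in-vocabulary.
row = {"responsibility": "distributed", "reasoning": "consequentialist",
       "policy": "regulate", "emotion": "fear"}
print(validate(row))  # [] — every dimension is a known code
```

A non-empty return value flags rows where the model drifted outside the expected vocabulary, which is worth checking before aggregating counts.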
Raw LLM Response
[
{"id":"ytc_UgxDaa-zUxmSMT47yJF4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"lament"},
{"id":"ytc_UgzkXsrdHA3HAFI6_Xp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwxgSo0Cv6Twpqtys94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwiLwvP6FIXybedHRl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzAaeDZkw9YaicHSPV4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"indifference"},
{"id":"ytc_Ugygz-kJhRDk99oypfl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzmCG6frjlPLC2qpV14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugydh3M6d1KsZxKbIkB4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzAVG2JVDwn5YfVaGN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyOb06T_f_PezsLWXZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]