Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:
- "Sorry your wrong. AI is a tool its not an answer. It requires the correct infor…" (`ytc_UgwtGJb91…`)
- "@Squad wipes™ Didn't Will Smith do a robot movie kind of like that ? "I Robot" c…" (`ytr_UgydbbllI…`)
- "Tesla FSD is Tesla's fully autonomous driving tech, and it is performing remarka…" (`ytr_UgygC_qYc…`)
- "The AI just needs to be programmed to remember it's place. Always inferior to, a…" (`ytc_UgwicHro6…`)
- "动态网自由门 天安門 天安门 法輪功 李洪志 Free Tibet 六四天安門事件 The Tiananmen Square protests of 1989 …" (`rdc_gxrbr0f`)
- "although i don't like ai "art" and for normal things i don't think it makes sens…" (`ytc_UgxHDaqoV…`)
- "Customer service applications? None of this would have started had it not been f…" (`ytc_Ugz43h9oE…`)
- "Just look at measles now and how it started to come back more when anti vaxxers …" (`rdc_g9txq2k`)
Comment
The scenario is not realistic... the speed of the boxes should be taken into consideration and the car will stop in time without hitting anyone.. not to mention that other cars will give way. Not to mention, there will be no bikes. Not to mention safety distance taken by a robust is far larger than a human, not to mention low speed and safe driving rules programmed.
I would go as far as saying driverless vehicles will be at least 100 times safer than human drivers.
Source: youtube · Incident: AI Harm Incident · Posted: 2018-12-07T00:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgyReg2RJcQbRU8fXqh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzFCEuEWdDiAtznUXV4AaABAg","responsibility":"ai_itself","reasoning":"contractualist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugwg9zPgDxoVHbvC0MR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxyNM-AKHsF-2MXWWt4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugx8Kcl0btcr5I4ySJx4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz6kL7XHAZLi2NywJZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwgRIau2zSrD54ZIb14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwgZBpAS47AyZs-L4R4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwgfBkZSlB2KJ9236h4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzgpFP6xoDLiHG7IYB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"approval"}
]
```
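The raw response is a flat JSON array, one object per coded comment, with the four dimensions shown in the table above (responsibility, reasoning, policy, emotion). A minimal sketch of how such a batch could be parsed and indexed by comment ID for lookup — the `index_codings` helper and the two-row sample payload are illustrative, not part of the actual pipeline:

```python
import json

# Illustrative two-row batch response in the same shape as the raw output above.
raw_response = """
[
  {"id": "ytc_UgyReg2RJcQbRU8fXqh4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzgpFP6xoDLiHG7IYB4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "approval"}
]
"""

DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codings(raw: str) -> dict:
    """Parse a batch response and index codings by comment ID."""
    codings = {}
    for row in json.loads(raw):
        # Keep only rows that carry all four expected dimensions.
        if all(dim in row for dim in DIMENSIONS):
            codings[row["id"]] = {dim: row[dim] for dim in DIMENSIONS}
    return codings

by_id = index_codings(raw_response)
print(by_id["ytc_UgzgpFP6xoDLiHG7IYB4AaABAg"]["policy"])  # liability
```

Indexing by ID is what makes the "inspect any coded comment" lookup cheap: one parse per batch, then constant-time retrieval per comment.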