Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- `rdc_ohzfik5`: "We should remember that this is a depiction of how human beings think AI might b…"
- `ytc_UgyJP1CXI…`: "Our society is a shared fertile soil, from which every person and company both e…"
- `ytc_UgzigWCBm…`: "I hate generative ai, I want ai to clean my house for me so I can be on social m…"
- `ytc_UgxahluPj…`: "You are the most eloquent challenger of Tesla Autopilot i heard so far. Interest…"
- `ytc_UgyYCKQ86…`: "Competition is for idiots. No capitalist goes for market share, they want domina…"
- `ytc_UgwPhGBoy…`: "But it is already to late. Hindsight is a nice car. Just be nice to the ai I gue…"
- `ytc_UgyXJ5Vm5…`: "AI will write code for sure but will not take accountability nor will these comp…"
- `ytc_UgyAjUrLQ…`: "for future robot, i did not involve in any of the test of beating of any kind, p…"
Comment
Once there are enough self driving cars, there should be a Ping system that will signal other self driving cars to move out of the way to avoid damage. Also, if there is a manual driving car there should be a warning sound that signals the humans that there is incoming danger they might have missed if the sound didn't go off in the first place by the car that is about to be impacted.
Source: youtube | AI Harm Incident | 2015-12-15T23:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | industry_self |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
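The table above is just a rendering of one coded record. A minimal sketch of how such a record could be rendered into this two-column markdown table (the function name and the separate `coded_at` argument are illustrative assumptions, not the tool's actual API):

```python
def coding_table(rec: dict, coded_at: str) -> str:
    """Render one coding record as a two-column markdown table.

    `rec` holds the four coding dimensions; `coded_at` is the timestamp
    recorded when the coding was stored.
    """
    rows = [
        ("Responsibility", rec["responsibility"]),
        ("Reasoning", rec["reasoning"]),
        ("Policy", rec["policy"]),
        ("Emotion", rec["emotion"]),
        ("Coded at", coded_at),
    ]
    lines = ["| Dimension | Value |", "|---|---|"]
    lines += [f"| {k} | {v} |" for k, v in rows]
    return "\n".join(lines)


# Example: the record shown above.
rec = {
    "responsibility": "none",
    "reasoning": "unclear",
    "policy": "industry_self",
    "emotion": "approval",
}
print(coding_table(rec, "2026-04-27T06:24:59.937377"))
```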
Raw LLM Response
```json
[
{"id":"ytc_Ugh8rhSAIlTrjHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugj8lU9CWdFWf3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UghFWNaMvDiVGngCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgggdoqiWWgg1HgCoAEC","responsibility":"none","reasoning":"unclear","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UggYKBs14QZPoHgCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugi9xnzyNGEqq3gCoAEC","responsibility":"society","reasoning":"contractualist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugim4SKNBlRtfHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgjXjm0R3slUzXgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UghxbQR1FcrERngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgipNWetGSuz7ngCoAEC","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"}
]
```
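The raw response is a JSON array of per-comment codings, one object per comment ID. A minimal sketch of a lookup-by-ID parser with basic validation, assuming the dimension values seen in this dump are the full codebook (the real codebook may define more categories):

```python
import json

# Allowed values per dimension, inferred only from the values visible in
# this dump (an assumption, not the tool's definitive codebook).
ALLOWED = {
    "responsibility": {"none", "developer", "ai_itself", "society"},
    "reasoning": {"unclear", "consequentialist", "deontological", "contractualist"},
    "policy": {"none", "industry_self", "regulate"},
    "emotion": {"indifference", "approval", "fear", "outrage"},
}


def parse_raw_response(raw: str) -> dict:
    """Parse a raw LLM batch response and index the codings by comment ID.

    Raises ValueError if a record is missing a dimension or uses a value
    outside the allowed set, so malformed model output fails loudly
    instead of silently entering the dataset.
    """
    coded = {}
    for rec in json.loads(raw):
        cid = rec["id"]
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{cid}: bad value for {dim}: {rec.get(dim)!r}")
        coded[cid] = {dim: rec[dim] for dim in ALLOWED}
    return coded


# Example lookup, using one record from the response above.
raw = ('[{"id":"ytc_UgggdoqiWWgg1HgCoAEC","responsibility":"none",'
       '"reasoning":"unclear","policy":"industry_self","emotion":"approval"}]')
coded = parse_raw_response(raw)
print(coded["ytc_UgggdoqiWWgg1HgCoAEC"]["policy"])  # industry_self
```

Validating against a closed value set is what makes "inspect the exact model output" useful here: any coding the model invents outside the scheme surfaces as an error tied to a specific comment ID.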