Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
this can play to our advantage... upon reacting to a situation the vehicle to make the first move emits a stress signal which gives it's surrounding vehicles corresponding instructions as per the signal transmitted... imagine if a collision were to happen... all anticipatory steps will be taken by all vehicles in a choreographed format (like a flash mob)
of course it is possible only if all cars are self driving ones
| Field | Value |
|---|---|
| Platform | youtube |
| Topic | AI Harm Incident |
| Posted | 2017-06-26T19:1… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UggQprZepafZ1HgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_UghRftAajpYgC3gCoAEC","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgiVNu11IS5PiHgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugis3gL-vgXrpHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugj3h2tFVAqPSngCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugj5A0pJm2zcoXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UggIGhHRenxDK3gCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgjS60trIUKAvHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UggCEcSJA552hHgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgitMxhB_OZFhXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
```
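The model returns a JSON array with one object per comment ID, coded along the four dimensions shown in the Coding Result table. A minimal sketch of how such a batch response might be parsed and validated — the allowed label sets below are inferred only from the values visible in this response, not from a full codebook:

```python
import json

# Label sets inferred from this sample response; the actual codebook may allow more.
ALLOWED = {
    "responsibility": {"none", "distributed", "developer", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "ban", "regulate"},
    "emotion": {"outrage", "fear", "indifference", "approval"},
}

def validate_batch(raw: str) -> dict:
    """Parse a raw LLM coding response and index the rows by comment ID."""
    coded = {}
    for row in json.loads(raw):
        cid = row["id"]
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{cid}: unexpected {dim} value {row.get(dim)!r}")
        coded[cid] = row
    return coded

# One row taken from the response above.
raw = ('[{"id":"ytc_Ugj3h2tFVAqPSngCoAEC","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"none","emotion":"approval"}]')
batch = validate_batch(raw)
print(batch["ytc_Ugj3h2tFVAqPSngCoAEC"]["emotion"])  # approval
```

Indexing by ID makes it easy to join the coded labels back to the stored comments, and the validation step catches malformed or off-schema model output before it enters the dataset.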