Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID, or inspect one of the random samples below.

Random samples:

- "We appreciate your perspective on the AI's responses. If you're interested in mo…" (ytr_UgzSAqcNL…)
- "can we make an AI to replace the person whose decision it is to weaponize AI? be…" (ytc_UgwP-rYwj…)
- "Could AI end us as a specie, and if it does, there is a way to survive, or if i…" (ytc_UgySbmxpM…)
- "This content protecting stuff is total B.S as humans we learn from other humans …" (ytc_UgxeioAnN…)
- "Human art is just that, human, and AI just isn't comparable. I look at actual Gh…" (ytc_UgxYjPzXL…)
- "I hope its not too late to answer the 2019 question 🤣😂, the answer is B, robot w…" (ytc_UgyhVNqyR…)
- "Just because you can doesn't mean you should.Nobody asked for self driving cars.…" (ytr_UgyqlNv3W…)
- "I see personally see stealing ai generated “art” and redrawing them as an honora…" (ytc_UgzXYW61D…)
Comment
> no one wants f*cking self driving cars! stop wasting research money into IOT. pure sciences should be the more donated fields.
> people! if self driving cars become a reality then soon human's driving will be illegalised!!!!!
> you won't be able to drive!!!
> no one wants that!
> the adrenaline in your veins wants to drive a f*cking car!

Source: youtube · Event: AI Harm Incident · Posted: 2017-06-29T06:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | ban |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UggQprZepafZ1HgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UghRftAajpYgC3gCoAEC","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgiVNu11IS5PiHgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugis3gL-vgXrpHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugj3h2tFVAqPSngCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugj5A0pJm2zcoXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UggIGhHRenxDK3gCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_UgjS60trIUKAvHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UggCEcSJA552hHgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgitMxhB_OZFhXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
```
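A raw response like the one above should not be trusted blindly: the model can emit malformed JSON, drop the `id` field, or invent labels outside the codebook. Below is a minimal validation sketch. The `CODEBOOK` here is an assumption, reconstructed only from the values visible on this page; the real codebook may allow additional values.

```python
import json

# Hypothetical codebook, inferred from the values visible in this page's
# coding table and raw response. The real allowed-value sets may be larger.
CODEBOOK = {
    "responsibility": {"none", "distributed", "developer", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "ban", "regulate"},
    "emotion": {"outrage", "fear", "indifference", "approval"},
}

def validate_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response, keeping only well-formed codings.

    A row is kept iff it is a dict with an "id" and every dimension
    carries a value from the codebook.
    """
    rows = json.loads(raw)  # raises ValueError on malformed JSON
    valid = []
    for row in rows:
        if not isinstance(row, dict) or "id" not in row:
            continue
        if all(row.get(dim) in allowed for dim, allowed in CODEBOOK.items()):
            valid.append(row)
    return valid

raw = ('[{"id":"ytc_example","responsibility":"none",'
       '"reasoning":"consequentialist","policy":"ban","emotion":"outrage"},'
       '{"id":"ytc_bad","responsibility":"martians"}]')
print(len(validate_codings(raw)))  # 1 — the second row fails validation
```

Rows that fail validation could instead be queued for a retry prompt rather than silently dropped, depending on how strict the pipeline needs to be.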