Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "I personally dont care make ai art traditional art i sont care all art to me…" (ytc_Ugytoscoi…)
- "This video was very succinct, good and things I agree with you, though I wish yo…" (ytc_Ugy2wFYgt…)
- "Take a hard look at your job and don’t lie to yourself, if you work 80% on a com…" (ytc_UgwC4GRp8…)
- "Here's why you shouldn't be scared about AI
  Case 1. They'll never become smarter…" (ytc_UgzPwHJCb…)
- "there won't be the need for business anymore because there will be nobody to buy…" (ytc_Ugyc0ZTu2…)
- "This man a perfect example of someone so obsessed with doing a thing they never …" (ytc_UgwR3VFGq…)
- "Never gonna happen because medicine has a huge human part to it that AI can neve…" (ytr_UgzKcbjhY…)
- "Andddd what about the existential risk? If only we head leaders that would take …" (ytc_UgzBejGo1…)
Comment
Well current aerial drones don't make any decisions, all they do is a bunch of calculations to make sure the missile goes where the human decided it should go.
A ground-drone, one for say entry into a building, would have to work the same way. Ultimately the decision to fire should rest with a human controller, and the drone should simply do stuff to make the bullets go where the human decided. That will not prevent innocent lives from being lost, but will be better than "OK drone you have permission to enter the building, it's all you from here, Wall-E".
If the second scenario ever happens then you've got to have a conversation about who's responsible for lives lost, similar to conversations many state governments are having about self-driving cars. Is it the AI programmer's neck on the line?
reddit · AI Governance · posted 1438005284.0 (Unix timestamp) · ♥ 95
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | deontological |
| Policy | liability |
| Emotion | approval |
| Coded at | 2026-04-25T08:33:43.502452 |
Raw LLM Response
```json
[
  {"id":"rdc_cthqlpf","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"approval"},
  {"id":"rdc_ctht6fb","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"},
  {"id":"rdc_cthrzq9","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"rdc_cthpo2j","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"rdc_cthubmz","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"}
]
```
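The raw LLM response is a JSON array of coded comments, one object per comment ID. A minimal sketch of how such a response could be parsed and indexed for the "look up by comment ID" feature is shown below; the function name `index_by_id` and the two-record sample payload are illustrative, not part of the actual tool, though the field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) match the response format above.

```python
import json

# Example payload in the same shape as the raw LLM response above
# (two records shown; real responses carry one object per coded comment).
raw_response = """
[
  {"id":"rdc_cthqlpf","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"approval"},
  {"id":"rdc_ctht6fb","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"}
]
"""

def index_by_id(response_text: str) -> dict:
    """Parse a raw LLM coding response (a JSON array of objects,
    each with an "id" field) into a mapping from comment ID to
    its coding dict, so a single comment can be looked up."""
    records = json.loads(response_text)
    return {record["id"]: record for record in records}

codings = index_by_id(raw_response)
print(codings["rdc_cthqlpf"]["policy"])   # liability
print(codings["rdc_ctht6fb"]["emotion"])  # indifference
```

Indexing by ID once, rather than scanning the array on every lookup, keeps each inspection a constant-time dictionary access.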