Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up any comment by its ID, or inspect one of the random samples below.
- `ytc_Ugz4pGV2f…`: "Yes but the real world is not a set of standard questions. Although the AI didn'…"
- `ytc_Ugx-vJ1h7…`: "I already knew AI 🤖 will cost more harm than good because in a very short time h…"
- `ytc_UgwA6uLE5…`: "the whole "mask" concept could just as easily be applied to human perceptual awa…"
- `ytc_UgwwooOvx…`: "When I came back to China last year, this facial recognition thing really scared…"
- `ytc_UgyQc9l6y…`: "The only relevant category is whether AI art is interesting. There's lots of spa…"
- `ytr_Ugw_4BOXY…`: "That's because, just like Facebook, it's designed to maximize engagement. The AI…"
- `ytc_Ugx9KMqte…`: "Well... I sometimes show gratitude to openai chatgpt, but man, dont start me tal…"
- `ytc_UgxCZdRU_…`: "This "AI getting rid of people because it does not need them" is such a stupid t…"
Comment
They can’t but it kinda gets tricky if an autonomous drone actually makes a mistake and I.e. targets American ship or something like that
Now the Chinese couldn’t say „oopsie, coding error, sorry”, they would have to lie that this was a rogue pilot but that’s kinda tricky if pilot doesn’t exist and there’s no one to prosecute
So having or even testing these weapons would be unnecessary liability to the owners - those in power don’t want any stupid robot to create a major international incident by mistake so I think this agreement will actually achieve its goals
Keep in mind that world leaders are almost exclusively narcissistic control freaks (why else would you want to become a president?) so it kinda makes sense to not offload thinking to machines. If international incident is to happen they want to make sure it was because _they_ ordered it, not an accident
Source: reddit · Topic: AI Governance · Posted: 1699783757 (Unix time, 2023-11-12) · ♥ 23
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-25T08:33:43.502452 |
Raw LLM Response
```json
[
  {"id":"rdc_k8woe3m","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
  {"id":"rdc_k8wtmg7","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"rdc_k8y4f22","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
  {"id":"rdc_k8wopbc","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"rdc_k8wmgld","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
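The lookup-by-ID view above can be reproduced by parsing the raw JSON array and indexing it by comment ID. A minimal sketch in Python, assuming the response is valid JSON as shown; the `raw_response` string and `index_codings` helper are illustrative, not the tool's actual code:

```python
import json

# An abbreviated raw LLM response, shaped like the array shown above.
raw_response = """
[
  {"id": "rdc_k8wtmg7", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "rdc_k8wopbc", "responsibility": "none",
   "reasoning": "unclear", "policy": "none", "emotion": "indifference"}
]
"""

def index_codings(raw: str) -> dict:
    """Parse a raw LLM response and index each coding dict by its comment ID."""
    codings = json.loads(raw)
    return {item["id"]: item for item in codings}

by_id = index_codings(raw_response)
coding = by_id["rdc_k8wtmg7"]
# coding["policy"] is "liability", matching the Coding Result table above.
```

In practice the raw output would also need validation (e.g. checking that every ID in the batch appears exactly once) before the codings are stored.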