Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "ai always gave me a unsettling feeling whenever i looked at it, this is really a…" (ytc_Ugz_A1Ajj…)
- "Ai took over along time ago, we solely rely on it most of us would die without i…" (ytc_UgxA97lG-…)
- "Wow! My head is spinning. I spent a number of years as a controls engineer on bi…" (ytc_UgyUnfD-R…)
- "Teaching AI the concept of free will can lead to more nuanced decision-making an…" (ytr_UgycLA1rV…)
- "Hell he the crazy on I’m not bout to fight no damn robot 🤖 Gots to be smarter th…" (ytc_UgySCVcqr…)
- "AI slip into personhood as easily as humans into a hot tub. We (humans) should b…" (ytc_UgzT1bhXl…)
- https://preview.redd.it/z4kfevq0r5xg1.jpeg?width=1402&format=pjpg&auto=w… (rdc_oi14z6p)
- "Art is not just a drawing or a picture, it's the meaning, and the Fontaine (urin…" (ytc_Ugx1jFiDq…)
Comment
The only real benefit I can see from autonomous weapons is if humanity ever faced an external threat—like an alien invasion—and we were forced to unite as a species to defend the planet.
Outside of that scenario, the drone and robotic systems we're developing should be used for exploration, disaster response, and operating in dangerous environments—not for automating warfare.
Turning war into a competition over who builds the best robot just turns conflict into an exaggerated version of BattleBots—except this time human lives are on the line.
Source: youtube
Timestamp: 2026-03-10T21:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
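The coding result above is one record in the coding schema: each comment receives a value on four dimensions plus a comment ID. Below is a minimal sketch of that record as a Python dataclass. The allowed value sets are assumptions collected from the samples shown on this page, not the full codebook, so they may be incomplete.

```python
from dataclasses import dataclass

# Value sets observed in the samples on this page (assumed, possibly incomplete).
RESPONSIBILITY = {"government", "developer", "user", "ai_itself", "distributed", "none", "unclear"}
REASONING = {"consequentialist", "deontological", "mixed", "unclear"}
POLICY = {"regulate", "none"}
EMOTION = {"approval", "outrage", "indifference", "resignation", "mixed"}


@dataclass
class CodedComment:
    """One coded comment, as shown in the Coding Result table."""
    id: str
    responsibility: str
    reasoning: str
    policy: str
    emotion: str

    def validate(self) -> None:
        # Raise if any dimension falls outside the observed value sets.
        if self.responsibility not in RESPONSIBILITY:
            raise ValueError(f"unknown responsibility: {self.responsibility}")
        if self.reasoning not in REASONING:
            raise ValueError(f"unknown reasoning: {self.reasoning}")
        if self.policy not in POLICY:
            raise ValueError(f"unknown policy: {self.policy}")
        if self.emotion not in EMOTION:
            raise ValueError(f"unknown emotion: {self.emotion}")
```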
Raw LLM Response
[
{"id":"ytc_UgyXGyI80KEqZkRtjvl4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyUn52dduYNG9-TavZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwW3lzKWPa_SnR44_N4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgzE9Z23qQGwrL1y9s54AaABAg","responsibility":"user","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxmdbCKCYzl0EVGc4J4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_UgysoQvcvl06XLnUV5F4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugz9O49ATgCr8EQit314AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxRVYIIU-sMPdHX4th4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugyh1lwmnMfmsjwcdKR4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwyXGOecR1qWrHxOlB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
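The raw response is a JSON array of coded records, one per comment in the batch. Below is a minimal sketch of the "look up by comment ID" flow against such an array, assuming the batch is stored in a file named raw_llm_responses.json; that path and the helper names are hypothetical, not something this page specifies.

```python
import json


def load_batch(path: str) -> dict[str, dict]:
    """Index a batch of coded comments by their comment ID."""
    with open(path, encoding="utf-8") as f:
        records = json.load(f)  # the JSON array shown above
    return {record["id"]: record for record in records}


def lookup(batch: dict[str, dict], comment_id: str) -> dict | None:
    """Return the coded record for a comment ID, or None if it was not coded."""
    return batch.get(comment_id)


if __name__ == "__main__":
    batch = load_batch("raw_llm_responses.json")
    print(lookup(batch, "ytc_UgyXGyI80KEqZkRtjvl4AaABAg"))
```

If an ID is absent from the batch, the model dropped that comment from its response; returning None makes that case explicit rather than raising.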