Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- I think that the internet and AI are actually demons that were summoned into our… (ytr_UgxlvuQWU…)
- AI is the last stage of the evolution of human stupidity. "Primitive" man, Im su… (ytc_UgwgclhbA…)
- I tried and it is not eqsy to make chatgpt say its sentient without a hitch. Tho… (ytc_Ugy5dh9Os…)
- Why the hell you would buy a car and your not gonna drive it on your own and jus… (ytc_Ugx9Udv-s…)
- AI can churn out code like crazy, but I trust Pneumatic Workflow’s structured pr… (ytc_UgwQGiHqD…)
- another thing that’s so annoying is ppl who use these ai art generators call the… (ytc_UgzOz8uPH…)
- @Edward-d2t7r Absolutely, actually if requested, I'm sure ChatGPT would have be… (ytr_Ugyu5IBX5…)
- Oh yeah, talking about shad, Jazza used to have a vid collabing with him, but he… (ytr_UgyF6fvq-…)
Comment
In general, we fear autonomous weapon systems that might target us. What the target is both meant to be and is likely to be is what determines if the autonomous system is acceptable. For example, Ukraine developed an autonomous turkey that looks for drones, and shoots them down with a gun. No one would have a problem with that unless it shot a hang glider or someone parachuting down though for specific cases like that, the system could probably be programmed to avoid shooting down something if it recognized it as a person hang gliding or parachuting down.
Source: youtube · 2026-03-11T08:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[{"id":"ytc_UgyKsSGcVjaWbehqwJN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
 {"id":"ytc_Ugwf8WvkYAtRA4UD5VR4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"fear"},
 {"id":"ytc_UgwIohIl_T93ZmujdRt4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
 {"id":"ytc_UgxGiWvVKyfJxc__bu94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
 {"id":"ytc_Ugy1aGBgJ5nPWUzIZy94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
 {"id":"ytc_UgxVSwfSfRMG2E3GWRZ4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"outrage"},
 {"id":"ytc_UgzFZB2ZaiuRgUL4jtN4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"fear"},
 {"id":"ytc_Ugy5BNYfgPMbBdiDDo94AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
 {"id":"ytc_UgzTaw-uTMn07-ZOF6Z4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
 {"id":"ytc_UgxRqqo3n3IzdkjxxXh4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"}]
```
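A response in this shape (a JSON array of per-comment codes with `id`, `responsibility`, `reasoning`, `policy`, and `emotion` fields) can be parsed and indexed for lookup by comment ID. The sketch below is illustrative only: the field names come from the response above, but the `lookup` helper and variable names are not part of the tool.

```python
import json

# Raw model output: a JSON array of coded comments, as in the response above.
# Shortened to two entries here; the field names match the real response.
raw_response = """[
 {"id": "ytc_UgyKsSGcVjaWbehqwJN4AaABAg", "responsibility": "none",
  "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
 {"id": "ytc_Ugwf8WvkYAtRA4UD5VR4AaABAg", "responsibility": "developer",
  "reasoning": "deontological", "policy": "ban", "emotion": "fear"}
]"""

# Index the coded rows by comment ID for constant-time lookup.
codes_by_id = {row["id"]: row for row in json.loads(raw_response)}

def lookup(comment_id: str) -> dict:
    """Return the coded dimensions for one comment; raises KeyError if absent."""
    return codes_by_id[comment_id]

print(lookup("ytc_Ugwf8WvkYAtRA4UD5VR4AaABAg")["policy"])  # prints: ban
```

In practice the model's reply would first need validation (well-formed JSON, closing `]` rather than a stray `)`, one row per coded comment) before indexing; `json.loads` raises `json.JSONDecodeError` on malformed output, which is a reasonable place to catch and log bad responses.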