Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
I'd love to see a country or someone develop drones, land, air, water, etc. That are very cheap & basically just contain guiding equipment and a small explosive charge. They'll be easy to mass produce & could overwhelm defense systems. Using AI you could teach them to target very specific things like doorways, windows, gun barrels (one explodes a few cm from a cannon barrel on a ship for example & some shrapnel gets inside the barrel), etc. You could also set them to seek out living targets & just suicide bomb them. You could have a mix of larger and smaller drones that all work together to coordinate an attack.
Source: youtube · Posted: 2020-03-07T22:2… · ♥ 7
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_Ugx3UOaOAlbo8KikDnF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxzuUTUZin-4FbDxsZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwoUztZ1YjE2i8m9gh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugxu4HNEfCD7cvLcL8d4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzohbX1pe9AEtslGS54AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugz_pPw7FyzHTPyhekR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugx9Bt4Y570CJfxBdkB4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxyArdv_xyDytAotKR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgwMJ7asi3dK1u_VKs94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgxpZ_AeBAj4LnQEhdV4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"}
]
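A raw response like the one above is a JSON array of coded items, one per comment ID. A minimal sketch of how such a batch could be parsed and indexed for lookup (the ID and field values below are copied from the response above; the four field names assume the coding dimensions listed in the result table):

```python
import json

# Example batch response, abbreviated to one item from the output above.
raw_response = """
[
  {"id": "ytc_Ugz_pPw7FyzHTPyhekR4AaABAg",
   "responsibility": "none", "reasoning": "consequentialist",
   "policy": "none", "emotion": "approval"}
]
"""

# Index coded items by comment ID so a single comment's coding
# can be looked up directly.
coded = {item["id"]: item for item in json.loads(raw_response)}

entry = coded["ytc_Ugz_pPw7FyzHTPyhekR4AaABAg"]
print(entry["emotion"])  # → approval
```

In a real pipeline the parse step would also validate that each item carries all four dimensions before accepting the batch, since a malformed model response is the most common failure mode here.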