Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
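The lookup described above can be sketched as a small helper, assuming the raw responses are stored as JSON arrays of records keyed by `id` (the `ytc_…` comment IDs, with field names matching the raw response shown at the bottom of this page); the payload here is a hypothetical two-record example, not real data:

```python
import json

def index_by_comment_id(raw_response: str) -> dict:
    """Parse a raw LLM coding response (a JSON array of records)
    and index the records by comment ID for O(1) lookup."""
    records = json.loads(raw_response)
    return {rec["id"]: rec for rec in records}

# Hypothetical payload in the same shape as the raw responses below.
raw = '''[
  {"id": "ytc_AAA", "responsibility": "distributed",
   "reasoning": "consequentialist", "policy": "unclear",
   "emotion": "indifference"},
  {"id": "ytc_BBB", "responsibility": "company",
   "reasoning": "deontological", "policy": "regulate",
   "emotion": "fear"}
]'''

by_id = index_by_comment_id(raw)
print(by_id["ytc_AAA"]["responsibility"])  # → distributed
```

Building the index once and looking up by ID avoids rescanning the full response for every inspected comment.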
Random samples — click to inspect:

- `ytc_Ugx-KQ7cT…` — "States that ban the use of AI will end up losing nothing. Even businesses will n…"
- `ytc_UgzyF35BY…` — "THE ROBOT THAT LOOKED AT THE CAMERA LOOKED LIKE SHE WAS GONNA KILL YOU OMG…"
- `ytc_UgxBHDENv…` — ""AI..." *Inhales* "CREATORS....... deserve protection, not punishment" *Inhales…"
- `ytc_Ugyk-8b5F…` — "I thought AI could replace human art. Then I smacked myself and proceeded to wat…"
- `ytc_UgiV2Fgtc…` — "Tay Ai, anyone? Yeah, I know it was weak Ai, but for weak Ai, it was pretty dan…"
- `ytc_UgyGRTsCG…` — "Hi. Try doing this - it was really creepy. Ask ChatGPT about the Indian so calle…"
- `ytc_Ugyg5lQ1M…` — "AI can never create anything new. It only works off of the information that's al…"
- `ytc_UgxcmWmTG…` — "If writing propaganda is the best he could come up with on what makes AI so dang…"
Comment (youtube, 2024-06-30T14:4…)

> arrows and bullets are fully autonomous, have always been
> an arrow will hit/kill its designated target, without asking for permission or further communication with a human controller
> the issue is never the "intent", deploying any weapon is *meant* to cause harm
> autonomy is a red herring. It's range, persistence and loitering that are the issues.
> Sea mines, landmines and punji sticks are the precedents. They lacked mobility, sensors, or range. Focus on these issues.
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response

```json
[
  {"id":"ytc_Ugw97rhpiyFgzSC01Yd4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgylDNEt8IOjfP6FNQd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugwplf0u7d9FaqIfmTt4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugw1IGB_IeNATwlKrjN4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugx2l55P_HMj5grG4uZ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgxbcjrGqLTfUaGToJx4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgzM0pFFzSTxCUhRpgB4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwyUMqXXNhMxV3Kpyl4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugzt1_Nkx_Hetq1rNrF4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgxKeTNzsdybIJ_wkTB4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"outrage"}
]
```
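Responses in this shape are easy to sanity-check before they enter the dataset. The sketch below validates one coded record; the allowed value sets are assumptions inferred from the values that actually appear on this page, not a documented codebook:

```python
# Allowed values per coding dimension. These sets are ASSUMPTIONS
# inferred from the values visible in the raw responses above.
ALLOWED = {
    "responsibility": {"ai_itself", "company", "government",
                       "developer", "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"ban", "regulate", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "mixed", "indifference"},
}

def validate_record(rec: dict) -> list[str]:
    """Return a list of problems with one coded record; empty means valid."""
    problems = []
    if not str(rec.get("id", "")).startswith("ytc_"):
        problems.append(f"bad or missing id: {rec.get('id')!r}")
    for dim, allowed in ALLOWED.items():
        value = rec.get(dim)
        if value not in allowed:
            problems.append(f"{dim}: unexpected value {value!r}")
    return problems

ok = validate_record({"id": "ytc_Ugzt1_Nkx", "responsibility": "distributed",
                      "reasoning": "consequentialist", "policy": "unclear",
                      "emotion": "indifference"})
bad = validate_record({"id": "x", "responsibility": "nobody"})
print(ok)   # → []
print(bad)  # id, responsibility, and the three missing dimensions are flagged
```

Rejecting or re-prompting on any non-empty problem list keeps off-codebook values (a common LLM failure mode) out of downstream tables like the coding result above.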