Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- `ytc_Ugy0N8NGo…`: "Even before AI, most if not all of any sizeable corporation requires very inhuma…"
- `ytc_UgzzaYWHe…`: "Cool topic ! I can relate to, the thing is AI speed things up. I am running a sm…"
- `ytc_UgygGXp6i…`: "All art guys suck. When we engineers work hard day and night and still many of u…"
- `ytc_UgzyCFl1X…`: "Question is would AI let regulate itself now if we will regulate it. We can do l…"
- `ytc_Ugw0tqu-1…`: "Ironically, they still don't actually understand "anything". They are still jus…"
- `ytc_Ugxaq4SMx…`: "AI doesn't think, it processes. There is nothing behind the mask. This is a refl…"
- `ytc_UgzrFELr8…`: "Humans, on balance, don't keep their promises or live according to their profess…"
- `ytc_UgyA0dm3r…`: "There’s not enough compute power or energy to support what would be needed for A…"
Comment
Yup, both creepy and cool! I think you can't come out with technology, then make it disappear. This progression to drone warfare, quadcopters with payloads, and automated killing/defense machines is possible with today's tech. It must be developed and refined, and hopefully_ hopefully_ not turn on us and enslave us all. Check out Black Mirror "Machinehead" season 4. It's the best I've seen so far.
Source: youtube
Posted: 2018-04-03T20:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwM5aZIxWW4j5iDuXx4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugxl09f6L7Rj-RKTPZF4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgzieqUhMRMtOB8uQh14AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzlxnxQw99WAmfHehF4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgzZoRdIrjkSS-bAyZ54AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyqukX4nqlg8PxxK0F4AaABAg","responsibility":"distributed","reasoning":"unclear","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzjQP_1zTltW_9IU5l4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwH2zhdVVSb5TK2kxh4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgyHwfpnimaOHxKVZVF4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwpUlJEuK97Bnz92U54AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"}
]
```
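The raw response above is a JSON array with one object per coded comment, keyed by the four coding dimensions. A minimal sketch of how such a response could be parsed and indexed by comment ID for lookup — the `index_codes` helper and the shortened example payload are illustrative assumptions, not the tool's actual implementation; only the field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) come from the response shown:

```python
import json

# Illustrative payload: one entry copied from the raw response above.
raw_response = """
[
  {"id": "ytc_UgwH2zhdVVSb5TK2kxh4AaABAg", "responsibility": "ai_itself",
   "reasoning": "mixed", "policy": "unclear", "emotion": "mixed"}
]
"""

def index_codes(raw: str) -> dict:
    """Parse a raw LLM response and index the coded dimensions by comment ID.

    Hypothetical helper: assumes the response is a well-formed JSON array
    of objects that each carry an "id" field plus the coding dimensions.
    """
    rows = json.loads(raw)
    return {row["id"]: {k: v for k, v in row.items() if k != "id"} for row in rows}

codes = index_codes(raw_response)
print(codes["ytc_UgwH2zhdVVSb5TK2kxh4AaABAg"]["responsibility"])  # → ai_itself
```

A real pipeline would also need to handle malformed model output (e.g. wrap `json.loads` in a `try`/`except json.JSONDecodeError`), since LLMs do not always return valid JSON.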