Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment

> I'm a developer. Computers will think in a logical stand point and say this is found. However the programmer of that software will probably get it to analyse other things before the kills such as face, area and other things before it makes the kill to male sure its the target. But however because of terrorist groups getting them it would way to dangerous to have which is why I'm against them. However they could have a self destruct button on them that explodes automatically if something happens. But I'm just against AI in general besides soft AI or AI that doesn't dictate someone's life

Source: youtube · Posted: 2015-07-30T03:5… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | ban |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[{"id":"ytc_Uggfmjc4dajsu3gCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgjLaVRywu1KoXgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugghx3Nm4RuttHgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_Uggk4mR-nlY243gCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_UggYbY1ME8kwQ3gCoAEC","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgjQhookpLNxr3gCoAEC","responsibility":"none","reasoning":"unclear","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgjtLGr3PIz9P3gCoAEC","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UggE3oe1ExSrOngCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugjiws5jvbtj-3gCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgiaClDbKuhuMHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"}]
```
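The raw response is a JSON array with one coding record per comment, keyed by comment ID. A minimal sketch of how such a response could be parsed and indexed for per-comment lookup (the field names `id`, `responsibility`, `reasoning`, `policy`, and `emotion` match the response above; the two records shown here are copied from it, and the lookup code itself is an illustrative assumption, not the page's actual implementation):

```python
import json

# Raw model output: a JSON array of coding records, one per comment.
# Two records copied from the response shown above.
raw_response = '''[
  {"id": "ytc_Uggfmjc4dajsu3gCoAEC", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_Ugjiws5jvbtj-3gCoAEC", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"}
]'''

records = json.loads(raw_response)

# Index by comment ID so any coded comment can be looked up directly.
by_id = {rec["id"]: rec for rec in records}

rec = by_id["ytc_Ugjiws5jvbtj-3gCoAEC"]
print(rec["policy"], rec["emotion"])  # ban fear
```

Indexing by `id` is what makes the "inspect the exact model output for any coded comment" lookup a constant-time dictionary access rather than a scan over the array.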