Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a comment directly by its ID.
Random samples

- ytc_UgwwlJ-Ua…: "I think it can make sense. AI to me is alot like photography, the vast majority …"
- ytc_UgwRb4Fp_…: "Have anyone noticed that every time you ask ChatGPT about a topic discussed in t…"
- ytc_UgxDtA85I…: "AI: *Describes a form of lying to gain power* / Also AI when asked if the roboluti…"
- ytc_Ugw91ULql…: "Let them go gracefully. We'll be reborn in the AI. No body and no emotion.......…"
- ytc_UgzQ5j7Fp…: "This argument would work of AI didn't steal from real works of art made by other…"
- ytc_UgwAbpNm3…: "Tbf we have no way to prove if these "conversations" with Bing are true or just …"
- ytc_UgxKRzg_e…: "I know of four-way intersections where people have been hit and killed while fol…"
- ytc_UgwxC0RDf…: "I knew it!!!! I'm sending this to all my work friends that tease me about being …"
Comment
I think it is a higher chance that an early version of AI, (that isnt really true AI), gets in the hand of crazy religious fundamentalists or a crazy dictator that use the AI as a weapon against whoever they dont like and it start world war 3 with an early version of AI that is basically still a computer program and not real AI that can think for itself. What Im trying to say is that I think its far more likely the human race will wipe itself out before we manage to create real AI. Maybe we will use an early version of AI to do it, or nukes or a combination of both.
Source: youtube · Topic: AI Governance · Posted: 2025-07-03T21:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgwgaIXeQ9zpmnJTR2V4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzsVPtcQXkfioJ0fSB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugz7VsXvGEwsJL6BuJp4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxKqGo3Q5v9-eoIc9V4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugz080KjOalv8ThUUoh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzY5EfT48fg6vTzWOh4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyVv0vueSzBI2d2Aop4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugxib9CH84ZbPjwyO6x4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugwx8ZT_z6-Ewi_FVrt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxMaAxsOMdtNBVGRW54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"}
]
```
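The lookup-by-ID step described above can be sketched in a few lines: parse the raw response (a JSON array of per-comment codings, with the field names shown in the example) and index it by comment ID. The function name `index_codings` is illustrative, not part of the tool; the two sample rows are copied from the response above.

```python
import json

# Raw LLM response: a JSON array of per-comment codings.
# Field names match the example response above; these two rows are copied from it.
raw_response = '''
[
  {"id": "ytc_UgzY5EfT48fg6vTzWOh4AaABAg", "responsibility": "user",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgwgaIXeQ9zpmnJTR2V4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"}
]
'''

def index_codings(response_text):
    """Map comment ID -> coding dict, so per-comment lookups are O(1)."""
    return {row["id"]: row for row in json.loads(response_text)}

codings = index_codings(raw_response)
print(codings["ytc_UgzY5EfT48fg6vTzWOh4AaABAg"]["policy"])  # -> regulate
```

Indexing once up front is preferable to scanning the array per lookup when many comment IDs are inspected against the same response.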