Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:

- ytc_UgwK23-D5…: "Unfortunately you can't sue them. It's not illegal to do what they did, and they…"
- ytc_UggB_SrR-…: "How fortunate any potential AI will be to have a clear purpose upon their creati…"
- ytc_Ugy6WUgDd…: "First of all AI artist isn't an actual artist cause they aren't actually drawing…"
- ytc_UgykUxnwb…: "Yes, this is REALLLLY happening, glad I don’t have Twitter fixed lol, thanks for…"
- ytc_Ugzxg-lcU…: "The interesting thing to me is that AI might advance to the point where BOTH wri…"
- ytc_UgxvIhflW…: "If they are doing this to children what about their military will the same thing…"
- ytc_UgzV1tqwO…: "If the purpose of AI was to degrade our creative output, worsen peoples lives an…"
- ytc_UgxftYbv6…: "AI doesn’t lie. Call it biased data all you want just do you can ignore realitt…"
Comment
Let's get it straight.
AI by itself, even if it concludes that humanity is the enemy, is not dangerous in any way whatsoever, so long as it is just a brain in a box: a laptop without a body, without any means to interact with the physical world.
But of course, as we all know, militaries around the world will always drool at the idea of autonomous weapon systems, so they WILL put said AI in a robotic body with guns.
That's where the danger comes in: from what the military wants to do with it.
Not from the AI itself.
Source: youtube · Posted: 2015-08-03T16:5… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | consequentialist |
| Policy | ban |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UghtSkBgzSYBtHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgjViNNXfNfSJHgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UggmA0mXDPRJZHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgjCKuvjORfp8ngCoAEC","responsibility":"none","reasoning":"contractualist","policy":"liability","emotion":"indifference"},
{"id":"ytc_UggRPYH0T4jMPHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UggAVsZqHgrQLHgCoAEC","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"fear"},
{"id":"ytc_UggiVbomHzBmy3gCoAEC","responsibility":"government","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugh3w9U0giWCwngCoAEC","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgjfNG0lGF6WFXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugi1I3DCzAfkyHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"}
]
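As a rough sketch of how a response like the one above can be consumed, the JSON array can be parsed and indexed by comment ID, so that any coded comment's dimensions (responsibility, reasoning, policy, emotion, matching the table above) can be looked up directly. The `index_codes` helper below is a hypothetical illustration, not part of the tool itself:

```python
import json

# Excerpt of the raw LLM response shown above: a JSON array of
# per-comment coding records.
raw_response = """
[
  {"id": "ytc_UggiVbomHzBmy3gCoAEC", "responsibility": "government",
   "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_UghtSkBgzSYBtHgCoAEC", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"}
]
"""

def index_codes(raw: str) -> dict:
    """Parse the model output and index each coding record by comment ID."""
    return {record["id"]: record for record in json.loads(raw)}

codes = index_codes(raw_response)
print(codes["ytc_UggiVbomHzBmy3gCoAEC"]["policy"])  # -> ban
```

A real pipeline would also want to validate that every ID in the response matches a comment that was actually sent to the model, and that each dimension value falls within the expected code set, before writing results to storage.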