Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
Its time to become self sufficient and eventually disown money and technology, w…
ytc_Ugx-5ND4D…
It's clear that cell phones and internet are detrimental for kids , young people…
ytc_UgxOrkuE1…
Don't be fooled this is happening at NZ events as well.
If artists don't spea…
ytc_UgzQDUypQ…
This is one reason that I will never ride in a Waymo. I don’t want to be a test …
ytc_UgydbOTXP…
Noname-mi1oo it might not be my problem but, to me, it feels soulless. Where's t…
ytr_Ugwfi8aRC…
I don’t believe these AI creators actually understand what they have created. H…
ytc_UgyzWi_e5…
I got...rejected i got rejected by the art community...i hate those Ai...people …
ytc_UgwkwCTxs…
Omg. Now we know why they were in a self driving car, because they’re too stupid…
ytc_Ugyz4PrjC…
Comment
I think they are more afraid of countries sending wave after wave of disposable soldiers with "kill everything that moves" type orders", and those robots continuing to follow those orders indefinitely even long after the people who gave those orders are dead. Kind of like that video game "Enslaved: Odyssey to the West." All the governments fell apart centuries ago, but the robots continue stalking the land killing and enslaving everyone in sight because it was the last orders they were given. And there is nobody left who knows the shutdown code.
Personally, though as a programmer I would create them anyway as I see that scenario as unlikely and I can think of a few ways to guarantee that wouldn't happen. (i.e. require the robots to receive a "continue" signal periodically or automatically shutdown)
youtube
2015-08-03T05:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UghtSkBgzSYBtHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgjViNNXfNfSJHgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UggmA0mXDPRJZHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgjCKuvjORfp8ngCoAEC","responsibility":"none","reasoning":"contractualist","policy":"liability","emotion":"indifference"},
  {"id":"ytc_UggRPYH0T4jMPHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UggAVsZqHgrQLHgCoAEC","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"fear"},
  {"id":"ytc_UggiVbomHzBmy3gCoAEC","responsibility":"government","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugh3w9U0giWCwngCoAEC","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgjfNG0lGF6WFXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugi1I3DCzAfkyHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"}
]
```
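The raw response is a JSON array of per-comment codes keyed by comment ID, so the "look up by comment ID" view can be reproduced by parsing the array and indexing it. The following is a minimal sketch of that lookup, assuming only the field names visible in the response above (the `index_by_id` helper is hypothetical, not part of the tool):

```python
import json

# Two rows in the shape of the raw LLM response shown above; the real
# pipeline's batching and prompt are not shown here, so this is purely
# an illustration of the lookup step.
RAW_RESPONSE = """
[
  {"id": "ytc_UggiVbomHzBmy3gCoAEC", "responsibility": "government",
   "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_Ugh3w9U0giWCwngCoAEC", "responsibility": "government",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]
"""

def index_by_id(raw: str) -> dict:
    """Parse a raw batch response and index the coded rows by comment ID."""
    rows = json.loads(raw)
    return {row["id"]: row for row in rows}

codes = index_by_id(RAW_RESPONSE)
row = codes["ytc_Ugh3w9U0giWCwngCoAEC"]
print(row["policy"], row["emotion"])  # regulate fear
```

Indexing once into a dict makes each subsequent ID lookup O(1), which matters when inspecting individual comments out of a large coded batch.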