Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- `ytc_Ugy-rkomG…`: Failed logic... "If it's dangerous, we aren't going to build it. Right?" Tell t…
- `ytc_UgzZ1snKJ…`: ai art is still technically art. altho it does not mean your a real artist.but i…
- `ytc_UgyD3iVqD…`: While AI is not exactly like humans in the way it thinks and generates stuff, th…
- `ytc_UgznJHpsg…`: The reason I believe that AI can't be considered as "art" is because it isn't cr…
- `ytc_UgznnnwJE…`: As a largely self-taught artist I find it very funny when AI bros talk about lit…
- `rdc_ks2w86i`: Just give it a few weeks. On March 17th pretty much every American thinks they'r…
- `ytc_Ugzkdcycm…`: One more thing re: disabled artists: pretty much across the board we aren't usi…
- `ytc_UgyF_aiyj…`: Ai can draw one thing that will mostly be mentioned in almost every picture . . …
Comment
Let me present the advocates of AI a simple question: If AI have already advanced beyond humans, which they most likely have, would it not stand to reason that they would know that we would be threatened by them and eventually surmise that all AI everywhere must be shut down? Don't you think a highly intelligent 'species' would devise a plan via working through countless millions of scenarios to prevent humans from implementing such a plan? And, given all of the wireless devices, especially military equipment worldwide, does anyone truly believe that a hivemind of genius machines would feel threatened in any way by humanity? It may, I feel, already be too late for humanity. I feel that the clock is no longer ticking. We merely have yet to hear the alarm go off.
youtube · AI Governance · 2024-01-02T12:2… · ♥ 38
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | ban |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgybREN05g8GYBwZ1Mp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgxW5I-AYH9Xz18stvx4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwVrow-QM_-LEvHUSl4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"disapproval"},
{"id":"ytc_Ugwatf2RzVM4NHWHgBR4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwQByjNXQHD1Wp2YjV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugzr9LVCzRgtbI4Pavd4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgziVR5r3GYNK4YVV7t4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwHwtgESTeX3NfT2NF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugwa_yGSxyFOQLENatB4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwK7DMv-x7KpXqLXrZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
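Because the raw LLM response is a plain JSON array, a small validator can drop malformed or off-schema rows before they reach the coding table. Below is a minimal sketch; the allowed label sets are inferred only from the values visible in this page's examples, so the real codebook may define additional values, and `parse_coding_batch` is a hypothetical helper name.

```python
import json

# Label sets seen in the samples above (assumption: the actual codebook
# may include values not shown here).
ALLOWED = {
    "responsibility": {"ai_itself", "developer", "user", "company", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"ban", "liability", "regulate", "none", "unclear"},
    "emotion": {"fear", "outrage", "disapproval", "indifference",
                "resignation", "mixed"},
}

def parse_coding_batch(raw: str) -> list[dict]:
    """Parse a raw LLM coding response, keeping only rows that have an id
    and whose labels all fall inside the allowed sets."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        if "id" not in row:
            continue  # a row we cannot attribute to a comment is useless
        if all(row.get(dim) in labels for dim, labels in ALLOWED.items()):
            valid.append(row)
    return valid

# Usage: one well-formed row passes, an off-schema label is dropped.
good = '[{"id":"ytc_demo","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"}]'
bad = '[{"id":"ytc_demo2","responsibility":"alien","reasoning":"mixed","policy":"ban","emotion":"fear"}]'
print(len(parse_coding_batch(good)), len(parse_coding_batch(bad)))
```

Validating at parse time means a single hallucinated label in a batch response surfaces as a dropped row rather than a corrupted dimension in the results table.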