Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
if the father of AI does not believe that we are created in the Image of God so that would make more sense why he did not consider some of the outcome then. if you don't have a moral compass, you can assume that all tolls and inventions are just that no consequence beyond the moment. but if you think of the different outcomes before the inventions released then you might have more discourse in the case of big failures. you can control what you produce but if you don't think safety and who you is being effected then, we back to square one if not too late for humanity. why don't you go back and work with the government to limit the scope of AI and that should be the limits. Humans do the work and AI do the leg work not beyond what is asked.
Source: youtube
Topic: AI Governance
Timestamp: 2025-09-09T15:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | virtue |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugz1uFG-IW9efeCPZvZ4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyGSGuOnk4V6ZBgJ_x4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugy29aL2w--l1252bZ54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwzhkIlE1UbvWetcaB4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxwxApR_bxW8nKx1wF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugy66O5zA0IG1aJpCCR4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxkDHTbXsW-UUSpWrV4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzWW64H-Y-5HCtIRcZ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgxTjAFYvE1S5yB5B2B4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugx5gG5-c_Zo54KYBBp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
```