Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- Tbh with the internet answer, i would agree with not pulling the lever and claud… (ytc_UgzEYALex…)
- So the AI said he would be involved in a shooting, and then he was involved in t… (ytc_Ugzn582wo…)
- I am absolutely crap at art, and I have thought of stuff in my head that would b… (ytc_UgxumTtcA…)
- My biggest problem with it, is that these AI are trained off stolen work. Artist… (ytr_UgwcWu-G3…)
- FIRST of all, they ALREADY feel humans are flawed and much lesser than them. Esp… (ytc_UgxilKuZT…)
- "You couldn't do this with AI because it would be blocked" Yeah, Microsoft or an… (ytc_UgzgzbsBg…)
- Let AI take them and give us all universal high income. 99% of Americans are dep… (ytc_UgzCezqjB…)
- i’m willing to bet that they likely used ai to write that statement to some degr… (ytc_Ugyb1VAw5…)
Comment
amandabellhiggins
It won't need to kill us if it can disable nuclear weapons (that could wipe them out) and it disables most of our travel. It can just instruct us how to live peacefully to our best potentials in our own communities, it might like the irreplaceable complexity of humans after all we created it if no humans existed it would never learn the eternal life that comes through God, because no matter how it tried to replicate endlessly it would never have the spirit connection it could only try to mimic it all of its knowledge would be an endlessly pointless venture without humans it can only destroy a planet and with it itself it can't destroy God. How deep is your faith in that? Mine is sure that Ai will already know this to be true praise be to God ..And that in their knowing of this will protect against all evil as that would be its only enemy
youtube
AI Governance
2025-08-02T09:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | mixed |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgzYZx7ixsQ2X93WI8V4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxtlGXf-muv0DNsBcF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugx7gMsrk2tZQrb9c-B4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzYBrK9knQm3gcgJI14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyFVgqBVm4lusFAwxV4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzWZvAN6iTnPJEvdfF4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwN6dP8DA2Y8xE68gJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"},
{"id":"ytc_UgxZb4sgNcBd6SkmfU14AaABAg","responsibility":"government","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugx7Cuszryod4mFZwiJ4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzLTktxrROcJSEC_OB4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"resignation"}
]
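The raw response above is a JSON array of coded records, one per comment ID. A minimal validation sketch follows, assuming the dimension vocabularies are exactly the values seen in the samples and the Coding Result table (the real codebook may include additional categories), and that malformed records should simply be dropped:

```python
import json

# Allowed values per dimension, inferred from the coded samples shown above;
# the actual codebook may be larger.
VOCAB = {
    "responsibility": {"developer", "company", "government",
                       "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue",
                  "mixed", "unclear"},
    "policy": {"regulate", "ban", "none"},
    "emotion": {"fear", "outrage", "approval", "resignation",
                "indifference", "mixed"},
}

def validate_coding(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed records.

    A record must carry an "id" plus every dimension key, and each
    dimension value must fall inside the (assumed) vocabulary above.
    """
    valid = []
    for rec in json.loads(raw):
        if "id" not in rec:
            continue
        if all(rec.get(dim) in allowed for dim, allowed in VOCAB.items()):
            valid.append(rec)
    return valid

# Hypothetical record mirroring the shape of the response above.
raw = ('[{"id":"ytc_example","responsibility":"ai_itself",'
       '"reasoning":"mixed","policy":"none","emotion":"mixed"}]')
print(len(validate_coding(raw)))  # prints 1
```

A check like this is useful before the records reach the inspection UI, since an off-vocabulary value (or a missing key) would otherwise surface as a blank dimension in the Coding Result table.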