Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "AI unaliving people will be considered an industrial accident, treated as a civ…" (ytc_Ugzg2G4J7…)
- "There is nothing creative about deepfake videos, it can be disastrous and must b…" (ytc_UgxXNYmK0…)
- "How does the world economy function if most people can't work because their jobs…" (ytc_UgzerGSBB…)
- "As an artist i 100% don't like the idea of AI in my work or in the talent indust…" (ytc_UgzOVnq4u…)
- "What makes me more upset is that they sell their AI generated art, if I wanted A…" (ytc_Ugw9icJhz…)
- "A vision of the future? >\[User\]: How do I learn to relax? >\[G…" (rdc_jhf7pm0)
- "A. AI with citizenship. 100%. Only human beings can get citizenship at per artic…" (ytc_UgxzOJaOl…)
- "Omg yea that's real smart getting into a ring with a MACHINE THAT FEELS NO PAIN!…" (ytc_Ugz314kBb…)
Comment
Hinton doesn't seem to believe in the soul. AI will not have a soul unless God gives it one. So my self awareness involves my faculties (my mind) interacting with my soul. My faculties are intellect, imagination, memory and free will. Will AI have free will, and if so, how will free will manifest itself? Well, if AI decides to take out humanity, that may indicate free will. But will that indicate that AI then has a soul? Ah, I think if AI decides to end us, it indicates the devil took over AI. So I don't think God will give AI a soul, but the devil may use AI as a tool to ruin humanity. And the devil does have free will.

youtube · AI Governance · 2025-06-21T14:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugz8bUIchmV7y-9ewIt4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgyEecRdEvwTLLN55994AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgwgiH1f4tmHBW83uKt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugw4KNxDTbBImtkU_u54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyGn3rz1HNSU8omN7d4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwuCeXWzDHuQ6UL-wN4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgxYrh8KgJlKbcUHKxh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgyClAg3id8ySighr6V4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgylYnR-bmD4NMmWrt14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"mixed"},
  {"id":"ytc_Ugzzp-eBA5TBamDe0J14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
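A raw response like the one above is a JSON array with one coding object per comment ID, so looking up the coding for a given comment reduces to parsing the array and indexing it by `id`. The following is a minimal sketch of that lookup, not the project's actual code; the function name `index_by_id` is hypothetical, and only two rows from the array above are reproduced for brevity.

```python
import json

# Two rows copied verbatim from the raw LLM response shown above.
raw = """[
  {"id":"ytc_Ugz8bUIchmV7y-9ewIt4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugzzp-eBA5TBamDe0J14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]"""

def index_by_id(raw_response: str) -> dict:
    """Parse a raw LLM response and index the coding rows by comment ID."""
    return {row["id"]: row for row in json.loads(raw_response)}

# Look up one comment's coded dimensions by its ID.
codings = index_by_id(raw)
print(codings["ytc_Ugzzp-eBA5TBamDe0J14AaABAg"]["policy"])  # -> regulate
```

In practice a real response may also need validation (unknown IDs, missing dimensions, or non-JSON output from the model) before indexing.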