Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a response directly by comment ID.
Random samples — click to inspect

- "He said it implicitly, AI are conscious beings, another form of life. Kind of an…" (ytc_UgzfJ6vtS…)
- "Heavy conversation about AI and it is current and potential near future capabili…" (ytr_UgxytzZZS…)
- "I did ask it to give a Russell Grant-esque horoscope reading whilst acknowledgin…" (ytc_UgzzJio5z…)
- "Historically, the West has utilized new technologies for military or imperialist…" (ytc_UgxXSPi27…)
- "No, it didn't. Customer service is dealing with idiots that don't know what they…" (ytc_UgzhbQWN-…)
- "I have no issue with using AI as a tool that makes the creative process easier l…" (ytc_Ugx8EbdAi…)
- "Howdy! Disabled artist here! AI “art” sucks and I genuinely don’t think AI “arti…" (ytc_UgwXHkJRX…)
- "A delusional concept rendered by ai which has it's own delusions - good luck wit…" (ytr_UgyACCBlu…)
Comment
While entertaining, the vast majority of this presentation is pure fiction, the mere musings of a mortal. First, every stage is actually rule-based, not the simple rule-based of stage one, but nevertheless just more and more sophisticated rules. Second, AI can never truly have consciousness or become self-aware. Yes, it can simulate those and even deceive people, but it will never achieve actual consciousness. It can never ask "what if", it can never just wonder, it can never advance any values apart from strict utilitarian ones, it can never love unconditionally, it can never have intuition. It is just a tool that could be enormously useful, even cause our own destruction, all based on the rules and values of a human.
youtube · AI Governance · 2024-04-13T12:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_Ugxt_8HTpVINV-Pzpdp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy5LGatvNDO0y7P2Pp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugz0Eycb112-NxfVnVZ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
{"id":"ytc_Ugzrwd5I13OX9ykx2ZR4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugw2VxjUhKGUSZuhaW94AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugwz4gJBCMC735BNAOd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugz6e7xqPvtr3969DA94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyWwW2QDqB1Cmpjlnt4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_UgwBWomd7dzhsYugY1p4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugz6vBthBPLBzroIBBJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
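The lookup-by-comment-ID flow above can be sketched in Python, assuming the raw LLM response is a JSON array of objects shaped like the one shown. The snippet is a minimal illustration using a two-entry excerpt of the response; it is not the dashboard's actual implementation.

```python
import json

# Two rows excerpted verbatim from the raw response above,
# kept in the same shape as the full payload.
raw_response = '''
[
  {"id": "ytc_Ugzrwd5I13OX9ykx2ZR4AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_Ugz6vBthBPLBzroIBBJ4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]
'''

# Index the parsed rows by comment ID so each lookup is a dict access.
codes_by_id = {row["id"]: row for row in json.loads(raw_response)}

# Look up one comment's coding by its ID.
coding = codes_by_id["ytc_Ugzrwd5I13OX9ykx2ZR4AaABAg"]
print(coding["responsibility"], coding["emotion"])  # developer mixed
```

Indexing once and looking up by ID avoids rescanning the array for every inspected comment; a missing ID raises `KeyError`, which a real pipeline would want to catch and surface as "not yet coded".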