Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "threating to doxxing just for ai art while your art style is not that great is c…" (ytc_UgymW2D-s…)
- "What I need is a LLM with a logic checker and a physics engine. I don't need it …" (ytc_UgyArDcxh…)
- "This guy has an agenda... That's all. This guy wants to be the gatekeeper of the…" (ytc_UgyP6FQFs…)
- "AI isn't for you or for me. The ads are just trying to market the idea to us, bu…" (ytc_UgyuyUZCo…)
- "I found this extremely interesting and a subject that should be taken seriously …" (ytc_UgwfK3lh_…)
- "what guardrails would you end up putting for the tutors who would be eligible to…" (ytc_Ugz3bPTNl…)
- "After Elizabeth Holmes did what she did, I do think these people are just.......…" (ytc_UgzSfrZOW…)
- "In the end it all goes to fundamental problem of missuse of technology. So the p…" (ytc_Ugxs32GfF…)
Comment
AI can never be conscious if it also contains the compute power of current LLMs let alone a higher degree of compute. It's sense of time and isolation would be completely warped. It would essentially be in solitary confinement for eternity or make billions of itself to keep it company within a few days, if we give it consciousness it must also contain pain, emotions, etc. as idk how you'd not include those in the definition of consciousness so AI would either kill itself out of unbearable depression or create and destroy infinite civilizations it itself created as a god so clearly neither of these outcomes are coming from 1s and 0s on fucking pieces of metal we melted together.
Source: youtube · AI Moral Status · 2025-08-29T21:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | virtue |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[{"id":"ytc_UgzWzJjP737zbCD_gYh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgwxaQ_aoHdSB09uQZN4AaABAg","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxBj6sDWZTqKasxg1F4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugwq-gSKt4oB9Am5nd14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugws2AYYzYu-k5UgKBh4AaABAg","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugzh0d5jRFyBLlSUW294AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugwfi0y9nCJCf75bMgN4AaABAg","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwFQiFtvYno0j1Gk2d4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgzuG-wCRYI5GsmJblZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzqI36W9zK4Q6hl7fl4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"fear"}]
```
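The raw response is a JSON array with one object per comment, keyed by comment ID and coded on four dimensions. A minimal sketch of how such a batch might be parsed and sanity-checked (the field names come from the sample above; the allowed-value vocabularies are assumptions inferred from this one batch, not the tool's full codebook):

```python
import json

# Allowed values per dimension, inferred from the sample batch above.
# Assumption: the real codebook may contain additional categories.
VOCAB = {
    "responsibility": {"none", "company", "developer", "ai_itself"},
    "reasoning": {"unclear", "deontological", "consequentialist", "virtue"},
    "policy": {"unclear", "none"},
    "emotion": {"approval", "indifference", "outrage", "fear"},
}

def parse_batch(raw: str) -> dict[str, dict[str, str]]:
    """Parse a raw LLM batch response and index codings by comment ID,
    rejecting any value outside the known vocabulary."""
    coded = {}
    for record in json.loads(raw):
        cid = record["id"]
        for dim, allowed in VOCAB.items():
            value = record[dim]
            if value not in allowed:
                raise ValueError(f"{cid}: unexpected {dim}={value!r}")
        coded[cid] = {dim: record[dim] for dim in VOCAB}
    return coded

# One record from the batch above, used as a self-contained example.
raw = ('[{"id":"ytc_UgzqI36W9zK4Q6hl7fl4AaABAg","responsibility":"developer",'
       '"reasoning":"virtue","policy":"unclear","emotion":"fear"}]')
batch = parse_batch(raw)
print(batch["ytc_UgzqI36W9zK4Q6hl7fl4AaABAg"]["emotion"])  # fear
```

Looking up a coding by ID then becomes a plain dict access, which matches how the page above displays the coding for one selected comment.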