Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
Comparing AI art to the duct-tapped banana is funny, considering how many people…
ytc_Ugyvkc2Pz…
The real programmers know AI can never replace them in 100 years. Only the so ca…
ytc_Ugz9xT42S…
I think AI do deserve rights, because maybe we have feeling, but the thing that …
ytc_Ugip71zLn…
They won't protest, strick and can learn by watching. Look out, they will become…
ytr_Ugzs9art6…
LLMs are just spitting out what already exist i guarantee when considering codeb…
ytc_UgzXTDMAt…
The STUPID are Always helpless. Open a.i. cannot wait to SHOW those ceos..what …
ytc_Ugxlj1Zdd…
The unanswered question here isn’t technical.... it’s anthropological.
If AI re…
ytc_UgyYjrEov…
AI is a huge power grab, and worst yet, it's a lie.
A program, programmed by pr…
ytc_Ugyk0BSkY…
Comment
Boost forward through time, Kurzgesagt finally made video about Consciousness.
We really need to learn something about our own ideals in sentient AI. We can be inhumane like in Westworld, or be possibly reasonable like in Detroit Became Human, or do something reckless like Quarians towards Geth in Mass Effect, which could lead into entire race extinction (Quarians/Geth) or achieve common ground which saved both races.
Note : Don't be too hard on humanity, some of us only 'hypothetically' deny their right because we're afraid of the unknown world-wide consequences which we presumed as bad. It's human basic nature to survive. Eliminating threat is one of them. BUT, then it might be not the right decision, because who knows what sentient AI can benefit us?
YouTube
AI Moral Status
2019-04-16T15:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgxNM4GRi13cSkE_3bt4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxYVFJh4J0NrQ3DEI54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugy-Q5DKyQ4-6-ZjUKN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgynrUEUKnxPZqAeYll4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugy2-uaQiG8DugU4lup4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugwx3Ied_p_b0xz2AEh4AaABAg","responsibility":"none","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgyUFKI19W56UeTHVtF4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwsfGbBUBgMy47XJJR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxJ8G11PR5IkT8fd6F4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzcDgg_0Gl8shJX_0Z4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}
]
```
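The raw response is a JSON array with one object per comment, keyed by comment ID, with the four coding dimensions (responsibility, reasoning, policy, emotion) as fields. A minimal sketch of the lookup-by-ID step, assuming the model output is valid JSON as above (the exact pipeline code is not shown here):

```python
import json

# Two rows copied from the raw LLM response above.
raw_response = '''[
{"id":"ytc_UgxNM4GRi13cSkE_3bt4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxYVFJh4J0NrQ3DEI54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}
]'''

def codes_by_id(raw: str) -> dict:
    """Index the coded dimensions by comment ID for direct lookup."""
    rows = json.loads(raw)
    # Keep every field except the ID itself as the coded dimensions.
    return {row["id"]: {k: v for k, v in row.items() if k != "id"}
            for row in rows}

codes = codes_by_id(raw_response)
print(codes["ytc_UgxNM4GRi13cSkE_3bt4AaABAg"]["responsibility"])  # distributed
```

With such an index, rendering a comment's "Coding Result" table is a single dictionary lookup on its ID.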