Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
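The lookup-by-ID described above can be sketched in a few lines. This is a minimal sketch, assuming the raw model output is a JSON array of coded records like the one shown under "Raw LLM Response" below; the function name is hypothetical, not part of the tool.

```python
import json

# Hypothetical sketch: one real record reproduced from the raw response.
raw_response = '''[
 {"id": "ytc_Ugws2AYYzYu-k5UgKBh4AaABAg",
  "responsibility": "none", "reasoning": "deontological",
  "policy": "unclear", "emotion": "fear"}
]'''

def index_by_comment_id(raw: str) -> dict:
    """Parse the model output and key each coded record by its comment ID."""
    return {rec["id"]: rec for rec in json.loads(raw)}

coded = index_by_comment_id(raw_response)
print(coded["ytc_Ugws2AYYzYu-k5UgKBh4AaABAg"]["emotion"])  # fear
```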
Random samples — click to inspect

- "The Carl Marx of our time" ... lol ... just too fitting for an over-educated Ha… (ytc_UgzXU0Z--…)
- Why can't we build an independent AI whose job is to observe other AIs and make … (ytc_UgwtwMXHW…)
- I think LLMs will continue to have a place, but when used for things like orders… (ytc_UgzlTtlIJ…)
- Senator, I hope you see this comment. The more pressing Issue right now is Job O… (ytc_UgyM2fO2o…)
- My good people- do you want to get rid of AI once and for all? It’s gonna take u… (ytc_Ugzhhtqnq…)
- In the not-so-distant future, the world had become increasingly reliant on advan… (ytc_UgyHWz8MM…)
- @withadan2212 Some of you folks have a very, very futuristic idea of how this is… (ytr_UgzT0lNvu…)
- I always say please and thank you to AI because they put in so much work to help… (ytc_UgxYS-bNJ…)
Comment
Humans are biological computers because we think and feel and when young we are consrantly recieving information which can be considered as programming or data input.
Difference between a human and a computer being computers are unable to uave emotional responses, only preprogrammed ones.
People that think AI will hecome sentient are silly, doesnt mean it isnt dangerous and daadly in the wrong hands.
youtube · AI Moral Status · 2025-09-01T13:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | deontological |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[{"id":"ytc_UgzWzJjP737zbCD_gYh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgwxaQ_aoHdSB09uQZN4AaABAg","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxBj6sDWZTqKasxg1F4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugwq-gSKt4oB9Am5nd14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugws2AYYzYu-k5UgKBh4AaABAg","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugzh0d5jRFyBLlSUW294AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugwfi0y9nCJCf75bMgN4AaABAg","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwFQiFtvYno0j1Gk2d4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgzuG-wCRYI5GsmJblZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzqI36W9zK4Q6hl7fl4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"fear"}]
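A raw response like the one above can be sanity-checked before it is trusted, by flagging any code that falls outside the expected value sets. This is a minimal sketch: the allowed sets below are just the values observed in this page's response, not necessarily the project's full codebook, and the second record is a deliberately invalid hypothetical.

```python
import json

# Values observed in this response; the full codebook may define more.
ALLOWED = {
    "responsibility": {"none", "company", "developer", "ai_itself"},
    "reasoning": {"deontological", "consequentialist", "virtue", "unclear"},
    "policy": {"none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "indifference"},
}

def invalid_codes(records):
    """List (comment_id, dimension, value) for any value outside ALLOWED."""
    return [
        (rec.get("id"), dim, rec.get(dim))
        for rec in records
        for dim, allowed in ALLOWED.items()
        if rec.get(dim) not in allowed
    ]

# One real record from the response above, plus a hypothetical bad one.
batch = json.loads('''[
 {"id": "ytc_Ugws2AYYzYu-k5UgKBh4AaABAg", "responsibility": "none",
  "reasoning": "deontological", "policy": "unclear", "emotion": "fear"},
 {"id": "ytc_hypothetical", "responsibility": "none",
  "reasoning": "deontological", "policy": "unclear", "emotion": "joy"}
]''')

print(invalid_codes(batch))  # [('ytc_hypothetical', 'emotion', 'joy')]
```

Only the bad record surfaces, so a clean batch yields an empty list and can be loaded as-is.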