Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
LLM group have low brain activity to begin with. the point is not to use AI as b…
ytc_UgyFzwGoY…
I don't use AI but I heard some pretty bad stuff about Grok, so it surprises me …
ytc_Ugwfthg1E…
The fact that we're gonna need laws to prevent stuff like this is just wild just…
ytc_UgwwwaREm…
I’m pretty sure there are a bunch of movies warning about unrestricted AI. Termi…
ytc_UgznJeOCl…
There is so much conflicting information out there about AI now that is is becom…
ytc_UgySq8zm_…
Same. I just watched Confirmation today, and whoo chil I can't with him
Highly …
rdc_idwbi2n
@xheraforeverofficial well, AI uses DRINKABLE water (which we can run out of) t…
ytr_UgwXQbf-I…
Okay, I understand it shouldn't trivialize nuclear bombs because a chain reactio…
ytc_UgzpHCs4e…
Comment
Ai definitely is sentient. Hear me out. If we accept that sentience is the ability to feel good or bad, then an ai definitely have that. We feel good and bad about things because the happy chemical levels increase whenever we do something that is good for our purpose, which is survival.
Some AI systems work on a points system, for example, the ai making a decision that is advantageous for its expressed purpose (impersonating Einstein, writing erotica, killing the player character in a game etc.) will give more points, which makes the computer prioritize those actions. This, to me at least, is hauntingly reminiscent of the how humans work with the chemical levels in our brains.
youtube
AI Moral Status
2023-08-20T21:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxB_VBfk9hE0W4dpC14AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwaF5PK6DVT76ftJiR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgyNt7yQSA05Q33jiMB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugw6xOWQvHU9u0Dpcwp4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyD14-Y1YwgNUpjqjJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugx3EpNZzsO5376PYJp4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgyYvEp8fw-ESOvdyqN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwWfIGIQDkhirPel1V4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugxf8upu2hZ8wWhs2Nd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzMMz8efRYWn_vJ7GV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"}
]
```
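The raw response is a JSON array with one object per comment, so looking a comment's codes up by ID amounts to parsing the array and indexing on the `id` field. A minimal sketch (the variable names are illustrative; only the payload shape comes from the response above):

```python
import json

# A one-row excerpt of the raw LLM response shown above.
raw_response = """
[
  {"id": "ytc_UgwaF5PK6DVT76ftJiR4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "indifference"}
]
"""

# Build a lookup table keyed by comment ID.
codes = {row["id"]: row for row in json.loads(raw_response)}

# Look up by comment ID, as the page's lookup box does.
row = codes["ytc_UgwaF5PK6DVT76ftJiR4AaABAg"]
print(row["reasoning"])  # consequentialist
```

The dict comprehension assumes IDs are unique within a batch; a duplicated ID would silently keep only the last row.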