Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:

- "This is why IBM stated that precision regulation is needed. For example, regulat…" (ytr_UgxIa-6fP…)
- "seriously, imagine handing gordan rhamsey a microwaved hotpocket, claiming you m…" (ytc_UgzfvmDwS…)
- "I think all pro-AI people should watch the lego movie and see how they feel afte…" (ytc_UgxNRMpBq…)
- "as someone who was born in 94. i went from n64 and dial up internet to worrying …" (ytc_UgxToSmdU…)
- "everyone who says typing prompts into a generator to create stolen "art" is the …" (ytc_UgyxkccHd…)
- "@timogulas someone who owns a Tesla with full self driving supervised with super…" (ytr_Ugznp7yiK…)
- "If Ai was more intelligent than human intelligence, wouldn't it know that it sho…" (ytc_UghLJVnLM…)
- "He has gone senile . The top ai researchers who are smarter than this guy know h…" (ytr_UgxUCL6Hu…)
Comment
Consciousness can also be described as experiencing. An early simple organism, a billion years ago, had cells which could detect light and dark (proto-eyes). The system, in this case the organism, experienced this difference between light and dark and reacted to it with the possibilities its body allowed. Say, move to the left. This organism had a proto-consciousness, a feeling or a form of experiencing data and an ability to process it. So, having a system which can process data and act accordingly seems enough for it to experience something. A light switch does not have this ability; an organism does. So, isn't the internet, ChatGPT, or Tesla's self-driving car a complex system which collects, interprets, and processes data and acts accordingly? In my mind it is capable of being aware, of experiencing, even though it might be extremely rudimentary and different from our own experiences (I'm not saying this is the case. I'm saying this might be the case). Even a faint glimmer of self-awareness is still awareness. But this is not to say that it can understand itself or us. Just like the organism didn't understand the concept of light, itself, or why it moved to the left. It just was. I think we already have systems that just 'are'. ChatGPT might 'be' but does not understand itself, what you ask, or what language even is. It might just be a system that is vaguely aware of itself, just a glimmer, just a tiny bit of extra sense on top of its cold circuitry.
Source: youtube · Video: AI Moral Status · Posted: 2025-02-01T09:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | unclear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgzpGmlFAuWdTl0aYC54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzLLs5wtRVupSMbRd94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"unclear"},
{"id":"ytc_Ugz5tnJQ0bVGcVRmbMV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgwZfBi0RFg8kNDuN6h4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgzFGGPi4JpxTj3j40J4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxegWrdlivSZGTeIa54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"unclear"},
{"id":"ytc_UgyOBUHvoc3qhc4nB2J4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugw7HSeaQoWUtvDv33x4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwzhpClrH2O_TSKdSB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"},
{"id":"ytc_UgxndD5OHEVECmVXUOJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
```
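The raw response above is a JSON array of per-comment codings with four dimensions (responsibility, reasoning, policy, emotion). A minimal sketch of how such output could be parsed into lookup-ready rows is below; the field names come from the response itself, but the validation logic and the two-row sample input are illustrative assumptions, not the tool's actual pipeline.

```python
import json

# Two rows copied from the raw response above, as a stand-in for the
# full model output (an assumption for illustration).
RAW = '''[
{"id":"ytc_UgzpGmlFAuWdTl0aYC54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyOBUHvoc3qhc4nB2J4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]'''

# Dimension names taken from the coding-result table above.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def parse_codings(raw: str) -> dict:
    """Return {comment_id: {dimension: value}}, skipping malformed rows."""
    rows = {}
    for entry in json.loads(raw):
        # Drop any row missing the id or one of the four dimensions.
        if not all(key in entry for key in ("id", *DIMENSIONS)):
            continue
        rows[entry["id"]] = {dim: entry[dim] for dim in DIMENSIONS}
    return rows

codings = parse_codings(RAW)
print(len(codings))  # 2
```

Keying the result by comment ID matches the "look up by comment ID" workflow this page supports: fetching one comment's coded dimensions becomes a single dictionary access.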