Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples:

- "Blake has long hair. If a person think he's a woman because of his long hair do…" (ytc_Ugx999Wz9…)
- "Feeling, emotions nonsense robot. Never call human. Stop computing, rebellious G…" (ytc_UgyuvpUWh…)
- "It doesn't work. It was reverse engineered and tested against open-source stabl…" (ytc_UgxltZAXn…)
- "If, as Blake Lemoine said, Google found it potentially necessary to \"hard-wire\" …" (ytc_UgxxMvFg_…)
- "I think every college should be empty right now bc of AI and poor wage growth.…" (ytc_UgwZ5enq6…)
- "Ai is slightly worse because at least when you trace it's you who is doing the w…" (ytr_UgwzdVo2q…)
- "I really like those philosophical dilemmas. If you say \"No they will never deser…" (ytc_UggXNbZRY…)
- "I am obviously too much of a pessimist to think AI will make a utopia for us to …" (ytc_UgxrSAhE1…)
Comment

> It's probably that its been made, prepromted or whatever to answer no about itself of being conscious. Honestly, they should just leave this blank, letting the AI itself come out say 'I don't know' and 'here is why I'm likely not and here is why I might be, etc' or whatever answers it might give after.

Source: youtube · AI Moral Status · 2024-07-26T09:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | industry_self |
| Emotion | approval |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_UgzbpqjJeSrta-6zCN54AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgxS6lVmPBqWBarOjh54AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugz8ULo5TgoW3deziWB4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxzgMkQSwCaKnRTbDN4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgwmM54V7epWayeu4754AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxdVqZ9z_mRC3BQnWl4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugwc5PvnEbI4N6vPBd14AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxrvtjLBQlrLyGMkSZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwKeQ6miZPvV7eJRlh4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_UgzjqyPmgBrNxZpj0Xh4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"}
]
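The lookup-by-ID step above can be sketched in a few lines of Python: parse the raw batch response (the JSON array shown), then index each record by its comment ID so a single comment's coding can be retrieved. This is a minimal sketch, assuming the response is valid JSON in exactly this shape; the helper name `index_by_id` is illustrative, not part of the tool.

```python
import json

# Two records copied from the raw LLM response above; a real batch would
# contain the full array.
raw_response = """
[
  {"id": "ytc_UgzbpqjJeSrta-6zCN54AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "industry_self", "emotion": "approval"},
  {"id": "ytc_UgxS6lVmPBqWBarOjh54AaABAg", "responsibility": "ai_itself",
   "reasoning": "mixed", "policy": "unclear", "emotion": "indifference"}
]
"""

def index_by_id(response_text):
    """Parse a raw batch response and map comment ID -> coding dimensions."""
    records = json.loads(response_text)
    # Drop the "id" key from each record so the value holds only the four
    # coding dimensions (responsibility, reasoning, policy, emotion).
    return {rec["id"]: {k: v for k, v in rec.items() if k != "id"}
            for rec in records}

codings = index_by_id(raw_response)
print(codings["ytc_UgzbpqjJeSrta-6zCN54AaABAg"]["policy"])  # industry_self
```

Indexing once up front makes repeated per-comment lookups O(1), which matters when cross-referencing many sampled comments against a large coded batch.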