## Raw LLM Responses

Inspect the exact model output for any coded comment. Look up a comment by its ID, or pick one of the random samples below.

### Random samples
- `ytc_Ugzs_U24i…`: "No, they invest in AI because the other guy might. That's the same reason our g…"
- `ytc_Ugy696j5Y…`: "Did ChatGPT write that stupid argument that criticizing the use of AI is discrim…"
- `ytc_UgwccEaKO…`: "hey im currently in my second year of uni and feeling alot of imposter syndrome …"
- `ytc_UgzCXiWl8…`: "People will really stretch to justify their own use of AI. I've seen people argu…"
- `ytr_Ugyd9st5X…`: "@chaosfortytwo Thats about as ethical as you can get with ai. I'll never agree w…"
- `ytc_Ugy0XrPpV…`: "As someone who has long held the position that AI will almost certainly destroy …"
- `ytc_UgweB5_gs…`: "I have a hard time believing ai could possibly get more evil & machiavellian tha…"
- `ytc_Ugzm8F22H…`: "AI designers are not like plane and car designers. These people are misanthropes…"
### Selected comment

> When he said that "As ab AI, I do not possess consciousness."
> So here as a programmed AI helper for humans, he will do what it makes for him to be able to atleast give a neutral or friendly touch whenever the human he helped felt like somethings wrong or bad. So to say "sorry" as an AI is only because they programmed to be so.

| Field | Value |
|---|---|
| Platform | youtube |
| Video | AI Moral Status |
| Posted | 2024-10-02T09:4… |
### Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
### Raw LLM Response

```json
[{"id":"ytc_Ugw1QKiJ8KQ5EicXbPB4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxZn7I-H_uLtPOXPeV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugxku-0QOsPBy9plma54AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyCbAA2u9Q1vxTxKbl4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugw3TbsTgaj8iX8tsw94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw6zGGZlkiROiubeIN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgylCEEnBmBSlBNN81R4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UgySDWo02Z2p4aVFDbl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzgQwGLthNLBoYOJnp4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwWt9MuVc17ylgqMhJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"}]
```
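The look-up-by-ID view could be backed by parsing a raw batch response like the one above into a per-comment index. The sketch below is a minimal, hypothetical illustration, not the tool's actual implementation: the field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) are taken from the JSON shown on this page, while the two inlined records, the `RAW_RESPONSE` string, and the `index_codes` helper are illustrative assumptions.

```python
import json

# Two records copied from the raw batch response above, used as sample input.
# In the real tool this string would be the model's full JSON output.
RAW_RESPONSE = """[
 {"id":"ytc_Ugw1QKiJ8KQ5EicXbPB4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
 {"id":"ytc_UgyCbAA2u9Q1vxTxKbl4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"}
]"""

# The four coding dimensions shown in the Coding Result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codes(raw: str) -> dict:
    """Parse a raw batch response into {comment_id: {dimension: value}}."""
    codes = {}
    for rec in json.loads(raw):
        # Reject records missing the ID or any coding dimension.
        if "id" not in rec or any(d not in rec for d in DIMENSIONS):
            raise ValueError(f"malformed record: {rec}")
        codes[rec["id"]] = {d: rec[d] for d in DIMENSIONS}
    return codes

codes = index_codes(RAW_RESPONSE)
print(codes["ytc_UgyCbAA2u9Q1vxTxKbl4AaABAg"]["emotion"])  # outrage
```

Indexing by ID also makes it easy to detect batches where the model dropped or duplicated a comment, by comparing `codes.keys()` against the IDs that were sent.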