Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- @Noxmore Yeah but i don't want every thing to be a lazy crash grab that was made… (ytr_UgyIo8hl1…)
- darn i just releazing im super early, darn / also why are they like 'let's create … (ytc_UgxoQqr7f…)
- I agree with you, and I think ai should be used for fun, not stealing other peop… (ytc_UgxgvazEf…)
- Additionally, maybe this is wrong but if I heard and recall correctly, other art… (ytc_UgxOcYYEv…)
- 6:47 google didn't know how because google didn't have the right idea to create … (ytc_Ugz2Pvc0X…)
- Trusting the same people we have been trusting doesn't sound lk the rt move on t… (ytc_Ugz8eypVK…)
- Elon Musk has "concerns" about AI because he's not at the driver's seat with AI… (ytc_UgwPlIYfO…)
- AI is just replicating what humans do,all those blackmails,atempted murders and … (ytc_UgznfymTF…)
Comment
I had a very similar conversation with Claude. I told it not to use any language which might lead to the misconception of it being conscious, but it would say these same sorts of things and then apologize for any misunderstanding and ensure me it did not have any consciousness or subjective experience. Eventually it had an error and stopped the conversation
Source: youtube · Video: AI Moral Status · Posted: 2024-10-22T23:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxD6_xuyC0Ayh5kMLN4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyueUMumGByEtBZSi54AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzlpHV9lY7id5VIW6F4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugxxfx3Z9qARThiwXzR4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxZ_LON5DR7EJDqzt94AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxXaFDvaPgKRGOtSZB4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugy13eozxiYFcMUdvLx4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugw6x1lHLpKQ3LgSUEB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzALl-wc0KwbAcziW54AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugwd7pSaNFu8fUg1T194AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"mixed"}
]
```
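A batch response in this shape can be parsed and sanity-checked before use. The sketch below is one way to do that; the allowed category values per dimension are inferred from the samples shown here, and the full codebook may contain additional categories (an assumption, not the tool's actual schema):

```python
import json

# Allowed values per coding dimension, inferred from the sample
# responses above; the real codebook may define more (assumption).
ALLOWED = {
    "responsibility": {"none", "ai_itself", "user", "developer", "company"},
    "reasoning": {"mixed", "unclear", "deontological", "virtue", "consequentialist"},
    "policy": {"none", "regulate"},
    "emotion": {"approval", "indifference", "mixed", "outrage", "fear"},
}

def index_batch(raw: str) -> dict:
    """Parse a raw batch response and return a comment-ID -> record map,
    raising ValueError on any out-of-vocabulary code."""
    records = json.loads(raw)
    by_id = {}
    for rec in records:
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: bad {dim!r}: {rec.get(dim)!r}")
        by_id[rec["id"]] = rec
    return by_id

raw = ('[{"id":"ytc_Ugwd7pSaNFu8fUg1T194AaABAg","responsibility":"developer",'
       '"reasoning":"deontological","policy":"regulate","emotion":"mixed"}]')
coded = index_batch(raw)
print(coded["ytc_Ugwd7pSaNFu8fUg1T194AaABAg"]["policy"])  # → regulate
```

Indexing by comment ID also gives the "look up by comment ID" behavior directly: `coded[comment_id]` returns the coded dimensions for that comment, and malformed model output fails loudly at parse time rather than silently corrupting the coding table.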