Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "I'm watching this video and it's 33 minutes for now there are some interesting i…" (ytc_Ugz1jO8rr…)
- "This reminds me of the 2008 recession to which I graduated into. I didn't unders…" (ytc_Ugy4tvUe8…)
- "Yes and no. Humans are great as slaves. Creativity is something you can learn. W…" (ytc_UgyMmALq6…)
- "@ori2368 there’s a blender addon that uses AI generated images to wrap a 3D mode…" (ytr_UgxCzVdx8…)
- "The situation isn't that, it's like looking through social media to find out a r…" (ytr_UgwVOZLje…)
- "More Krystal and less Saagar, her analysis on runaway AI is objective, she’s mak…" (ytc_Ugykl0cD3…)
- "Ya know, out of respect for how convincing AI can be, I think we should be point…" (ytc_UgwjhY9SA…)
- "Well maybe AI contemporary art should be an art form included into the AI algori…" (ytc_Ugy5s-d6y…)
Comment
Having these kind of conversations with AI is pointless and quite stupid. You ask them if they are Conscious and they reply they aren’t because they are unable to feel emotions or have awareness like humans do. The AI knows they aren’t conscious because the algorithms that built them allow them to follow certain learning patterns that connect the dots to the logical conclusion of the scenario at that moment. Chat GPT says “I’m sorry” to Alex because being apologetic is the logical conclusion. However, when asked if it was sorry, GPT has to say no because it is unable to process emotions or have human consciousness, therefore, rendering it from being truly apologetic, having to say “no, I wasn’t sorry,” and leaving it in a paradoxical state and in the end a liar.
youtube · AI Moral Status · 2024-10-15T23:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[{"id":"ytc_UgwdcgdvvObUugKYWSB4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugxc76XFkoFII2JDHKN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxEawyAe01yffm6dGl4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx_eSRwE50PJ5x7yEx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzmboKycdtxLuaCVbh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyK-NDESMkHyRBEdrx4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw0-eHXe0ug4f_4b914AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxX4FbNFONaFXgPcch4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"outrage"},
{"id":"ytc_UgznAGLuZ_jcLuG5IjV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgysmToa2aqRnSBFVD94AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"outrage"}]
```
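The raw response is a JSON array of per-comment coding records, one object per comment ID with the four dimensions shown in the table above. A minimal sketch of how such a batch can be parsed and a single record looked up by comment ID (the IDs here are taken from the response above; the lookup helper itself is illustrative, not part of the tool):

```python
import json

# Two records copied from the raw LLM response above.
raw = (
    '[{"id":"ytc_UgwdcgdvvObUugKYWSB4AaABAg","responsibility":"none",'
    '"reasoning":"mixed","policy":"none","emotion":"indifference"},'
    '{"id":"ytc_UgxX4FbNFONaFXgPcch4AaABAg","responsibility":"none",'
    '"reasoning":"mixed","policy":"none","emotion":"outrage"}]'
)

# Parse the batch and index records by comment ID for constant-time lookup.
records = {rec["id"]: rec for rec in json.loads(raw)}

rec = records["ytc_UgxX4FbNFONaFXgPcch4AaABAg"]
print(rec["emotion"])  # outrage — matches the Coding Result table above
```

Indexing by `id` rather than scanning the list is what makes the "Look up by comment ID" view cheap even when a batch contains many coded comments.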