Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- rdc_mtyjcgp: "I dont want AI friends, I just want friends. If those have to be AI, then so be …"
- ytc_UgxfwPjWS…: "The only way you solve the deep fake sex BS is too deep fake politicians in thes…"
- ytr_UgzGLcLfN…: "@TheUtubers it’s ok. I mean maybe I agree with the part where they shouldn’t get…"
- ytc_Ugx7-MKaM…: "Ai will simply be used by the powerful to subvert the weak. The greatest "issue"…"
- ytc_Ugz5grv0P…: "it would be reasonably simple to execute equations as they are found in text and…"
- rdc_oi25lks: "Spontaneous guardrail failure is the "Black Swan" of AI in 2026. It’s not about …"
- ytc_Ugzwdc09z…: "This is a truly profound and complex set of observations and questions. Thank yo…"
- ytc_Ugx5Ek1op…: "Nah, you're wrong, 'cause all the non-programmers are saying you're just coping …"
Comment

> I would agree that chat bots are currently just sophisticated Chinese rooms, but I wouldn’t say it’s impossible for AI to become conscious when we can’t even agree what that is or explain how it works. It’s not like we have a definitive consciousness detector and can say yes, present, or not. And Penrose is basing this on mathematics course from what 50, 60 years ago? No computing developments have been made since then?

youtube · AI Moral Status · 2025-05-25T10:3… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
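Each coded dimension in the table draws from a small category vocabulary. As a minimal sketch of how a coded row could be sanity-checked, assuming vocabularies consisting only of the values visible on this page (the real codebook may define more categories, and the `VOCAB` and `invalid_fields` names here are hypothetical):

```python
# Per-dimension vocabularies, inferred only from values observed on this
# page; the actual codebook may include additional categories.
VOCAB = {
    "responsibility": {"ai_itself", "developer", "company", "unclear"},
    "reasoning": {"deontological", "consequentialist", "mixed", "unclear"},
    "policy": {"unclear"},
    "emotion": {"outrage", "mixed", "resignation", "indifference", "approval"},
}

def invalid_fields(row: dict) -> list:
    """Return the dimensions whose coded value falls outside the vocabulary."""
    return [dim for dim, allowed in VOCAB.items()
            if row.get(dim) not in allowed]

# The coded row shown in the table above:
row = {"responsibility": "unclear", "reasoning": "mixed",
       "policy": "unclear", "emotion": "mixed"}
print(invalid_fields(row))  # [] — every dimension is in vocabulary
```

Collecting the offending field names, rather than returning a bare boolean, makes it easier to log exactly where an LLM response drifted from the schema.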
Raw LLM Response

```json
[
{"id":"ytc_UgxeJDgt2w46B-__j9p4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgynSb31DPL8DKjP4d14AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugxyk_e_B_-hr4HN70Z4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwfI1fIn9-hxbqFlXV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgwYkGnyGQuZr83FzrZ4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxuKoiLSEK8W6OOxF94AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzPP_CljijQIrnwK6J4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"approval"},
{"id":"ytc_Ugwlw7F_kPiDdybYycZ4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugyk2u8qOsp5Hg26Ath4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyroXGlKbAWwDokHWF4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"}
]
```
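The raw response is a JSON array of per-comment records, and the page's "look up by comment ID" view implies indexing that array by `id`. A minimal sketch of that lookup, assuming the record shape shown above (the `index_by_id` function name is hypothetical, and only two records are reproduced for brevity):

```python
import json

# Abbreviated copy of the raw LLM response shown above (two records only).
RAW_RESPONSE = """
[
 {"id":"ytc_UgxeJDgt2w46B-__j9p4AaABAg","responsibility":"ai_itself",
  "reasoning":"unclear","policy":"unclear","emotion":"outrage"},
 {"id":"ytc_UgwfI1fIn9-hxbqFlXV4AaABAg","responsibility":"company",
  "reasoning":"consequentialist","policy":"unclear","emotion":"resignation"}
]
"""

EXPECTED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def index_by_id(raw: str) -> dict:
    """Parse a raw batch response and index well-formed records by comment ID.

    Malformed records are skipped rather than failing the whole batch,
    since LLM output is not guaranteed to follow the schema.
    """
    index = {}
    for rec in json.loads(raw):
        if isinstance(rec, dict) and EXPECTED_KEYS <= rec.keys():
            index[rec["id"]] = rec
    return index

index = index_by_id(RAW_RESPONSE)
print(index["ytc_UgwfI1fIn9-hxbqFlXV4AaABAg"]["emotion"])  # resignation
```

Skipping malformed records (instead of raising) trades strictness for robustness; a production coder would likely also log the skipped records for re-coding.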