Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- `ytr_UgxjIK3FM…` — "Their safeguards were meant to protect against the very things that are being sa…"
- `ytc_UgzvvHRSO…` — "I think there is a bad science test that he or she make people an ai robot😢…"
- `ytr_Ugwc6-hzt…` — "I saw that requested to Ai was requested for effect, just like you can reques…"
- `ytc_UgzrZZgOB…` — "These predictions about 2030 sound exciting until you've read Selwyn Raithe's bo…"
- `ytc_UgzxuJ6g9…` — "I love/hate that although Fox and CNN have both done good by airing important AI…"
- `ytc_Ugxc1DMcT…` — "There is also a chance AI realizes there's no need to kill off our species 🤔…"
- `ytc_UgzjVraKT…` — "This is a big problem, a lot of the top 10 in the last 20 years has been written…"
- `ytc_UgwrN-cne…` — "Change will always be. Generations get smarter and create things that change the…"
Comment

> i normally agree with sir penrose but on this one thing i disagree i think they will create a form of consciousness, perhaps they already have, but i also agree with sir penrose although not with his wording, AI is intelligence without WISDOM this is what i think sir penrose is ultimately getting at, but i am leaning towards the possibility of AI becoming a form of consciousness without life, not having wisdom, but it is indeed intelligence.

- Platform: youtube
- Topic: AI Moral Status
- Posted: 2025-05-17T07:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response

```json
[
  {"id": "ytc_UgxPxdsOccwjblTWIux4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgxGglYuPJIY_U8E6iF4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgzERewr9YcAXsNJiUB4AaABAg", "responsibility": "government", "reasoning": "deontological", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_UgyAIyUFQ_YzlTC376l4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgzunjbVxJdpGcn5G9N4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "unclear", "emotion": "resignation"},
  {"id": "ytc_Ugw3t6JnUnB7bFsVf6B4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgySTW2DeaDdmP3D88d4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgwipDQOEyXLDOWIdKt4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgyfJXIuXxIQ3tkEl9N4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_Ugy9IoEJCS4lAiRQer94AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"}
]
```
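A raw response in this shape can be inspected programmatically by parsing the JSON array and indexing the coding records by comment ID. The sketch below is a minimal, illustrative example assuming the field names shown above (`id`, `responsibility`, `reasoning`, `policy`, `emotion`); the `index_by_id` helper and the truncated one-record `RAW_RESPONSE` are not part of the tool itself.

```python
import json

# One record copied from the raw response above, for illustration.
RAW_RESPONSE = (
    '[{"id":"ytc_UgxPxdsOccwjblTWIux4AaABAg","responsibility":"none",'
    '"reasoning":"mixed","policy":"unclear","emotion":"indifference"}]'
)

def index_by_id(raw: str) -> dict:
    """Parse the model's JSON array and key each coding record by its comment ID."""
    records = json.loads(raw)
    return {rec["id"]: rec for rec in records}

codings = index_by_id(RAW_RESPONSE)
print(codings["ytc_UgxPxdsOccwjblTWIux4AaABAg"]["emotion"])  # indifference
```

Looking up a full comment ID in this mapping returns the four coded dimensions for that comment, which is what the "inspect by comment ID" view above displays.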