Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a comment directly by its ID.
Random samples

- "Omg what a they trying to off us 😂😂😂😂😂That robot knocked his ass out with one ☝️…" (ytc_Ugxn3cW4E…)
- "Add to that the low IQ students that this country churns out. Thanks to Ai apps …" (rdc_o478kfk)
- "The future you're describing? That's Star Trek. Complete automation of all manuf…" (ytc_UgwU4e4ZK…)
- "Someone sent me a clip of someone who looked EXACTLY like me. I just laughed it …" (ytc_UgzMfnBFN…)
- "@markcrawford5810 Good job, missing the point entirely. But have it your way a…" (ytr_Ugyh2ZDuB…)
- "I know the floor cleaning AI robot would have already taken over my local walmar…" (ytc_UgyF-hIDU…)
- "This AI is stuck in a roleplay scenario where it thinks the roleplay is a game.…" (ytc_UgwEF5jiZ…)
- "Metal pipe metal pipe metal pipe BONK BONK BONK BONK no more AI cause I smashed …" (ytc_UgxD24q93…)
Comment
I respect sir Roger Penrose a lot for his contributions to physics and for his courage of proposing new theories, even at an age when most other distinguished physicists just try to preserve their reputation and not take any risks.
However, I think he is wrong here.
First of all, we don't really know how to properly define consciousness yet.
Secondly, regardless of how we define it, if a biological brain can have consciousness, then why couldn't an artificial brain that mimics a biological one gain it too?
Thirdly, consciousness may just be an extremely complicated natural "algorithm", and nothing more. If that's the case, then eventually AI will reach what we now call "consciousness".
And considering how fast things are advancing in the AI field, sir Penrose may just live long enough to actually see it happening.
youtube · AI Moral Status · 2025-06-04T12:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id": "ytc_UgwQUHAUTrcVsTNT4IR4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgyjPQfQpZl5IHBnNEh4AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgzB-C4dcXjFcLTrT3Z4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "unclear", "emotion": "approval"},
  {"id": "ytc_UgwWa5wi47-aQmzrpbB4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_Ugxz4Z9EzZHWOn2XXZ54AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgxWdB8Yuod6JaJ0tjR4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_UgymKUxTQvRsRPBLKOx4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "unclear", "emotion": "approval"},
  {"id": "ytc_UgxvfNdWg1ig6zwm2cp4AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_UgxfSLaLgfWVfL516ix4AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_Ugzq72sJ-TSJQY6H9JF4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "unclear", "emotion": "outrage"}
]
```
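The raw response is a JSON array of per-comment code objects, which makes the by-ID lookup the page offers straightforward to implement. Below is a minimal sketch: `index_codes` and `RAW_RESPONSE` are illustrative names (not part of the actual tool), and only two of the ten records are reproduced for brevity.

```python
import json

# Two records copied from the raw LLM response above; a real lookup
# would load the full stored response for the batch.
RAW_RESPONSE = """[
  {"id": "ytc_UgwQUHAUTrcVsTNT4IR4AaABAg", "responsibility": "none",
   "reasoning": "mixed", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgxWdB8Yuod6JaJ0tjR4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "outrage"}
]"""

def index_codes(raw: str) -> dict:
    """Parse a raw coding response and map comment ID -> coded dimensions."""
    records = json.loads(raw)
    return {r["id"]: {k: v for k, v in r.items() if k != "id"} for r in records}

codes = index_codes(RAW_RESPONSE)
print(codes["ytc_UgxWdB8Yuod6JaJ0tjR4AaABAg"]["emotion"])  # outrage
```

Indexing by ID once, rather than scanning the array per lookup, keeps each "look up by comment ID" query O(1) even for large coding batches.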