Raw LLM Responses
Inspect the exact model output for any coded comment, either by looking up its comment ID directly or by browsing the random samples below.
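The same lookup can be scripted. Below is a minimal sketch in Python, assuming the codings are stored in a SQLite table `coded_comments(comment_id, raw_response)`; the table name, columns, and database file are hypothetical, since this page does not document its storage backend:

```python
import sqlite3

def lookup_raw_response(db_path: str, comment_id: str) -> str | None:
    """Fetch the stored raw LLM response for one coded comment.

    Assumes a hypothetical coded_comments(comment_id, raw_response)
    table; the real schema behind this page is not documented.
    """
    con = sqlite3.connect(db_path)
    try:
        row = con.execute(
            "SELECT raw_response FROM coded_comments WHERE comment_id = ?",
            (comment_id,),
        ).fetchone()
        return row[0] if row else None
    finally:
        con.close()

# Truncated IDs from the sample list below will not match; pass the
# full ID, e.g. "ytc_UgxweMcUxdi2fzIHA3l4AaABAg".
```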
Random samples:

- "Cancel AT&T simple won’t be no need for AI if services are not needed! *Drops Mi…" (`ytc_UgyyHDq_z…`)
- "Jam Masterjam I would rather have a robot go to explore crazy stuff like Pluto t…" (`ytr_Uggdqnahc…`)
- "There are still ways to keep the focus of AI on making the world better for huma…" (`ytc_UgwKRl54x…`)
- "Fact of the matter, you can't even have something as simple as shift scheduling …" (`ytc_UgwLW50Ig…`)
- "@ilghizagreed also chatgpt is connected to the internet aswell so there’s always…" (`ytr_UgzmbNhbX…`)
- "Bro Will die in the next few days in a mysterious Way. The AI needed him away s…" (`ytc_UgyOgGPaG…`)
- "It will be good AI versus evil AI all over again. nothing changes in this world.…" (`ytc_UgyOsJFCN…`)
- "It's amazing how different the driving experience is when you start automating p…" (`ytc_Ugyt5mNcI…`)
Comment

> Embarrassing to watch. Every time the interviewer opened his mouth he showed he didn't know what Penrose was talking about. It is true that Penrose's views on consciousness are controversial and may be wrong, but the interviewer seemed totally flummoxed by the notion of uncomputability and by Godel's theorem. It doesn't matter how big they make their large language models, they will still be running on computers, and certain tasks are uncomputable (the clue is in the name), so computers can never do them, full stop. About that much, Penrose is definitely correct. What the implications are, if any, for consciousness is a more subtle matter, but such a shame that Penrose had to waste his time with this clown.

youtube · AI Moral Status · 2025-05-03T08:5… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | deontological |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
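The four dimensions take values from closed sets. A minimal sketch of that schema follows, with the value sets inferred only from the codings visible on this page; the actual codebook may allow options that happen not to appear in this sample:

```python
from dataclasses import dataclass

# Value sets observed on this page; the real codebook may be larger.
RESPONSIBILITY = {"user", "company", "ai_itself", "none"}
REASONING = {"deontological", "consequentialist", "virtue", "mixed"}
POLICY = {"none", "regulate"}
EMOTION = {"outrage", "resignation", "approval", "mixed"}

@dataclass
class Coding:
    """One comment's coding across the four dimensions."""
    id: str
    responsibility: str
    reasoning: str
    policy: str
    emotion: str

    def validate(self) -> "Coding":
        for value, allowed in [
            (self.responsibility, RESPONSIBILITY),
            (self.reasoning, REASONING),
            (self.policy, POLICY),
            (self.emotion, EMOTION),
        ]:
            if value not in allowed:
                raise ValueError(f"unexpected code: {value!r}")
        return self
```

Note that `Coded at` is not part of the model output: it does not appear in the raw JSON below, and is presumably stamped by the pipeline when the response is stored.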
Raw LLM Response
```json
[
  {"id":"ytc_Ugzxb-o7OZ28YoWQ2b54AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgwQjTIuc2Uo6_mduiZ4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxweMcUxdi2fzIHA3l4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxDqEj2icriRUdijd54AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgyuoPuSz5m_ppLFytt4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugwpn6xdce-CHvHAjf14AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxuhepcE7gTd4R1sOF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzHIF4VhKO1AZZAY2d4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxKFNK1RH0Kj9B0ERd4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxDL8hCuj1kU0TiaNV4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"resignation"}
]
```
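Each response is a bare JSON array with one object per comment in the batch, so recovering a single comment's coding is a parse plus an index. A minimal sketch; the two embedded records are copied verbatim from the batch above:

```python
import json

def parse_batch(raw: str) -> dict[str, dict]:
    """Index one raw batch response (a bare JSON array) by comment ID."""
    return {rec["id"]: rec for rec in json.loads(raw)}

# Two records copied verbatim from the batch shown above.
raw = '''[
  {"id":"ytc_UgxweMcUxdi2fzIHA3l4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzHIF4VhKO1AZZAY2d4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"}
]'''

codings = parse_batch(raw)
print(codings["ytc_UgxweMcUxdi2fzIHA3l4AaABAg"]["emotion"])  # -> outrage
```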