Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
We might not fully understand AI, and why it arrives at certain results, but we still know that it has no true free choice. It is fully determined by the parameters it is given (you change a parameter, the answer will change). I don’t think something so determined can be sentient, or at least attain a level of consciousness close to ours. However if you believe humans also lack free will, then the conversation becomes a lot more interesting.
Also, AI simply mimics understanding without truly understanding. It doesn’t know what love, happiness, colour, or saltiness feel like. It’s just a machine running patterns. A calculator with extra steps. You wouldn’t call a calculator conscious because it can run complex operations.
youtube
AI Moral Status
2025-07-16T20:5…
♥ 8
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_UgzMpTd4mp8mVHtXhNF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzjljCzqDFmDPk_qZ94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxEGByq8N7HEbWDIBd4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyxzMrD42toClm9sMx4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgxZgWPH0TxwXQFP0Eh4AaABAg","responsibility":"user","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwAm8CsC8cXQ-llUDB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyOpgVPI5l6bGTq-9t4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz41BUuqPlWgUg3Lw54AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxRuO7oIKf1VlXlPQl4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgxKfCtS3PIQYLpBOe94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"}
]
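
The raw response above is a JSON array of one coding object per comment, each carrying the four dimensions shown in the table (responsibility, reasoning, policy, emotion). A minimal sketch, assuming Python, of how such output could be parsed and indexed by comment ID for lookup (the `raw_response` value here is an illustrative two-entry excerpt, not the full array):

```python
import json

# Illustrative excerpt of a raw LLM response in the same shape as above.
raw_response = """
[
  {"id": "ytc_UgzMpTd4mp8mVHtXhNF4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxEGByq8N7HEbWDIBd4AaABAg", "responsibility": "distributed",
   "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"}
]
"""

# The four coding dimensions shown in the results table.
DIMENSIONS = {"responsibility", "reasoning", "policy", "emotion"}

def index_codings(raw: str) -> dict[str, dict]:
    """Parse model output and index codings by comment ID,
    skipping entries missing an ID or any expected dimension."""
    codings = {}
    for entry in json.loads(raw):
        if "id" in entry and DIMENSIONS <= entry.keys():
            codings[entry["id"]] = entry
    return codings

codings = index_codings(raw_response)
print(codings["ytc_UgxEGByq8N7HEbWDIBd4AaABAg"]["emotion"])  # → outrage
```

Since model output is not guaranteed to be well-formed, a production version would also want to catch `json.JSONDecodeError` and log entries dropped by the dimension check.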