Raw LLM Responses
Inspect the exact model output for any coded comment; individual comments can be looked up directly by comment ID.
Random samples:

- "The narrative of this content is deeply engaging. A similar book I read opened u…" (ytc_Ugx-i_PuL…)
- "@Ros7698 Privileged in terms of cash no, no argument there. Privileged in the se…" (ytr_Ugy87CcSH…)
- "Can't you just output a confident score on each statement outputted from the LLM…" (ytc_Ugw2WnEQp…)
- "Is it really the AI's fault, or the person who made the AI ,and or the person wh…" (ytc_UgxIa4lFz…)
- "Removing AI copyright protections and making iron clad laws to clearly identify …" (ytc_UgwobigPm…)
- "Pal I'm not sure what world you live in where people are commodities, their jobs…" (ytr_UgydLqvxV…)
- "I find it interesting that Grok started out as not censored. And now over the la…" (ytc_UgzN7Rrwx…)
- "If an OC was made by AI (or by a slop-drone/AI bro), destroy the AI counterpart …" (ytc_UgyQ34MFH…)
Comment (youtube · AI Moral Status · 2026-01-25T22:2…)

> We shouldn't confuse the map with the territory. LLMs work great as tools, but they manipulate syntax without grasping semantics (qualia). They act as 'philosophical zombies.' Regarding consciousness, the microtubule theory presents a compelling case for a biological, quantum basis for thought—potentially acting as an emitter/receiver system. If consciousness is indeed non-computational as Penrose suggests, then no amount of code on a classic computer will ever spark real sentience. The investigation is far from over.
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | deontological |
| Policy | unclear |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id": "ytc_Ugzt4vzUe87c8kXaI3d4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_Ugw4-x4x-aIt1LBr_iZ4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgxIhrI0RAW9yAfPHLR4AaABAg", "responsibility": "developer", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgzE2Qe0nnlg6CQBPbp4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgzoIsxjnL_YtA9unxZ4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgxMOQFi5Y7FNLCY3bV4AaABAg", "responsibility": "unclear", "reasoning": "deontological", "policy": "unclear", "emotion": "approval"},
  {"id": "ytc_Ugw9XO4f26mnmR-9zVd4AaABAg", "responsibility": "unclear", "reasoning": "consequentialist", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_Ugwdp8M6nRP896XUgQV4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_UgzyWj4WYZqAx5whT4F4AaABAg", "responsibility": "ai_itself", "reasoning": "mixed", "policy": "unclear", "emotion": "approval"},
  {"id": "ytc_UgxfYDREnEpo63gI6IN4AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"}
]
```
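The lookup-by-ID view above can be reproduced from a raw response like this one: the model emits a JSON array of per-comment codes, which can be indexed by comment ID for direct retrieval. A minimal sketch, assuming the raw response is exactly the JSON-array format shown (the two IDs below are taken from the response above):

```python
import json

# Raw model output in the format shown above: a JSON array of
# per-comment coding rows, each keyed by a YouTube comment ID.
raw_response = """
[
  {"id": "ytc_Ugzt4vzUe87c8kXaI3d4AaABAg", "responsibility": "unclear",
   "reasoning": "unclear", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_Ugwdp8M6nRP896XUgQV4AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "unclear", "emotion": "outrage"}
]
"""

# Index the coded rows by comment ID for constant-time lookup.
codes_by_id = {row["id"]: row for row in json.loads(raw_response)}

# Look up one comment's coded dimensions by its ID.
code = codes_by_id["ytc_Ugwdp8M6nRP896XUgQV4AaABAg"]
print(code["responsibility"], code["emotion"])  # developer outrage
```

In practice a real response may be truncated or malformed, so wrapping `json.loads` in a try/except and flagging unparseable batches is a sensible addition.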