Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
AI is not going to do anything that will jeopardize the pharmaceutical industry,…
ytc_UgxAtKRGt…
There was a court case where a man rightfully fired an employee. The employee su…
ytc_UgzOgiMdE…
What this does tell me is we’re fucked with ai. Mow that Trump has approved Grok…
rdc_ngsp84q
Also, if you want to live, than don't step on the road with black clothing witho…
ytr_Ugxr6jGjT…
no piece of ai "art" is good it only looks like that because of the real artist…
ytr_Ugzoxuikd…
It is often impossible to teach the fools and the arrogant to learn how to accep…
ytc_Ugywmh-lu…
I'm much more interested in sabotaging AI's effectiveness by teaching it wrong a…
ytc_UgxE6E_W1…
This should be available for every child even if in public schools u got a 3 hr …
ytc_Ugx-Svf98…
Comment
The most fundamental flaw in the way we speak about AI, is that we personify a giant machine. It doesn’t “care”, “think”, “realize” . Those are features of awareness. AI just regurgitates info in different possible ways. It doesn’t think at all, words and information are just getting paired with other words and concepts and then sent back to us. We then assume that these words have an awareness behind them, as they have in the past. But AI is just taking the stuff we already have made for it, and automating it back to us. We gotta stop personifying these current machines. I think people start to assume they’re alive b/c the way we talk about them.
youtube
AI Moral Status
2025-11-11T20:1…
♥ 5
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | deontological |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgzghfQgc-kL-moNcSF4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugy65zJCPbZ3r2clBCp4AaABAg","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyxxGcaDkKJ6W2bZg54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxwsmEf26HmJNgj__B4AaABAg","responsibility":"government","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwRhZ1yni1izKOFeGN4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgzIoE_A82e5v51vC3p4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgykAr0gDS-f9obdXpJ4AaABAg","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugx9z5YYWhqcdlZEbo54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugx_SZxuksVYwuQQeV54AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyI6fQ_tiDF-EVazyB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"}
]
```
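The lookup-by-comment-ID flow can be sketched as follows. This is a minimal illustration, not the tool's actual implementation: the `lookup_coding` helper is hypothetical, and the only assumption is the response shape shown above (a JSON array of objects keyed by `id`, each carrying the four coding dimensions). The sample below reuses one real entry from the response.

```python
import json

# A raw LLM batch response in the format shown above: a JSON array of
# per-comment codings across the four dimensions.
raw_response = """
[
  {"id": "ytc_Ugy65zJCPbZ3r2clBCp4AaABAg",
   "responsibility": "none", "reasoning": "deontological",
   "policy": "unclear", "emotion": "indifference"}
]
"""

def lookup_coding(raw: str, comment_id: str):
    """Parse a raw batch response and return the coding for one comment ID,
    or None if that ID is not present in the batch."""
    codings = json.loads(raw)
    return next((c for c in codings if c["id"] == comment_id), None)

coding = lookup_coding(raw_response, "ytc_Ugy65zJCPbZ3r2clBCp4AaABAg")
print(coding["emotion"])  # indifference
```

Matching on the full comment ID (rather than the truncated prefixes shown in the sample list) avoids ambiguity when two IDs share a prefix.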