Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect:

- "That is the name of the new geologic era that we have been in since the dropping…" (`rdc_degksaq`)
- "Is society’s link between self worth and work (for a company) mere conditioning?…" (`ytc_UgxDNNX3V…`)
- "what if they are so aware of the stigma to AI they are just fucking with us 😂…" (`ytc_UgyUxzeQB…`)
- "Hi derek,I couldnt help but say that your physics vdos used to make my day earli…" (`ytc_UgwIm_dRZ…`)
- "Yea i watched it too and it was definitely absolutely disgusting how the investi…" (`ytr_Ugx1WpLju…`)
- "The news really spun this guy to be a lot more certain about the sentience of th…" (`ytc_UgyYPpD1b…`)
- "They found that 7% of ai try to keep them self on and we kill them i think those…" (`ytc_Ugx24DJb9…`)
- "If said self proclaimed artists cared about anything but the potential impact on…" (`ytr_UgzBat3OH…`)
Comment
> I’m no expert or any training how to make these things, but could the hallucination be required to be intelligent because the very idea of extrapolating ideas from something that wasn’t there before and putting everything together to create a positive answer couldn’t that be just a controlled hallucination to gather the idea, ideas and thoughts in its head so the hallucination is a requirement in a byproduct so sort of like how humans are specifically a species and other species like dogs and close to human intelligent animals all can suffer from certain aspects of mental health issues, hallucinations other things like that, but it doesn’t seem to propagate in more basic level animals that run on instinct alone so maybe as the AI models get smarter the hallucinations will get worse and you’ll have to make an AI type of SSRI where you have a extra piece of code bolted on the out externally going in interrupting it, causing it to be like a mental health person that think that could be something that makes sense but who am I just an asshole on the Internet lol to be honest the AI is fucking scary
youtube · AI Moral Status · 2025-12-28T14:4… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
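The "Coding Result" table above is a per-comment view of one record from the raw LLM response below. A minimal sketch of how such a record could be rendered as that markdown table (the field names come from the raw response; the helper name and fallback value are assumptions):

```python
# Render one coding record as the markdown "Coding Result" table shown above.
# `render_coding_table` is a hypothetical helper, not part of the tool itself.
def render_coding_table(record: dict) -> str:
    rows = ["| Dimension | Value |", "|---|---|"]
    for dim in ("responsibility", "reasoning", "policy", "emotion"):
        # Fall back to "unclear" for a missing dimension (an assumption).
        rows.append(f"| {dim.capitalize()} | {record.get(dim, 'unclear')} |")
    return "\n".join(rows)

example = {
    "id": "ytc_Ugx4V5NrdfCcwgNqUW14AaABAg",
    "responsibility": "unclear",
    "reasoning": "unclear",
    "policy": "unclear",
    "emotion": "indifference",
}
print(render_coding_table(example))
```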
Raw LLM Response
```json
[
{"id":"ytc_UgwpwBOWOmLgbsOayF54AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugxq8Y-4q3fLwoSV8fF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzMYb13vHTunykxvQ14AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"resignation"},
{"id":"ytc_Ugx4V5NrdfCcwgNqUW14AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzSQVfrVyfYRFT4MlF4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgxlZOmgi4pBmfZAuyl4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwN_bwUYA09BnuXkEF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgxW4hv-4Yow_dcfszV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxqSnSByKi6NgmK5gF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwggXUS569L5ywWE8t4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"fear"}
]
```