Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
What the fuck is this cancer. Conciseness(think i spelled it wrong) is a series of chemical reactions and electric signals that take place in the brain. We do know what it is. We know exactly what it is but we deny that fact because "religion" and "souls" and shit. Also no mater how you try, unless you somehow give robots nerves that are sensitive to the environment ai could not EVER feel pain. And if you did it wouldn't be AI anymore because it would require a brain. Please tell that SOMEBODY besides me thought of this. Also robots are programs meaning they could be copied. A robot it forever incapable of feeling pain for something like the loss of another robot because it's program could just be copied and put into another body. This is why putting man into a machine us impossible. Because "conciseness " is a mix of chemicals in the brain. It cannot be copied unless you made a robot to produce the same exact chemicals and that would need to be organic eg: not a computer and capable of actually dying. Somebody kill me I just watched the shiddyest video about ai on YouTube.
| Field | Value |
|---|---|
| Platform | youtube |
| Video | AI Moral Status |
| Date | 2018-10-17T05:2… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | deontological |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_Ugx2df_rpyC7-9zbkWZ4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxjLpuKrucXwLR_UXh4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxPiq_VfTE_dcaTjsN4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugzf73Srn0Gs-21yzgp4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxxhV09OVma3ss33cB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzsCaN9gu2o8mRX0ZR4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwaLBLZfSlIqXmC3LB4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxMG1YSOIiCRNYYBXN4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzSoECK7c4O7aX38Eh4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgzNqJLu-kc6KxSpfVN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"}
]
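A batch like the one above is easy to sanity-check before it enters the dataset. The sketch below parses one raw LLM response and rejects any row whose codes fall outside the categories visible in this page (`responsibility`, `reasoning`, `policy`, `emotion`); the value sets are inferred from the responses shown here, and the actual codebook may define additional categories.

```python
import json

# Allowed values per coding dimension, inferred from the responses shown
# on this page; the real codebook may include more categories (assumption).
SCHEMA = {
    "responsibility": {"none", "developer", "user"},
    "reasoning": {"deontological", "consequentialist", "unclear"},
    "policy": {"none", "regulate"},
    "emotion": {"indifference", "fear", "outrage", "approval", "mixed"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse one raw LLM response and reject rows with unknown codes."""
    rows = json.loads(raw)
    for row in rows:
        # Comment IDs on this page all start with the "ytc_" prefix.
        if not row.get("id", "").startswith("ytc_"):
            raise ValueError(f"bad comment id: {row.get('id')!r}")
        for dim, allowed in SCHEMA.items():
            if row.get(dim) not in allowed:
                raise ValueError(
                    f"{row['id']}: {dim}={row.get(dim)!r} not in codebook"
                )
    return rows

# Hypothetical single-row batch in the same shape as the response above.
batch = validate_batch(
    '[{"id":"ytc_example1","responsibility":"none",'
    '"reasoning":"deontological","policy":"none","emotion":"outrage"}]'
)
print(batch[0]["emotion"])  # outrage
```

Validating at ingest time catches the most common LLM coding failure, an invented label outside the codebook, before it silently skews downstream tallies.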