Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- ChatGPT: Hey, are you hungry, would you like some recommendations on local resta… (ytc_UgxaCfzCt…)
- But what if even we are programmed by nature and all the emotions we feel ate ju… (ytc_UgwFlOknd…)
- @Morttozin because for the AI to work it needs a bank of drawings first. And tha… (ytr_UgzdWoEwe…)
- No body sees the face is a digital fake? Come on people. Don’t believe everythin… (ytc_UgzJh58hb…)
- An AI that understands the concept of compassion going on a rampage is wild 😂… (ytc_UgxC4rm0S…)
- First off stop... There is NEVER, such a thing called an " Ai-artist" the person… (ytc_Ugyb-KWa3…)
- As an artist, I like to use it for inspiration, but using AI to make art that yo… (ytc_Ugz9R_c9t…)
- Old talk of Ford with some trade-union boss: Ford created fully automated produc… (ytc_Ugx0mqsGG…)
Comment
It's amazing how this group can flirt with the utmost edges of philosophy and existentialism without any reference to an objective standard by which we derive human ontology. Without such a standard, philosophically speaking, there is absolutely nothing different between us and the rocks that we trample on, let alone AI that apes humanity. Without us as Image-Bearers of the Divine, we don't get to view ourselves as the sole bearer of rights, since such a viewpoint is as arbitrary as proclaiming your favorite ice cream flavor as the objectively best flavor. A standard is needed, and a conversation void of such a standard is philosophically baseless.
I appreciate the news on AI and the technical discussions - these facets are superb! The philosophy is abysmal, however.
Platform: youtube
Timestamp: 2026-02-07T04:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | deontological |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_UgwoXsJ8CyjpeEBxVzx4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugw2BeeWtYDTXDgD6jl4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzJy525o4uk1w82cuN4AaABAg","responsibility":"company","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgznHUSheQH6F3n7Ax14AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"mixed"},
{"id":"ytc_UgxFFabcKC_5Z6HbKD94AaABAg","responsibility":"unclear","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugz8hfacgTX1MD5xG-J4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwsEEpJ8IufH9nmqW94AaABAg","responsibility":"unclear","reasoning":"virtue","policy":"unclear","emotion":"approval"},
{"id":"ytc_Ugzlzf_pNsr_xM91t7d4AaABAg","responsibility":"user","reasoning":"deontological","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzsxJyXfmmOllUhnDB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugw2D9w2kKvEuv8D39p4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"}
]
```
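Looking up a coding by comment ID can be done by parsing the raw LLM response and indexing the entries, as the tool above does. A minimal sketch follows; the function name `index_codings` is illustrative, and the sample data reuses two entries from the response above. It assumes the raw responses are stored as JSON strings in exactly the array-of-objects shape shown.

```python
import json

# Two entries copied from the raw LLM response above serve as sample data.
raw_response = """
[
  {"id": "ytc_UgwoXsJ8CyjpeEBxVzx4AaABAg", "responsibility": "distributed",
   "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_Ugw2BeeWtYDTXDgD6jl4AaABAg", "responsibility": "distributed",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]
"""

def index_codings(raw: str) -> dict:
    """Parse a raw LLM response and index each coding by its comment ID."""
    return {entry["id"]: entry for entry in json.loads(raw)}

codings = index_codings(raw_response)
coding = codings["ytc_UgwoXsJ8CyjpeEBxVzx4AaABAg"]
print(coding["emotion"])  # resignation
```

In practice the parse step would also want to catch `json.JSONDecodeError`, since raw model output is not guaranteed to be valid JSON.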