Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
AI should has been created for assistance not replacement, but those greedy CEOs…
ytc_Ugwqr9YyI…
As an artist the actual AI itself doesn't worry me, the way it is rn and further…
ytc_UgyhFbHJa…
ai generated images are so boring lmao, so lifeless, no purpose no life no satis…
ytr_UgzrvfO5O…
I hate greedy corporate AI but as a person that uses AI chatbots just to you kno…
ytc_Ugz03i3d4…
Considering the models are mirrors of the human psyche, everything said in the v…
ytc_UgwaFf1Qh…
Does anyone ever notice some of his scariest stories have no debunkings from him…
ytc_Ugwqp0GWS…
The AI responds variably. It's there to take the maximum number of individuals i…
ytc_Ugy-L-MGQ…
I tried to confuse chatgpt because chatgpt answers due to it's capability of kno…
ytc_UgwdGLeZX…
Comment
what is the BS justification that "data centers use less power and water than humans!" there's 8 billion of us... and there's like what? a couple dozen major AI services?
When one data center can consume more power and water than a city, it's not a good defense to say that they use less than humanity in total. why is no one pushing back on this guy's talking points? what is this softball interview?
youtube
AI Jobs
2025-11-18T21:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgwopVP_Q31s7P2gsvZ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgzChKFtf2f1fAKJ_dV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"},
{"id":"ytc_Ugy9x91fsbHNvfHPioN4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzpbX0d5qGKmFlvHl94AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzKW1L2GnPWIM5gyHB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugx8JG3yIznz6YiTo8x4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyWEgcAadAKzwzeKb14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzIH-Z5IMVarIAoM8x4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugzjn-1EX2ILZCv_enJ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyBtDptmNk7s3AnEfZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"}
]
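The raw response above is a JSON array with one coding object per comment. A minimal sketch of how such output might be parsed and validated before loading it into the dashboard, assuming the allowed values per dimension are exactly those seen in the samples (the real coding schema may permit more):

```python
import json

# Allowed values per coding dimension, inferred from the sample rows above;
# this enumeration is an assumption, not the authoritative schema.
ALLOWED = {
    "responsibility": {"company", "developer", "ai_itself", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"regulate", "ban", "liability", "none", "unclear"},
    "emotion": {"outrage", "fear", "approval", "indifference", "mixed"},
}

def validate_response(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed coding rows."""
    rows = json.loads(raw)
    return [
        r for r in rows
        if "id" in r and all(r.get(dim) in vals for dim, vals in ALLOWED.items())
    ]

# Example: one well-formed row passes validation.
raw = ('[{"id":"ytc_example","responsibility":"company",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}]')
print(len(validate_response(raw)))  # 1
```

Rows with an unknown value in any dimension are dropped rather than repaired, which keeps the coded dataset consistent even when the model occasionally emits an off-schema label.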