Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
This is something that Ben Palmer did, where he made a game chatgpt website that…
ytc_UgxC1uctD…
No it means you will fool no one. You will enjoy your AI slop rest of your life …
ytr_Ugx2G0gAT…
these things happen to me all the time people steal my art and put it through an…
ytc_UgwiWsk9i…
Doesn't nightshade not really work all to well do to it being code that can be b…
ytc_UgwBHaA-6…
for chatgpt its different because its literqlly programmed to do that and say th…
ytr_Ugz7ANwTU…
@lol-ivan so you see though, complaining about AI art is kinda just a bad stanc…
ytr_Ugx_IacPU…
AI should be trained on anything a human can view or consume. Its learning, not …
ytc_UgyydKYHL…
Yeah because government regulation definitely will make society "ready" for prop…
rdc_je58g1i
Comment
Dear Mr. Bartlett,
My personal conclusion is that AI developers strive to develop a god that will serve THEM. A god to whom they are not accountable. A god who gives them a revelation of the truths of the universe and life without them having to follow it.
The developers of the all-powerful AI are nothing more than clinical megalomaniacs who want to create their own god for their own personal use.
AI is just as evil or good as the collective consciousness of humanity. Because AI merely serves as a mirror of our morality. It learns from US.
youtube
AI Governance
2025-09-08T16:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_Ugx4piICYQyYYVbOuMt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugyb_ZNOdQC5QKdVLRx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwXjHV8Av8Ah_khhu94AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyoVM-t9soA4pQZ2I54AaABAg","responsibility":"user","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugw2uBq5hhjbsgMFvEd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxfnfztVbWh66ndQ_p4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugwr3dFfFXSkh_8qcOh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxGgIaaGybAXZV36gJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwsXgDfwW1LNKiPeEZ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzB_uAgGeFFLXe6YT54AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"}
]
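The raw response above is a JSON array with one object per coded comment. A minimal sketch of how a client might parse such a response and index it by comment ID, with basic validation of the dimension values (the allowed value sets below are inferred from the codes visible on this page, not from the actual codebook, so treat them as an assumption):

```python
import json

# Allowed values per coding dimension, inferred from the samples shown on
# this page; the real codebook may define additional categories (assumption).
ALLOWED = {
    "responsibility": {"none", "company", "developer", "user", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"none", "regulate", "liability", "unclear"},
    "emotion": {"indifference", "fear", "mixed", "outrage", "resignation"},
}

def parse_llm_response(raw: str) -> dict:
    """Parse a raw LLM coding response and index it by comment ID,
    rejecting any row whose dimension values fall outside ALLOWED."""
    rows = json.loads(raw)
    coded = {}
    for row in rows:
        for dim, allowed in ALLOWED.items():
            if row[dim] not in allowed:
                raise ValueError(f"{row['id']}: unexpected {dim}={row[dim]!r}")
        coded[row["id"]] = {dim: row[dim] for dim in ALLOWED}
    return coded

# Example using the row that matches the coding-result table above.
raw = ('[{"id":"ytc_UgwsXgDfwW1LNKiPeEZ4AaABAg","responsibility":"developer",'
       '"reasoning":"deontological","policy":"liability","emotion":"outrage"}]')
coded = parse_llm_response(raw)
print(coded["ytc_UgwsXgDfwW1LNKiPeEZ4AaABAg"]["policy"])  # liability
```

Validating against a fixed value set at parse time catches the most common LLM coding failure, an out-of-vocabulary label, before it silently enters the dataset.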