Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a coded comment by its ID.
Comment

> I still believe that entropy assures that AI left to itself will degrade, not improve. Energy is the limiting factor and it is not increasing at a fast enough rate to allow fast progress. Imagine if a CME event from the sun hits a world controlled by silicon.

youtube · AI Governance · 2025-09-07T02:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
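Each coded comment should carry exactly these dimensions, with values drawn from a fixed codebook. A minimal validation sketch, using only the value sets observed in this batch (the full codebook may include more labels):

```python
# Allowed values per coding dimension, taken from the labels observed
# in this batch of responses (assumption: the real codebook may be larger).
CODEBOOK = {
    "responsibility": {"none", "developer", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "virtue", "contractualist"},
    "policy": {"none", "unclear", "regulate"},
    "emotion": {"fear", "outrage", "indifference", "resignation", "approval", "mixed"},
}

def invalid_fields(record: dict) -> list:
    """Return the names of coded dimensions whose value is missing
    from the record or not in the codebook."""
    return [
        dim for dim, allowed in CODEBOOK.items()
        if record.get(dim) not in allowed
    ]

# The coding result shown above, as a record:
record = {"responsibility": "ai_itself", "reasoning": "consequentialist",
          "policy": "none", "emotion": "fear"}
print(invalid_fields(record))  # []
```

A record with a typo or missing dimension would show up immediately, which is useful before loading a batch into analysis.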
Raw LLM Response

```json
[
  {"id":"ytc_UgwOT7V2zvH_lx_FRt94AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_Ugx8ppd2txmq1UFSg4t4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzfipyxUulla14_6EN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugy7GsC3Ip8SrpTLIuZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugzy6XqzRzBZGDI9Vn14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugy_O4vlJcpEhG60e-t4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzwleDxkZcPPXfTB0N4AaABAg","responsibility":"developer","reasoning":"contractualist","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_UgwrBcSTIVYb9bmJom94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxQsC0Zq-h1kOao_nN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyNGpS2x5lafCQ2MF94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
```
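The model returns one JSON record per comment in the batch; the coding result for a given comment is found by matching its ID. A minimal parsing sketch, assuming the raw response text is available as a string (the single-record string here is an abbreviated stand-in for the full batch above):

```python
import json

# Raw batch response from the coding model: one record per comment,
# each carrying the five coded dimensions. Abbreviated to one record
# (the last entry of the batch shown above) for illustration.
raw_response = '''
[
  {"id": "ytc_UgyNGpS2x5lafCQ2MF94AaABAg",
   "responsibility": "ai_itself", "reasoning": "consequentialist",
   "policy": "none", "emotion": "fear"}
]
'''

EXPECTED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def index_by_id(raw: str) -> dict:
    """Parse a batch response and index records by comment ID,
    skipping any record that lacks one of the expected fields."""
    records = json.loads(raw)
    return {
        r["id"]: r
        for r in records
        if EXPECTED_KEYS <= r.keys()
    }

coded = index_by_id(raw_response)
print(coded["ytc_UgyNGpS2x5lafCQ2MF94AaABAg"]["emotion"])  # fear
```

Indexing by ID is what makes the "look up by comment ID" view cheap: each lookup is a single dictionary access rather than a scan of the batch.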