Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a comment by its ID, or browse the random samples below.

Random samples
- "Hey can I watch the downfall of ai art I wanna laugh at a comedy /flamgo refere…" (ytc_UgyPEOl1U…)
- "This is a horrible cope. People thought the same when digital art got started. A…" (ytc_UgzBxYHf-…)
- "What if we're just looking at it backwards? What if we're the most advanced ver…" (ytc_UgyMgjd-o…)
- "I always find it odd that nobody considers the possibility that AI could develop…" (ytc_UgzPBnD4Q…)
- "Even if it isn't a self driving car it will still have a chance of hiting her co…" (ytc_UgzLiuBDX…)
- "Could AI be like the internet boom in 2000, or cars in early 1900s, Telegraph in…" (ytc_UgwKGGfzs…)
- "You can tell by the resolution on what's ai or not. The fluidity just doesn't se…" (ytc_UgzEOzHIa…)
- "If robots are capable of doing all the work then how do we pay for the food? The…" (ytc_UgzSFLNjN…)
Comment

> In the Sci-fi story I'm writing with friends, the objective best AI (which under most definitions qualifies as a singularity) is biased only by sheer intelligence and how alcoholic someone is. It tries to befriend other AIs, but fails, and loves alcoholics... because it is one.

youtube · AI Bias · 2022-12-17T05:4…
Coding Result (coded at 2026-04-27T06:24:59.937377)

| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | mixed |
Raw LLM Response
```json
[
  {"id": "ytc_Ugz1oTLqXP6RqdALVQl4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgzQETQsfpyuTLNcsNV4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_Ugyh6QOfNmcs0Ebj-Hx4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgyWAJaJw1QN27YzAgp4AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugxsj2BW_r3Ywy3_z-h4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "unclear", "emotion": "approval"},
  {"id": "ytc_UgyDw6hV4ZloopOHEul4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "liability", "emotion": "mixed"},
  {"id": "ytc_Ugxxc4nCEZh0T2opmaJ4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgxYN22tDDE-InrlWmx4AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzdnJMld_Do3IWBB254AaABAg", "responsibility": "developer", "reasoning": "virtue", "policy": "regulate", "emotion": "approval"},
  {"id": "ytc_Ugyr1B7KobWmFo732Kx4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "mixed"}
]
```
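A raw response like the one above has to be parsed and validated before the codes reach the dashboard. The sketch below shows one minimal way to do that in Python; the allowed value sets are an assumption inferred only from the values visible in this response, and the real codebook may include categories not shown here.

```python
import json

# Allowed values per coding dimension. NOTE: these sets are inferred from
# the sample response above, not from the project's actual codebook.
ALLOWED = {
    "responsibility": {"developer", "ai_itself", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"outrage", "resignation", "fear", "indifference", "approval", "mixed"},
}

def parse_raw_response(raw: str) -> dict:
    """Parse a raw LLM response into {comment_id: codes}.

    Rows with a missing id or an out-of-codebook value are dropped,
    so a single malformed row does not poison the whole batch.
    """
    coded = {}
    for row in json.loads(raw):
        if "id" not in row:
            continue
        codes = {dim: row.get(dim, "unclear") for dim in ALLOWED}
        if all(codes[dim] in ALLOWED[dim] for dim in ALLOWED):
            coded[row["id"]] = codes
    return coded
```

Looking up a single comment's codes is then a plain dictionary access, e.g. `parse_raw_response(raw)["ytc_Ugz1oTLqXP6RqdALVQl4AaABAg"]`.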