Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "Ai Bros when the power runs out: on nyuu we can't draw! 😢" — Traditional and Digit… (ytc_Ugyu51Dzp…)
- "You're just taking the light from the artists work with photoreceptors and your…" (ytc_UgxzC9tYi…)
- "Ai art can never replace real artist, cause it doesn't have a soul its just empt…" (ytc_UgyS9BW1n…)
- "AI in healthcare is scary enough, but imagine if we had structured workflows to …" (ytc_UgydeD7fu…)
- "Copyright your lyrics. You can license the composition rights to the lyrics to a…" (ytc_Ugzv9VD8h…)
- "Very grey topic. AI is poised to become a new tool for art, in the same way that…" (ytc_UgzbvBBsk…)
- "“We still flotin king” “Rest easy king”🙄 So why in the hell is AI taking like a …" (ytc_UgwyDf4Di…)
- "yes, because humans are actually sentient beings with brains, unlike a stupid ai…" (ytr_UgwhqS_Yl…)
Comment
It entirely depends on how complex and accurate the results need to be. If you're trying to get the LLM to give an accurate response for a complex task, do not say please, do not try to be nice. Chain of thought has come a long way in LLMs, but as of today, the quality of your prompt will still have a big impact on the quality of your response. The LLMs don't truly understand natural language; it's all about computing the next token (word) with the highest probability, so saying please and thank you isn't useful at all. If you want it to assume some role, like of a kind assistant, just tell it to do that.
youtube · AI Moral Status · 2025-05-26T23:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_UgxmUk6KFi58ORyJNut4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwQDRMdgebmSHzPfVN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwihhZMQhL18fj4CcB4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_Ugxz3ynfVb0w0yE1iVR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugy5wwYcyRS_ldDAKcR4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgzPLIp6-728xUc4ADd4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_Ugw30AT9q1EUEEh5m9F4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgwDiah4Rqd602EpnOp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgymTCgdzRcNpepE7Fp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugxs2LwyPj-YN9_hXc14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"mixed"}
]
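The raw response above is a JSON array of per-comment codes. A minimal sketch of how such output could be parsed and validated before use — the allowed value sets below are inferred only from the sample shown here (the full codebook may define more categories), and the comment ID `ytc_example` is a hypothetical placeholder:

```python
import json

# Allowed values per coding dimension, inferred from the sample output
# above -- an assumption, not the authoritative codebook.
ALLOWED = {
    "responsibility": {"none", "company", "developer", "ai_itself"},
    "reasoning": {"consequentialist", "virtue", "mixed", "unclear"},
    "policy": {"none"},
    "emotion": {"indifference", "approval", "fear", "mixed"},
}

def index_codes(raw: str) -> dict:
    """Parse a raw LLM response (JSON array of coded comments) into a
    dict keyed by comment ID, rejecting any out-of-schema value."""
    coded = {}
    for row in json.loads(raw):
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row.get('id')}: bad {dim}={row.get(dim)!r}")
        coded[row["id"]] = {dim: row[dim] for dim in ALLOWED}
    return coded

# Hypothetical single-row response for illustration.
raw = ('[{"id":"ytc_example","responsibility":"none",'
       '"reasoning":"virtue","policy":"none","emotion":"approval"}]')
codes = index_codes(raw)
print(codes["ytc_example"]["emotion"])  # approval
```

Indexing by ID is what makes the "look up by comment ID" view above cheap: one parse per response, constant-time lookup afterwards.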