# Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a comment by its ID, or inspect one of the random samples below.

## Random samples
- "The AI predicted a black man to get shot based on a number of factors, including…" (`ytr_Ugz977GLR…`)
- "Kids movies should not be AI generated. It's very sad that children's media is t…" (`ytr_UgyOE6pOl…`)
- "Put a gun to my head and tell me to use ai when I draw and I’m pulling the trigg…" (`ytc_UgwmZG_HI…`)
- "Elon musk: ai is more dangerous than nukes / Also elon musk: *makes even more robo…" (`ytc_UgyJcibsN…`)
- "As long as the AI does not take the context into consideration, you are right, @…" (`ytc_UgyW5Ntex…`)
- "The almost universal lack he talks about starting around 52:40 seems very likely…" (`ytc_UgysNgA4N…`)
- "Solving climate change? :D what caused climate change? what is making climate…" (`ytc_UgyEbmp69…`)
- "People said the same thing about the industrial revolution. They were right abou…" (`rdc_dt9j12d`)
## Comment
> I was reading a thread about a specific profession (won't say which one), and the original posting questioned the ethics of the profession. The top response can be summarized this way: we aren't a charity. All business is about making money. The contract is clear (they're dense legalese normal people can't read).
>
> Let that sink in. While true in some sense, you can choose to make money in ways that don't in any way deceive, cheat, or steal. In this case, the profession is intimately involved in healthcare and healthcare decisions. They can pick winners and losers at life.
>
> And they focused on the profit motive. Wow.
>
> That's why we have to regulate AI.
Source: youtube · 2025-06-25T19:2… · ♥ 6
## Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-26T23:09:12.988011 |
## Raw LLM Response
```json
[{"id":"ytc_UgzbHIdTD6-KI-NniYN4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwYJTDUSDzphhbwdph4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugw3RUB16jefQPgbDVd4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgzJqDgokwdDZnzEwtB4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugxn0Y9tppUd0M3iyGx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugw7nQ-6xlqWze2eGm54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzcpSpbVJpPXKxBVfZ4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgyaSUiPjYIPigOe3o54AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"liability","emotion":"indifference"},
{"id":"ytc_Ugz7QSGSuaDq7ArotC14AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugym0_-zy8IihgbmzAN4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"approval"})
```
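A batch response like the one above is only usable if it parses as a JSON array of per-comment records. A minimal sketch of a tolerant reader, assuming the four dimensions shown in the Coding Result table (the function name and fallback behavior are hypothetical, not necessarily the tool's actual pipeline):

```python
import json

# The four coding dimensions from the result table above.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")
UNCLEAR = {d: "unclear" for d in DIMENSIONS}


def codes_for(raw: str, comment_id: str) -> dict:
    """Extract one comment's codes from a raw batch response.

    Falls back to all-'unclear' when the response is not valid JSON
    or the comment ID is absent from the batch. (Illustrative helper;
    the real pipeline may handle failures differently.)
    """
    try:
        records = json.loads(raw)
    except json.JSONDecodeError:
        return dict(UNCLEAR)
    for record in records:
        if record.get("id") == comment_id:
            return {d: record.get(d, "unclear") for d in DIMENSIONS}
    return dict(UNCLEAR)
```

Note that the batch above closes with `)` rather than `]`, so `json.loads` would fail on it; under this sketch every dimension would then fall back to "unclear", which is consistent with the all-"unclear" row shown in the Coding Result table for this comment.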