Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- rdc_m84ikjt — "This is the death of innovation for Meta. Innovation drives the next year profit…"
- ytc_Ugym7O4aQ… — "Mentally disabled somewhat artist here!(AuDHD)(i think it counts). Fuck ai. Far …"
- ytc_UgzDBQOKX… — "I’ve spent the last 6 months really learning about AI and what I’ve taken out of…"
- ytr_UgzTwfP3H… — "but god created us humans. so he gave us the power to create this. and better th…"
- ytc_UgxxkW5pO… — "I cannot understand this man. When asked if he could go back and not do what he …"
- ytc_UgxirOnS9… — "The \"Robot\" is a stand in fighter named Marcel Kharnov... And the other is a coc…"
- ytc_Ugx00NjpJ… — "Exactly right! I had two different sets of Narcissists beating me down…both pare…"
- ytr_UgxX-SZ5c… — "0/10 no arguments except 'artists are whining because ai is training off of thei…"
Comment
Another problem is that on our current trajectory, we will not have a stable economy in 20-30 years which is necessary to have a stable supply chain. Due to environmental developments which will lead to major instabilities. In case we are able to make a transition to renewables this will become not also an energy crisis, but it will be a water crisis. In the meantime AI will ruin job markets and more unemployed people are no longer able to buy services. This will hurt the economy making the bubble economy more relevant. However, bubbles require exponentially more money and in turn exponentially more return. This is never given. In that it will collapse at one point. Maybe AI can then figure out a better economy model without people or AIs having supreme power over others, but I doubt that.
Source: youtube · AI Moral Status · 2026-01-08T14:4… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxXvP06xB_rvHXU8nl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxB2lUMC10V2WCKMdh4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzG6m5nNk-ZQp4yPdd4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzdLgUpm0zqRww_36x4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxXKB0Q9EOyb0TYAQ54AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugz9jWegCqJ5MLH9GXF4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgxjHKweqa7s6ZC0JHB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugy-KZ4-7G2BKQOny894AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"mixed"},
  {"id":"ytc_Ugy88yz9_C5B-z5vALJ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugx-G5YAEcxVcUZLiZt4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"}
]
```
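The raw response above is a JSON array of per-comment codes, one object per comment ID, with one value for each of the four dimensions. A minimal sketch of how such a response might be parsed and validated before use — note that the allowed values below are inferred from the samples shown on this page, not an authoritative codebook, and `parse_codes` is a hypothetical helper name:

```python
import json

# Allowed values per dimension, inferred from the sample output above.
# A real codebook may define additional categories.
ALLOWED = {
    "responsibility": {"developer", "government", "ai_itself", "distributed", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"regulate", "ban", "liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "resignation", "indifference", "mixed"},
}

def parse_codes(raw: str) -> dict:
    """Parse a raw LLM response into {comment_id: codes}, dropping invalid rows."""
    rows = json.loads(raw)
    coded = {}
    for row in rows:
        cid = row.get("id")
        if not cid:
            continue  # skip rows with no comment ID
        # Keep the row only if every dimension has a recognised value.
        if all(row.get(dim) in vals for dim, vals in ALLOWED.items()):
            coded[cid] = {dim: row[dim] for dim in ALLOWED}
    return coded

# Toy input: one valid row and one with an unrecognised responsibility value.
raw = (
    '[{"id":"ytc_x","responsibility":"developer","reasoning":"deontological",'
    '"policy":"liability","emotion":"outrage"},'
    '{"id":"ytc_y","responsibility":"alien","reasoning":"unclear",'
    '"policy":"unclear","emotion":"mixed"}]'
)
codes = parse_codes(raw)
print(sorted(codes))  # only the valid row survives validation
```

Keying the result by comment ID also gives the "look up by comment ID" behaviour directly: `codes["ytc_x"]` returns that comment's coding.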