Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "They should just improve things that need improving. We dont need AI. We need to…" (ytc_UgzXX7uQO…)
- "@hornfelstone4366 you definitely can with the right AI. Wonder AI and Novel AI c…" (ytr_UgwQBWRoJ…)
- "Scientist: Hmm.. this robot needs data to learn.. maybe i'll just connect it to …" (ytc_UgzOusE3O…)
- "I don't care if it's capital-A Art for pretentious snobs, the benefit of benerat…" (ytc_UgxfzHOwJ…)
- "AI must be stopped. This will benefit the evil in the world. All big tech alwa…" (ytc_UgyoEpL0G…)
- "Congratulations you are one of the pricks that have taken the term A.I. and ruin…" (ytc_UgwbiGXS9…)
- "So if the AI assigns to you an extremely high chance of being a criminal, would …" (ytr_UgzBrJ91u…)
- "I think AI, if it gets there ,will completely hide any motive from us while func…" (ytc_UgzGABsKa…)
Comment
A problem with AI art that isn't mentioned here is the long term effects of abandoning our creative process to machines. using our current AI technology (ie deep learning), we cannot create machines that can innovate. If we let the artists rot away and have machines do all the art, we will be stuck in today's art trends for the foreseeable future. God knows how many innovators, Van Gogh's and Mozart's of our time have given up on their dreams because of AI art. We are in the process of abandoning one of the few things that makes us human, this surely will have consequences far beyond our current understanding.
Platform: youtube | Posted: 2024-12-14T19:5… | ♥ 16
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_UgyXJzN1bszE5bu2-Rd4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyKPHa3t4-szDAhxHp4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugx2AoMspOgS8JGJyQh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"disapproval"},
{"id":"ytc_UgwHpJcgwF3ybamKGjt4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwTIfvQhsxjP44Cr7l4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgxBifI8uD0ZSXmtZHl4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyscCDLrIGHV52Q1RB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy039tWuBmzvlPiuIR4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwZwavMVGrQ7Jmw1VN4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwhPsIE2mzi-bo0DWV4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"regulate","emotion":"mixed"}
]
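A response like the one above can be turned into usable coding records with a small validation pass. The sketch below is a minimal, hypothetical parser: the field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) come from the JSON shown here, but the allowed-value sets are only what is observed in this sample and may be a subset of the real codebook.

```python
import json

# Vocabularies inferred from the raw response above; the actual codebook
# may define more values than appear in this one sample.
ALLOWED = {
    "responsibility": {"distributed", "ai_itself", "none", "company", "user", "developer"},
    "reasoning": {"consequentialist", "deontological", "unclear", "mixed", "virtue", "contractualist"},
    "policy": {"none", "liability", "regulate", "industry_self", "unclear"},
    "emotion": {"fear", "outrage", "disapproval", "approval", "resignation", "indifference", "mixed"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse one raw LLM response (a JSON array) into {comment_id: dims}.

    Raises ValueError for records missing a dimension or using a value
    outside the observed vocabulary, so bad codings fail loudly instead
    of silently entering the dataset.
    """
    coded = {}
    for rec in json.loads(raw):
        cid = rec.get("id")
        if not cid:
            raise ValueError(f"record missing id: {rec!r}")
        dims = {}
        for dim, allowed in ALLOWED.items():
            value = rec.get(dim)
            if value not in allowed:
                raise ValueError(f"{cid}: unexpected {dim} value {value!r}")
            dims[dim] = value
        coded[cid] = dims
    return coded

# Example using the first record from the response above.
raw = ('[{"id":"ytc_UgyXJzN1bszE5bu2-Rd4AaABAg",'
       '"responsibility":"distributed","reasoning":"consequentialist",'
       '"policy":"none","emotion":"fear"}]')
coded = parse_coding_response(raw)
print(coded["ytc_UgyXJzN1bszE5bu2-Rd4AaABAg"]["emotion"])  # fear
```

Failing loudly on out-of-vocabulary values is a deliberate choice: LLM coders occasionally drift from the requested labels, and it is better to catch that at parse time than during analysis.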