Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "In a world where Trumpism is spreading worldwide? I pretty much welcome AI takin…" (ytc_Ugw6QsfUP…)
- "They could have just bought an AI off the shelf with the money they had and twee…" (ytc_UgybEz87E…)
- "I've said it over and over and my friends look at me like I have two heads. We a…" (ytc_Ugxh_5yiE…)
- "Hmmmmm, as the wife of someone getting his doctorate in data science (statistics…" (ytc_UgwPS1hPv…)
- "Hey alex , while you could not get chatgpt to admit that it is indeed conscious,…" (ytc_UgwwnROZh…)
- "@asynchronicity No, the point is fear mongering about things that we do not unde…" (ytr_UgyC-ZaiZ…)
- "I feel that AI is good for certain tasks, like low-mid level tech support maybe,…" (ytc_UgyLE2X5o…)
- "I THINK ARTIFICIAL INTELLIGENCE HAS AN ISSUE A INTELLIGENCE ISSUE THEN PERCEIVED…" (ytc_UgyMmuUoh…)
Comment

> I absolutely agree with most of what is said here, and its a step in the right direcion. My only question and concern genuinely. Is after we fix the unethical side of HOW these datasets are trained.
>
> Even if the newer offerings are trained in such a way, and the unethical ones magically erased. How does this fix the issue of unfair flooding of the markets, and competing with the multitudes of artists who rightfully chose to opt out? I don't really see how this will change anything, without extremely strong worker protections/laws being in place restricting AI usage anyway in these fields.

Source: youtube
Posted: 2025-05-18T07:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxuLFLlhKWiweGbVTh4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugx_44_zL4JHbsDhy994AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgwivWjx2bmnk-ZpwId4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgyEKXU5YlO1vUKAlZ94AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgxCQtNv0LlKrldXnn94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_UgxvFATDM2Y2_zAexXZ4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugx57yTR-73_4xXJWzB4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgxU2qFzhOEPREf6ldR4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugzi017ZR3y3t1zB4iN4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgwgoRESMzOOQ9ZZelt4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"approval"}
]
```
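Each raw LLM response is a JSON array of coded records, one per comment ID, with one label per dimension. A minimal validation sketch in Python could check every record against the coding scheme before accepting a batch. Note the allowed label sets below are inferred only from the values visible in this sample; the full scheme may include additional labels.

```python
import json

# Allowed labels per dimension, inferred from the sample output above.
# Assumption: the real coding scheme may define more values than shown here.
SCHEMA = {
    "responsibility": {"company", "government", "user", "ai_itself", "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"fear", "approval", "outrage", "mixed", "resignation", "indifference"},
}

def validate_records(raw: str) -> list:
    """Parse a raw LLM response and check every record against SCHEMA.

    Raises ValueError on a malformed record so a bad batch fails loudly
    instead of silently entering the coded dataset.
    """
    records = json.loads(raw)
    for rec in records:
        if "id" not in rec:
            raise ValueError("record missing 'id': %r" % (rec,))
        for dim, allowed in SCHEMA.items():
            if rec.get(dim) not in allowed:
                raise ValueError("%s: bad %r value %r" % (rec["id"], dim, rec.get(dim)))
    return records

# Hypothetical usage with a one-record batch:
raw = ('[{"id":"ytc_example","responsibility":"company",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"mixed"}]')
print(len(validate_records(raw)))  # → 1
```

Failing loudly here is deliberate: a record with an out-of-scheme label usually means the model drifted from the prompt's label set, and the whole batch should be re-coded rather than patched.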