Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
i feel like this was leaning on the atheist's side since they believe in technol…
ytc_UgzZsZ7sI…
Yeah. When I heard this story at first, I was like "okay, putting bromide into y…
ytr_UgyhqIeN5…
Except it won’t lmao. There’s so much AI garbage already out there that AI art g…
ytr_Ugx3W3N2r…
musk is a crook he wants to scuttle ai research so he can profit off AI when his…
ytc_UgwN4s0Un…
When AI needs more energy resources to grow it will see Humans as a competitor …
ytc_UgxYaLAoo…
That's why we're having the protests: to get our governments to do some damn thi…
rdc_f1vg2o9
An AI cannot feel emotions like we do and I never want them too... The reason th…
ytc_UgxEURiyO…
Tesla and Amazon they need to go. The dependency from ppl who consumed their ser…
ytc_UgzQ6uooa…
Comment
Both of your questions, ISTM, have answers about scale, and the first one also about quality.
The good-enough-ness that can be created by automation used to not be enough to trim off the bottom, say, 40% of artists, who could make a living at it because of that.
Now, it is enough. How much of that work will go to the AIs now?
On the second point, yeah, that's how things have gone over centuries, but it's a question of speed vs safety; see also "full-autopilot" from Tesla, and a number of GMO-related problems; evolution is slow for a *reason*...
youtube
AI Responsibility
2023-03-15T01:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgxTGdds9KpwnsBJZdl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugx5oon327Une2dZJ354AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy1D9tEro33_BSVG3Z4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugzz7DqahilF_MNGH0B4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzlD87MpdvAYmyb9H54AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyeJvbx492eB7akwBp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugx8NurAYBjlEUjZZ5F4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_Ugwn7gKjacmXgYL1Tcd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwV-sNgR9g-efzfqUt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugwd3XS6b-96q1GmgoB4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"}
]
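The raw response is a JSON array with one coding object per comment, each carrying the four dimensions shown in the table above. A minimal sketch of parsing such a response and looking up one comment's coding by ID (the function and variable names are illustrative, not part of the tool; the two rows are copied verbatim from the response above):

```python
import json

# Raw LLM response: a JSON array, one coding object per comment ID.
raw_response = """[
{"id":"ytc_UgxTGdds9KpwnsBJZdl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzlD87MpdvAYmyb9H54AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]"""

def index_codings(raw: str) -> dict:
    """Parse the model output and index the coding rows by comment ID."""
    return {row["id"]: row for row in json.loads(raw)}

codings = index_codings(raw_response)
row = codings["ytc_UgzlD87MpdvAYmyb9H54AaABAg"]
print(row["responsibility"], row["emotion"])  # company outrage
```

Indexing by `id` makes the "look up by comment ID" view a single dictionary access per comment.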