Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a coded comment by its ID, or browse the random samples below.
- "Why the fuck did the robot even slam him at 126 mph aint that robot just for box…" (ytc_Ugybp4StF…)
- "That specifically comeback about how ai is more accessible got me cause they do …" (ytc_Ugy9l2Pg0…)
- "Trainer to robot : Give me back my gun. Robot: No master , but I can give you bu…" (ytc_Ugxg3OImr…)
- "The "worried about how dangerous and powerful ai will be" and "we're making an a…" (ytc_UgxVLbha4…)
- "In 2050 everyone will have a robot in their home like having a tv 😬😬…" (ytc_UgzBu6Fv0…)
- "This is completely different way how fundamentally AI is changing or going to ch…" (ytc_UgxoQEw8h…)
- "The one knock on Ai is that computers will displace workers. Lol, that guy is ei…" (ytc_UgzY_7Qx2…)
- "Depends what kind - if it is LLM they are doing their enemies a favour.…" (ytr_Ugx5-SQMl…)
Comment
I think you missed the mark just a little with that comment… well just the “what is that something else?”
We don’t need to know what that something else is, just that they could be doing something else.
Now the real argument is proving that they could be doing something else, proving that they get some benefit out of not drawing and letting the AI do it, and proving that AI art is a net benefit to society (and therefore we should do it).
Am I on the AI’s side? Hell no, art is the fun stuff that should be done by humans. AI can artificially create art but humans do art that’s more appealing to humans. And I bet you that I can answer all those questions with “it would be more costly to society to replace artists with AI.”
youtube · Viral AI Reaction · 2024-11-04T07:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgxpW15-IYQR9Zs2_L94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyhDPuizDfFKV0eWtd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwzbZz5WDBuxEaxaux4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxdeDg6joGZoVrlJnd4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"disapproval"},
{"id":"ytc_UgylLcJ1JUMKAHnfyzd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"approval"},
{"id":"ytc_Ugy4wF9epgYg6sMGUTV4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugxj_41uMfdLSeHGBop4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzginO4F2966fj4xRp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxeDBUuiut1cRw42Gx4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_Ugzjtn7Rqn-xEt5wzlJ4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
```
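A raw response like the one above is a JSON array of per-comment codings, and the lookup-by-ID view implies each record gets indexed by its `id`. The sketch below shows one way that parsing and indexing could work, assuming the field names from the sample; the allowed value sets are inferred only from the values visible in this response, not from the full codebook, so they are an assumption.

```python
import json

# Allowed values per dimension, inferred from the sample response above.
# The project's actual codebook (not shown here) may define more categories.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "user", "company", "developer"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none", "liability", "regulate"},
    "emotion": {"indifference", "fear", "approval", "disapproval", "mixed"},
}

def index_codings(raw: str) -> dict:
    """Parse a raw LLM response and index codings by comment ID,
    dropping any record with a missing or out-of-codebook value."""
    by_id = {}
    for rec in json.loads(raw):
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            by_id[rec["id"]] = {dim: rec[dim] for dim in ALLOWED}
    return by_id

# Illustrative input with a hypothetical comment ID:
raw = ('[{"id":"ytc_example","responsibility":"none",'
       '"reasoning":"consequentialist","policy":"none",'
       '"emotion":"indifference"}]')
codings = index_codings(raw)
print(codings["ytc_example"]["emotion"])  # indifference
```

Validating against the codebook before indexing means a hallucinated label (e.g. an emotion outside the defined set) is silently dropped rather than stored, which keeps the "Coding Result" table from ever displaying an undefined value.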