Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "In the future u will have to rent a robot to go work so u don't and your going t…" (`ytc_UgzWIad-s…`)
- "AI art is just an efficient way of making animation series. There's no talent in…" (`ytc_UgyzMS9-p…`)
- "Yeah really optimistic when the threat of losing your job to a robot is here.…" (`ytc_Ugz4qidbZ…`)
- "I think the real risk is letting people use AI at all. Oddly enough, the corpora…" (`ytc_Ugw6uFmCw…`)
- "Obviusly if the question is, it it ai or human they wont give you a picture of a…" (`ytr_UgyPOoCYx…`)
- "Let me guess, this was because of Shadiversity's video on AI, isn't it? Not…" (`ytc_UgwPTQnj5…`)
- "This just goes to show AI has no idea. The grammar is far too accurate and the d…" (`rdc_l9vzwh8`)
- "The tech oligarchy put Trump in power because he has no interest in regulating a…" (`ytc_UgwspcHfr…`)
Comment
Eh. I’d really, really like it if all those intellectuals would care about the actual, demonstrable harm AI is doing RIGHT NOW rather then spend any energy being concerned about SkyNet.
We’ve barely scratched the surface when it comes to legal protection for revenge porn, and in so many cases just depends on child pornography laws to punish teenage offenders.
We’re now well into the era where this is going to be turned up to 11 with what current tech can do on the Deep Fake front, and there are just so many people that seem to care a lot more about a Terminator knocking on their door that pushing for legal protections for women and girls that are being harmed TODAY.
youtube · AI Moral Status · 2025-10-30T22:2… · ♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgzKGFzOfu7bIUe0T1F4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxD6SmwAXJAjIrvHk94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw8aRpaCBFPGZngOlN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugz-cqGQwQgMUS64zjd4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwlJvJOJh0vhdqJiOd4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"fear"},
{"id":"ytc_UgzzJxdB032j4nXLdjp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzcvYv9pPqb6vLCPyJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgzAgdTksNK21719Z7V4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwGH9cCpJAkbZd1AwR4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzpaLDu33gCqYXM7pJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
```
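The raw response above is a JSON array of coded rows, one per comment ID. Before trusting such output, it helps to validate each row against the coding scheme. A minimal sketch: the allowed label sets below are inferred only from the values visible on this page (the full codebook may define more), and `validate_batch` is a hypothetical helper, not part of the actual pipeline.

```python
import json

# Allowed labels per dimension, inferred from the coded rows shown above.
# The real codebook may include additional labels.
SCHEMA = {
    "responsibility": {"government", "company", "developer", "ai_itself", "none"},
    "reasoning": {"consequentialist", "unclear"},
    "policy": {"regulate", "liability", "none"},
    "emotion": {"outrage", "fear", "resignation", "indifference", "approval"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only rows that match the schema."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        # Every row must carry a comment ID and only known label values.
        if not isinstance(row, dict) or "id" not in row:
            continue
        if all(row.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            valid.append(row)
    return valid

raw = ('[{"id":"ytc_example","responsibility":"developer",'
       '"reasoning":"consequentialist","policy":"liability","emotion":"outrage"}]')
print(len(validate_batch(raw)))  # prints 1
```

Rows that fail validation can then be re-queued for recoding rather than silently written to the results table.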