Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "I tried janitor ai and they always grape me even if i say am 13💀…" (ytc_UgyFrgE1h…)
- "We should take care with the words we use to describe AI: It doesn't have an int…" (ytc_UgwIsaP5b…)
- "@Mhichil weve got hundreds of years of economic data (just look at the disparit…" (ytr_UgxiI673b…)
- "Buried the lead that he was on anti depressants. Those caused his suicidal more …" (ytc_UgwAZ4tG8…)
- "AI will be able to do all the art we can do. Give it time…" (ytc_UgwQVouWj…)
- "Where I go shopping it is mostly automated. You scan your own bar codes, place y…" (rdc_cymljja)
- "Wouldn't the parents be able to tell the AI generated images are fake? Like they…" (ytc_UgwdQBo1S…)
- "Comparing gpt 4 to gpt 5 is unfair. You should compare it to the last model of o…" (ytc_UgwUSoZ6Y…)
Comment
I don't mean this in a negative way, since artwork should never be stolen, but (and this question is directed at people who know more about AI than me), is it possible that AI used in the medical field could be affected by poisoned images?
I'm assuming that, rather than training a whole new model from scratch, they would use a preexisting model, so is there a danger that poisoned art could mess up AI that does important tasks like surgeries rather than just image generation like it is intended to?
If yes, is there an alternative to this to prevent AI models from stealing art without damaging them?
Source: youtube · Video: Viral AI Reaction · Posted: 2025-01-23T20:3… · ♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgzbB-jmHGGn9bXhntx4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgykIe5HnRHzJuykYAV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugyip1bsVlXEmlaL1VV4AaABAg","responsibility":"user","reasoning":"virtue","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgydGtLI7wibhqWBsud4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz7o8LVd6gm_IF12eJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugx9YJcBDHgBckZzuKJ4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugzpcv4ODb-wl6854HB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxECCn4eGsMHsEYBOR4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgzHDjfB5qNyQpe-KnN4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugyapt-3BORuZEx02QV4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
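A raw response like the one above can be parsed and validated before the codings are stored. The sketch below infers the allowed values for each dimension from the responses shown on this page; the actual codebook may define more categories, so treat the `ALLOWED` sets (and the `validate_batch` helper name) as illustrative assumptions, not the tool's real schema.

```python
import json

# Allowed values per dimension, inferred from the visible LLM responses
# (an assumption -- the full codebook may include additional categories).
ALLOWED = {
    "responsibility": {"company", "ai_itself", "user", "none", "distributed"},
    "reasoning": {"deontological", "consequentialist", "virtue", "mixed", "unclear"},
    "policy": {"liability", "ban", "industry_self", "none", "unclear", "regulate"},
    "emotion": {"outrage", "fear", "approval", "indifference", "mixed"},
}

def validate_batch(raw: str) -> dict:
    """Parse one raw LLM response and index valid codings by comment ID."""
    coded = {}
    for row in json.loads(raw):
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row.get('id')}: bad {dim}={row.get(dim)!r}")
        coded[row["id"]] = {dim: row[dim] for dim in ALLOWED}
    return coded

# Hypothetical single-row batch for illustration.
raw = ('[{"id":"ytc_example","responsibility":"distributed",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]')
print(validate_batch(raw)["ytc_example"]["policy"])  # regulate
```

Indexing by comment ID matches the page's "look up by comment ID" workflow, and failing loudly on an out-of-vocabulary value keeps malformed model output from silently entering the coded dataset.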