Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
but the human error in flying the drones was fun, whereas the algorithms where p…
ytc_Ugz_hEUCG…
the water thing is just so dumb. Datacentres don't drink water. They just use wa…
ytc_UgwrnGHGY…
If there is such a high chance we're living in a simulation and this is all an i…
ytc_UgzAcCDMS…
Jeff
Yeah, he definitely has to live with that...especially because he could pos…
ytr_UgzQOZ0xe…
And people wondered why I chose to homeschool my kids. This alone demonstrates h…
ytc_UgzvtDxlh…
I remember being a 10 year old kid in 2015 exploring internet for the fist time …
ytc_UgxM9CUVV…
It really depends on the data you train AI models. If your data is biased, then …
ytc_Ugwsc4Nyk…
A computer has no understanding of death. The closest is if it loses power, but …
ytc_UgyJByM7r…
Comment
They should only have it to where sone artist if they consented to being a part of the ai art they can and they will get paid every time others generate thier art, that would have been best solution over just getting thier art for free, the ai art generater should be having pay a month for artist used to the generator, if thier so hellbent on it. Its not even thier art who is those people??
youtube
Viral AI Reaction
2026-01-26T02:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | contractualist |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
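A coding result like the one above can be sanity-checked against the codebook before it is stored. A minimal Python sketch, assuming the allowed values per dimension are exactly those observed in this batch (the real codebook may define additional categories; `ALLOWED` and `validate` are illustrative names, not part of the pipeline):

```python
# Allowed values per dimension, as observed in this sample batch only.
# The full codebook may include more categories (an assumption).
ALLOWED = {
    "responsibility": {"company", "user", "ai_itself", "none"},
    "reasoning": {"contractualist", "consequentialist", "deontological",
                  "mixed", "unclear"},
    "policy": {"regulate", "liability", "none"},
    "emotion": {"outrage", "fear", "approval", "resignation",
                "indifference", "mixed"},
}

def validate(row: dict) -> list:
    """Return a list of problems; an empty list means the row passes."""
    problems = []
    for dim, allowed in ALLOWED.items():
        value = row.get(dim)
        if value not in allowed:
            problems.append(f"{dim}={value!r} not in codebook")
    return problems

# The coding result shown in the table above:
row = {"responsibility": "company", "reasoning": "contractualist",
       "policy": "regulate", "emotion": "outrage"}
print(validate(row))  # []
```

Checking every row this way catches a model that drifts outside the category set, which plain JSON parsing would silently accept.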
Raw LLM Response
[
{"id":"ytc_UgzD8Kp_AGtznMfB0PZ4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugztm4eCMBwnyeSTPpZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"resignation"},
{"id":"ytc_Ugx0nwYEjoA6VEDQ-Pp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgyEhNM1vhLlxzqJ4Ot4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_Ugy9-ThJVc4LtJJJ7tl4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzuB1QaibTWg1RXqlp4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzFURG8gQwmHkYCIQJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxIrVBlO2nWG_JHkXh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwlwCIiGpXqTvxHXfV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugyvh2FGz0l_0SpHPXh4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"fear"}
]
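The "look up by comment ID" view above can be reproduced from a raw batch response like this one. A minimal Python sketch, assuming the response is a JSON array of objects keyed by `id` as shown (the `lookup` helper and truncated two-entry sample are illustrative, not the pipeline's own code):

```python
import json

# Truncated copy of the batch response above, kept to two entries for brevity.
raw_response = '''
[
  {"id": "ytc_UgzuB1QaibTWg1RXqlp4AaABAg", "responsibility": "company",
   "reasoning": "contractualist", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgzD8Kp_AGtznMfB0PZ4AaABAg", "responsibility": "ai_itself",
   "reasoning": "mixed", "policy": "none", "emotion": "outrage"}
]
'''

DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def lookup(raw, comment_id):
    """Parse the batch response and return the coding row for one comment ID,
    or None if the ID is not present in the batch."""
    rows = json.loads(raw)
    by_id = {row["id"]: row for row in rows}
    row = by_id.get(comment_id)
    if row is None:
        return None
    # Keep only the coding dimensions, dropping the ID key.
    return {dim: row[dim] for dim in DIMENSIONS}

coding = lookup(raw_response, "ytc_UgzuB1QaibTWg1RXqlp4AaABAg")
print(coding)
# {'responsibility': 'company', 'reasoning': 'contractualist',
#  'policy': 'regulate', 'emotion': 'outrage'}
```

Building the `by_id` index once and reusing it is the natural optimization when many lookups hit the same batch; the one-shot form above keeps the sketch self-contained.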