Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
Everyone says “AI will end humanity” like they’re quoting a movie script. What’s the real proposal—delete all AI and live in 1995 again because cassette tapes felt safe? Meanwhile, AI is already speeding up medical research and helping tackle diseases humans haven’t cracked in centuries. Boohoo, right?
And the fear stories keep growing: first AI “ends humanity,” then what—it’s coming for the dolphins too? Or it somehow neutralizes every army so no one resists? Come on.
We can’t stay in 2025 forever, unless we’re time travelers. The future is as inevitable as the earth turning around the sun. The question isn’t how to rewind—it’s how to guide it forward wisely.
Platform: youtube · Topic: AI Governance · Posted: 2025-08-30T10:2… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgzP6X-FYa3kgrxYQpJ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugy0QbsUMJRC4FIMERF4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxbL5eUHBYusEUzFmR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxB2cAQca72pSgMXnp4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyCt9fLlQiBmACYkLl4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzYN83y61HHAgdkAjt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgwC60-Potw0QM7tfbN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzSRD-b3QmMZoyQBLJ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwH5VUUcKyQs9T4_KR4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxNmjga-InPblH7uPF4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"liability","emotion":"outrage"}
]
```
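The raw response above is a JSON array of per-comment codes, and the "Coding Result" table is one entry from such an array rendered by comment ID. The sketch below shows one way that lookup might work, assuming the response parses as plain JSON; the field names (`responsibility`, `reasoning`, `policy`, `emotion`) come from the data shown, while the `lookup` helper and the chosen sample ID are illustrative.

```python
import json

# A one-entry raw LLM response, using an ID and codes taken from the
# array above (the entry whose codes match the displayed table).
raw_response = """
[
  {"id": "ytc_UgwC60-Potw0QM7tfbN4AaABAg",
   "responsibility": "none", "reasoning": "consequentialist",
   "policy": "none", "emotion": "outrage"}
]
"""

# The four coding dimensions shown in the result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def lookup(codes, comment_id):
    """Return the coded dimensions for one comment ID, or None if absent."""
    for entry in codes:
        if entry["id"] == comment_id:
            # Fall back to "unclear" for any dimension the model omitted.
            return {dim: entry.get(dim, "unclear") for dim in DIMENSIONS}
    return None

codes = json.loads(raw_response)
result = lookup(codes, "ytc_UgwC60-Potw0QM7tfbN4AaABAg")
# result → {"responsibility": "none", "reasoning": "consequentialist",
#           "policy": "none", "emotion": "outrage"}
```

In practice the parse step would also need to tolerate malformed model output (e.g. wrap `json.loads` in a `try`/`except` and re-prompt on failure), since nothing guarantees the LLM returns valid JSON.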