Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- "I think AI art sucks for multiple reasons too, but low key I really liked that f…" (ytc_Ugx6js1rN…)
- "Hard to say. 鼓浪屿 is a pretty fun day. It’s the island off of Xiamen that is also…" (rdc_fjzazkh)
- "Imagine when AI is going to rule the world they about to say racist stuff…" (ytc_UgwDTu50o…)
- "You are paid on usefulness. Not on knowledge or skill outside of the usefulness…" (ytc_Ugw_ATcFA…)
- "I know this robot cant even feel me hitting it but im gonna go for it!! This …" (ytc_UgxgU0rlU…)
- "ME: My grandma passed away last year, and I can't really get over it. I miss her…" (ytc_UgxTr6OUy…)
- "This women clearly did not understand the email she got, as she defends AI stupi…" (ytc_UgxKc6Sfo…)
- "Legal argument aside. Ai art is the death of human excellence and uniqueness. …" (ytc_UgyVEoaQK…)
Comment
However, this is a commentary/opinion piece, not a definitive declaration from Nature itself or a broad scientific consensus. Many experts still disagree, viewing true AGI as requiring more autonomous agency, real-world embodiment, continuous learning without constant human data crutches, or robustness beyond current LLM limitations. The debate remains very much alive—it's just gotten louder.
Source: youtube, 2026-02-11T00:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugx0D0BqJIoJtPN-bnR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugzc-5WPzZ2MsuWxCwx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"mixed"},
  {"id":"ytc_Ugy66fS1HNyCAt1z7IJ4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgzNDYe2N9T01_JR3nx4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgxW3CtlfcG03TqcNjl4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgxPB46nHqsrKhz9Exp4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugwz7iNk2pAlvbEqOH94AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"mixed"},
  {"id":"ytc_UgyDvYV5JmdWL8oLdJZ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgwWZqpHvcU6aCVM3zN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgyDyfj2iqMlQclHBDd4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"mixed"}
]
```
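The raw response above is a JSON array of per-comment codes keyed by comment ID. A minimal sketch of how such a batch response might be parsed into a lookup table, assuming the four dimensions shown in the Coding Result table (the allowed category values below are inferred from the visible responses and may not match the full codebook):

```python
import json

# Allowed values per coding dimension (assumed from the responses shown
# above; the real codebook may define additional categories).
DIMENSIONS = {
    "responsibility": {"company", "developer", "ai_itself", "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "unclear"},
    "policy": {"regulate", "liability", "ban", "industry_self", "none", "unclear"},
    "emotion": {"outrage", "fear", "approval", "indifference", "mixed", "unclear"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM coding response into {comment_id: {dimension: code}}.

    Missing or unrecognized values are coerced to "unclear" so that one
    malformed code does not invalidate the whole batch.
    """
    coded = {}
    for record in json.loads(raw):
        codes = {}
        for dim, allowed in DIMENSIONS.items():
            value = record.get(dim, "unclear")
            codes[dim] = value if value in allowed else "unclear"
        coded[record["id"]] = codes
    return coded

# Hypothetical single-record response for illustration.
raw = '[{"id":"ytc_abc","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}]'
print(parse_coding_response(raw)["ytc_abc"]["policy"])  # regulate
```

Coercing unknown values to "unclear" rather than raising matches how the Coding Result table above records uncodable dimensions.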