Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
LLMs are still silly machines (which still ended up taking jobs). But they are not intelligent at all; they are just extremely complex pattern-matching machines. They don't actually understand the way we do.
AGI as a concept would really be scary, however. And its creators have a purpose: to replace human work. They are bad people.
However, even if AGI were possible (we don't know yet if it really is), it would take massive technology breakthroughs, which are unlikely to come for decades.
There's still time to prevent a massive crisis and maybe stop AGI.
Source: youtube · Viral AI Reaction · 2025-12-04T09:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | ban |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugz5zQx-mkvYhKeWhOx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgxhFoZPzOriT89J_4l4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgydIRenESaLXJ6QGp94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_Ugwm-IbdD0xWwx2Lhuh4AaABAg","responsibility":"user","reasoning":"virtue","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgyZUNH2dT_uxlL7LOh4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgxhYD9Ipart5HSUi8J4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
  {"id":"ytc_UgxZlGZz21xv-MAmPsZ4AaABAg","responsibility":"user","reasoning":"contractualist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxtYv1H6fsPVNlEzvl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugx7TiHomK5PMU3Xj_94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgwDAxKWPqdcLtVaL-l4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
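Since each raw response is a JSON array of coding objects keyed by comment ID, looking up the coding for a single comment is a small parsing step. Below is a minimal sketch, assuming the response text is available as a string; the field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) come from the dump above, while the example IDs are taken from it and the helper function name is hypothetical.

```python
import json

# A small excerpt of a raw LLM response, shaped like the array above:
# one object per coded comment, with the four coding dimensions.
raw_response = """
[
  {"id": "ytc_UgyZUNH2dT_uxlL7LOh4AaABAg",
   "responsibility": "developer", "reasoning": "deontological",
   "policy": "ban", "emotion": "fear"},
  {"id": "ytc_UgxhFoZPzOriT89J_4l4AaABAg",
   "responsibility": "company", "reasoning": "consequentialist",
   "policy": "regulate", "emotion": "outrage"}
]
"""

def index_by_comment_id(response_text: str) -> dict:
    """Parse a raw response and index the coding dimensions by comment ID."""
    codings = json.loads(response_text)
    return {
        row["id"]: {k: v for k, v in row.items() if k != "id"}
        for row in codings
    }

codings = index_by_comment_id(raw_response)
print(codings["ytc_UgyZUNH2dT_uxlL7LOh4AaABAg"]["policy"])  # ban
```

Indexing by ID up front makes repeated lookups (as in the "look up by comment ID" workflow) constant-time instead of a scan per query.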