Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- Once the monkey AI has been programmed they can do without much help somebody wi… (ytc_UgxCa-0Kl…)
- @ddeeffmusicarts Exactly. I think AI should be restricted to menial work and wha… (ytr_UgzULmXSN…)
- Trying to imagine the future with AI ingrained in every aspect of society (art, … (ytc_Ugx3XGKQ8…)
- The reason AI a "threat" is that all the Wall St and corporate profit mongers ar… (ytc_UgzbL2un8…)
- Just found a new tech babe icon in the AI space. This was perfect 🥳😍!!!… (ytc_UgyVnyzgH…)
- You said in 2-3 years Tesla will be smart. What do you think Wayno will be 2-3 … (ytc_UgwfuoSFI…)
- nowadays every company is going towards AI but the sad part is that through AI ,… (ytc_Ugx_0n8l4…)
- WSJ seems to be conflating Autopilot with Full Self Drive. These are two complet… (ytc_Ugzjv_dKG…)
Comment
I’m yet to see AI not flat out give the wrong answer when I’ve queried engineering codes for work. It’s like they have trained it on keyboard Reddit engineering opinions or something and it just spits junk out. AI in the creative industries (art/music/film) also appears to spit out junk. I’m yet to witness anything world changing from AI from my perspective of the world, it just seems like a massive fad.
Platform: youtube
Video: Viral AI Reaction
Date: 2025-11-24T00:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | liability |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
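A coded record like the one above can be sanity-checked against the label vocabulary before it is stored. The allowed sets below are a sketch inferred from the values visible on this page; the project's actual codebook may define additional labels.

```python
# Allowed values per coding dimension, inferred from labels seen on this
# page (assumption: the real codebook may include more categories).
ALLOWED = {
    "responsibility": {"developer", "company", "distributed", "ai_itself", "none"},
    "reasoning": {"deontological", "consequentialist", "virtue", "unclear"},
    "policy": {"liability", "regulate", "industry_self", "none", "unclear"},
    "emotion": {"mixed", "approval", "indifference", "outrage", "fear", "resignation"},
}

def invalid_fields(record: dict) -> list:
    """Return the dimensions whose value is missing or outside the vocabulary."""
    return [dim for dim, allowed in ALLOWED.items()
            if record.get(dim) not in allowed]

# The record shown in the Coding Result table passes cleanly:
coded = {"responsibility": "developer", "reasoning": "deontological",
         "policy": "liability", "emotion": "mixed"}
print(invalid_fields(coded))  # []
```

Flagging out-of-vocabulary values this way catches the most common LLM coding failure: a label that is plausible English but not in the codebook.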
Raw LLM Response
[
{"id":"ytc_Ugz9dpeVdMm70fBnBU14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwTx5SGGBUwDUJEBLp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugw_dzshUkd-zETQrud4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxjbnRzGAUX99TNByd4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyD8k5cQTxHqDZfur54AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugx8rDkm9euUEUIojL14AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugz0qcxiwHFoMcvjpyV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"resignation"},
{"id":"ytc_UgyX0FAqdRlJIscXbdZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwvZopQ9xVyJl8aOK54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugz_EQ_qayiHbMhVA5Z4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"mixed"}
]
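The "look up by comment ID" view can be reproduced directly from a raw response like the one above. A minimal sketch, assuming the model returns a JSON array of records each carrying an `id` field (as in the array shown here; IDs below are taken from it):

```python
import json

# Raw batch response from the coder model: a JSON array of per-comment
# records (two entries from the array above, abbreviated for the sketch).
raw = '''[
 {"id":"ytc_Ugz9dpeVdMm70fBnBU14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"approval"},
 {"id":"ytc_Ugz_EQ_qayiHbMhVA5Z4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"mixed"}
]'''

def index_by_id(raw_json: str) -> dict:
    """Parse a raw batch response and index its records by comment ID."""
    records = json.loads(raw_json)  # raises JSONDecodeError on malformed output
    return {rec["id"]: rec for rec in records}

lookup = index_by_id(raw)
rec = lookup["ytc_Ugz_EQ_qayiHbMhVA5Z4AaABAg"]
print(rec["responsibility"], rec["policy"])  # developer liability
```

Indexing by ID also makes it easy to detect comments the model silently dropped from a batch: compare the keys of the lookup against the IDs that were sent.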