Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- `ytc_UgxXOGW8Z…`: 12:00 section is completely stupid from your side. "Hey, guys let's compare pist…
- `ytr_Ugyn3ZBSc…`: This is very true. I am an engineer and we use AI. It does a very good job with …
- `rdc_gtwwlch`: Contrary to public perception, hacking is not the major way to steal the techs. …
- `ytr_Ugz4wi8Y_…`: @zombie_w33d You ain’t even finished the video because the argueme…
- `ytc_UgxOHEORt…`: I've been drawing since i was very young, started digital art in 2020, and im cu…
- `ytc_Ugx7nRJlg…`: AI in fiction: *Goes rogue and enslaves/kills humanity*
  AI in reality: "Despite …
- `ytc_Ugzfd9Wo5…`: Theres a reason the chosen people have their own schools, and why Public schools…
- `ytc_Ugym3Ywtj…`: Robot: We did it!
  Accidentally points and activates trigger on human due to exci…
Comment

> Regarding the people that want to use AI rather than learn, this is nothing new and has existed in various forms as new media arise. For example, 15 years ago I was a team lead at at a startup. The younger and more junior of the crew all wanted to know what they needed to do to get promoted. I gave them a list of technologies to learn, with links to free resources online. Of over 30 people, only one took up the challenge and learned. The rest just complained that they didn't need to actually know these things to solve problems, but could just 'look them up' as needed to copy them then forget them. The problem with this approach is that they never knew when what they were copying was correct, and if it was correct for their use, ...and they still know nothing.

Platform: youtube
Video: Viral AI Reaction
Posted: 2024-10-23T23:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | virtue |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_Ugzxj_8fiym0RKduOHJ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyU0CfNRxKa7moV6gV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"disapproval"},
{"id":"ytc_UgyprSaaaXriJqPZIVx4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_UgxKffCCAJnxAajFqSF4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyrD4M-Hsz98YF6ryN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwDyzUmPImuYISDuUZ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugz-RY5NVyA3qdIwNKJ4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwK5uebkeTJA_eVjFt4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UgykIryUzTSebtlQi4V4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgwVm9BWirju30i6FC94AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
```
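A raw response like the one above should be validated before its rows are merged into the coded dataset, since the model can emit malformed rows or values outside the codebook. The sketch below is a minimal, hypothetical validator: the allowed values per dimension are inferred only from the codes visible in this dump, and the real codebook may include others.

```python
import json

# Allowed values per coding dimension, inferred from this dump (assumption:
# the full codebook may define additional codes not seen here).
CODEBOOK = {
    "responsibility": {"user", "developer", "ai_itself"},
    "reasoning": {"deontological", "consequentialist", "virtue", "mixed"},
    "policy": {"none", "ban", "industry_self", "liability", "unclear"},
    "emotion": {"outrage", "disapproval", "indifference", "fear",
                "resignation", "mixed", "approval"},
}

def validate_codes(raw: str) -> dict:
    """Parse a raw LLM response and index the valid rows by comment ID.

    Rows with a missing or unknown value in any dimension are dropped,
    so a malformed response never silently corrupts the coded data.
    """
    rows = json.loads(raw)
    coded = {}
    for row in rows:
        if all(row.get(dim) in allowed for dim, allowed in CODEBOOK.items()):
            coded[row["id"]] = {dim: row[dim] for dim in CODEBOOK}
    return coded

# Example with a hypothetical comment ID:
raw = ('[{"id":"ytc_x","responsibility":"user","reasoning":"virtue",'
       '"policy":"none","emotion":"mixed"}]')
coded = validate_codes(raw)
print(coded["ytc_x"]["reasoning"])  # virtue
```

Keying the result by comment ID also supports the "look up by ID" view: fetching a single coded comment is then a dictionary access rather than a scan over the raw response.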