Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "We appreciate your interest in AI technology. However, it's important to remembe…" (`ytr_UgxrVlfiR…`)
- "100%, I promoted my disliking AI to hating AI a while ago. Anyone who idolizing …" (`ytc_UgwVyqTZQ…`)
- "And even when you ask it to judge, it's pretty reasonable in its assessment. And…" (`ytr_UgxNlhT7e…`)
- "Get it Bernie, the only person I've heard come up with sensible solutions to the…" (`ytc_Ugynj1JWs…`)
- "People are literally just doing this out of spite at this point. If it looks goo…" (`ytc_UgyPAdw7H…`)
- "In the long run, this is good technology. The problem is we have to get rid of c…" (`ytc_UgzMxnA_S…`)
- "AI hype is hilarious - guys, its just a model that is modeling on top of data. N…" (`ytc_Ugzw28OxN…`)
- "Humans adapt, that’s why we are the preeminent species on the planet / AI will h…" (`ytc_UgxBK3p0q…`)
Comment

> Bernie doesn’t know what he’s talking about. He’s falling for the AI hype like all the other shmucks. A widely reported 2025 MIT study, which found that 95% of corporate generative AI pilot projects had failed to deliver a significant return on investment.

youtube · AI Jobs · 2025-10-08T20:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[{"id":"ytc_UgzQ3GVWLkthq-l423Z4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx4hWxLsIWQ2B409Dx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugx_dRq6t5HSF_nNTuR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwFP4PKPvb4TrFxT3h4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxTF0niclodY3NraRN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"mixed"},
{"id":"ytc_UgyvdiArBj556dbCxIh4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzywyOLmbHK7PXtliJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugw628sKWFLo9agK6Yd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwcE2Ajw0paAbnydqN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxzSbHrgTItf2oWwjd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}]
```
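Downstream code has to treat a raw response like this defensively: the model may emit non-JSON text, drop the `id` field, or invent a value outside the coding scheme. A minimal parsing sketch — the allowed category sets below are inferred only from the values visible in this response, not from an authoritative codebook, and `parse_coded_batch` is a hypothetical helper name:

```python
import json

# Allowed values per dimension, inferred from the sample output above.
# Assumption: the real codebook may include additional categories.
SCHEMA = {
    "responsibility": {"none", "developer", "company", "ai_itself"},
    "reasoning": {"consequentialist", "deontological"},
    "policy": {"none", "liability", "industry_self", "regulate"},
    "emotion": {"indifference", "approval", "outrage", "mixed", "fear", "resignation"},
}

def parse_coded_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response, keeping only records that match the schema."""
    try:
        records = json.loads(raw)
    except json.JSONDecodeError:
        return []  # model emitted non-JSON text; caller can retry the batch
    valid = []
    for rec in records if isinstance(records, list) else []:
        if not isinstance(rec, dict) or "id" not in rec:
            continue  # unidentifiable record: no comment ID to attach codes to
        if all(rec.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            valid.append(rec)
    return valid
```

Records that fail validation are dropped rather than stored, so a single malformed entry does not poison the coded dataset; the response shown above would pass through intact.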