Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- ytr_UgxCRrZF5…: "Haha, indeed! Sophia does have that futuristic vibe. If you're into AI tech and …"
- ytc_UgxwxYEoQ…: "Wait till AI, turns Into SKY-NET. FROM THE TERMINATOR FRANCHISE MOVIE'S. The GE…"
- ytc_Ugw8ZFGpa…: "This is complete rubbish! You need to understand liability. When a doctor messes…"
- ytr_UgwKkjeQ1…: "Companies will still profit off AI greatly even if we tax them to help fund UBI.…"
- ytc_UgxcWHuQX…: "Well as long as they continue to monetize everything as usual the human needs wi…"
- ytc_UgzYYqaUC…: "He is making ai robot cause ones start can't be stop if it gets to a wrong group…"
- ytc_Ugxt_tvLT…: "Self driving cars are not ready to be released onto the public they are not read…"
- ytr_Ugw00k6lt…: "Yeah. I also think most people will prefer real people take care of them as nurs…"
Comment
100% of this is due to developer error and shortcuts. they use the cheapest models with 0 temperature setting, no evals, and one shot generation. Current LLMs are powerful enough to provide a lot of value to all of the business stuff mentioned. It’s just the fact that it takes effort to build a reliable system, like most things it’s not fool proof out of the box.
Platform: youtube
Topic: AI Responsibility
Posted: 2025-09-30T21:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_Ugwa2Kmr-yDoZ4RZwpB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzEVvTlbZDP31vEjiJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzdxTJcMlE44fYFh014AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzLMvWr6EkdXfh55lZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgwKLIoTIZupR-digid4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugx99WGi7UEpXHaeDwF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgzBnJ2XE4geSVJqZtF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyTFNb0SUHmJMG60nN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwoDM_JGyYAuiK3KWx4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwHUJT_8ejOURstH2R4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
```
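The raw response above is a JSON array with one object per coded comment, so the per-comment coding shown in the table can be recovered by indexing the array by `id`. A minimal Python sketch of that lookup, using two entries copied from the response (the variable names are illustrative, not taken from the actual pipeline):

```python
import json

# Two entries copied verbatim from the raw model response above.
raw = """[
  {"id": "ytc_UgzLMvWr6EkdXfh55lZ4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_UgwHUJT_8ejOURstH2R4AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "liability", "emotion": "outrage"}
]"""

# Index the codings by comment ID so any coded comment can be inspected.
codings = {row["id"]: row for row in json.loads(raw)}

# The last entry corresponds to the Coding Result table shown above.
row = codings["ytc_UgwHUJT_8ejOURstH2R4AaABAg"]
print(row["responsibility"], row["emotion"])  # developer outrage
```

Parsing the whole array (rather than grepping for one ID) also surfaces malformed model output early: `json.loads` raises if the response is not valid JSON.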