Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a record directly by comment ID.
Random samples — click to inspect
- Wrong, people have a right to create derivative works and you would have to pass… (`ytr_UgxC-8EUb…`)
- As someone who does AI "art" for fun, that is all I see it as. I do respect real… (`ytc_Ugzl2V8aL…`)
- The issue with FSD is not with Technology. It is the overselling of it. You cann… (`ytc_UgyfqxdC6…`)
- how can companies expect anyone to pay to use AI when they never paid to use You… (`ytc_UgyQCD7JI…`)
- @Dry-aged-millenial I write screenplay and plan the plot the character developme… (`ytr_UgwyYMCux…`)
- AI will never be able to fix a car or could it? I am a auto Technician am I onto… (`ytc_Ugy40-q5M…`)
- Robot at 0:01 : "what is my purpose?" / Me: "you pass butter" / Robot: "Oh.. my god"… (`ytc_Ugi9SRakL…`)
- AI will write subroutines, functions and small system programs, however human pr… (`ytc_UgwPhaWLT…`)
Comment
There are a lot of things wrong with this, but one thing I've been learning about recently is AI's propensity for hallucinating/lying/making up answers to things with complete confidence, and only backing down after specific pushback from the user. This could be disastrous if used for teaching. As if the current educational system wasn't in a bad enough state.
Source: youtube
Posted: 2026-03-30T15:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_UgwxbjbwY-Y1S09F5RF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugy564-Ejk34vxpLhbB4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgzskqYVofSqp5CLI954AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgzasV4jYRtEIN2II8B4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgxKMC9ugezNV9gbWwp4AaABAg","responsibility":"none","reasoning":"mixed","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgyxE8fWkyphqaTN_7x4AaABAg","responsibility":"user","reasoning":"mixed","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgzzozLXfK1OuPiPrt94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwykM52DIwHyRsEKD54AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgyoqqHUMvLCQyKuuBp4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
{"id":"ytc_UgxO9HZ_G2d9oY9hY_l4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"}
]
```
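The raw LLM response is a JSON array with one object per coded comment. A lookup by comment ID, as the viewer above offers, can be sketched as follows; this is a minimal illustration assuming the response string parses as valid JSON (the two entries are taken from the sample above, and the variable names are hypothetical, not part of the tool):

```python
import json

# Raw LLM response: a JSON array of per-comment codes
# (truncated here to two entries from the sample above).
raw_response = '''[
  {"id": "ytc_UgwxbjbwY-Y1S09F5RF4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_Ugy564-Ejk34vxpLhbB4AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "none", "emotion": "fear"}
]'''

# Index the coded rows by comment ID for constant-time lookup.
codes_by_id = {row["id"]: row for row in json.loads(raw_response)}

code = codes_by_id["ytc_UgwxbjbwY-Y1S09F5RF4AaABAg"]
print(code["policy"])   # ban
print(code["emotion"])  # outrage
```

Indexing once into a dict keyed by `id` avoids rescanning the array for every lookup when a batch response covers many comments.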