Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
People talking about how it’s dangerous self driving cars but to me if it happen…
ytc_Ugz-OOC7D…
Lets assumed ai chip is not embedded in the human brain in 100 years from now. A…
ytc_UgyYzYiRH…
What he's said at the beginning is 💯. If we don't lead with AI an advisory will…
ytc_UgyxnBDIo…
Thanks for the comment, @Corkyjordan89! I'm glad you enjoyed the video. By the w…
ytr_UgwKbhvg7…
it's not possible for a.i. to be conscious. it can only ever run code. processin…
ytc_UgzIKUC96…
If it was true ai art then yes that’s fine. But this is more of algorithm art. T…
ytc_UgxRVHqkE…
>Has anybody else had to deal with near total blackout of AI tooling?
Yes. S…
rdc_l56ug2t
I'm sorta confused by all this. Like do people not know that AI, like Grok, and …
ytc_UgxNyV_GJ…
Comment
“They’re taking your jobs.” Who? China? Central Americans? Or the Politicians and CEO’s who are offshoring (and enabling it in the politicians stance) those jobs? Seems like AI is taking our jobs with American CEO’s at the forefront of it. Make it make sense. Another propaganda piece I broke from literally just now. “We’re the largest country of consumerism on Earth.” Okay, create jobs to make the products that we consume, and maybe we won’t have the propagandized “crisis” politicians are portraying.
youtube
AI Jobs
2025-12-23T15:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[{"id":"ytc_UgwJVFhEAUIQFqsBA-h4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},{"id":"ytc_UgyCrfP-u0bCnMThxz94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},{"id":"ytc_UgwkBv16azEDGak8Twx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},{"id":"ytc_UgwQp0NLIBFUJD7X_gV4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},{"id":"ytc_UgxsOFfyoBfVoO1NKlB4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"resignation"},{"id":"ytc_Ugx0C48P5I64YuGzKMh4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},{"id":"ytc_Ugx3bMkRGq39uderXfB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},{"id":"ytc_UgwSxniyG0tJigw-WDF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"skepticism"},{"id":"ytc_UgypohBxOP1bOeBES194AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"outrage"},{"id":"ytc_UgwX2Rg2f6D-eZUrg8N4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}]
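The raw response above is a JSON array of per-comment codes along the four dimensions shown in the Coding Result table. A minimal sketch of how such a batch might be parsed and sanity-checked, assuming the value sets observed in this sample (the full codebook may define more values, and `parse_raw_response` / `OBSERVED_VALUES` are hypothetical names, not part of the pipeline itself):

```python
import json

# Dimension values observed in this sample batch; the codebook may allow more.
OBSERVED_VALUES = {
    "responsibility": {"company", "developer", "ai_itself", "none"},
    "reasoning": {"deontological", "consequentialist", "unclear"},
    "policy": {"regulate", "liability", "none"},
    "emotion": {"outrage", "approval", "indifference", "resignation",
                "mixed", "skepticism"},
}

def parse_raw_response(raw: str) -> list[dict]:
    """Parse a raw LLM batch response; flag rows with unexpected values."""
    rows = json.loads(raw)
    for row in rows:
        for dim, allowed in OBSERVED_VALUES.items():
            if row.get(dim) not in allowed:
                # Flag rather than drop: a value outside this set may still
                # be legitimate under the full codebook.
                row.setdefault("_flags", []).append(dim)
    return rows

# One row from the batch above, re-used as a worked example.
raw = ('[{"id":"ytc_Ugx0C48P5I64YuGzKMh4AaABAg","responsibility":"company",'
       '"reasoning":"deontological","policy":"regulate","emotion":"outrage"}]')
coded = parse_raw_response(raw)
print(coded[0]["policy"])  # -> regulate
```

Flagging instead of rejecting keeps the lookup-by-comment-ID view usable even when the model emits an off-codebook value.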