Raw LLM Responses
Inspect the exact model output for any coded comment, or look a comment up directly by its comment ID.
Random samples

- I feel like AI art supporters don't actually like AI art. They just for some rea… — `ytc_UgzKoDfzw…`
- Adolescent AI - Anthropic, rights? Children don't get Rights till they grow up, … — `ytc_UgxyIGdO1…`
- I do both and I hear you, but for me drawing for pleasure and drawing because I … — `ytr_Ugwt9dWfe…`
- Tbh I’m pretty positive the reason why AI scored well on divergent thinking is b… — `ytc_UgxgRRJet…`
- That's why Waymo claims their Lidars could save lives; cuz they sense all object… — `ytr_UgxxNlc3H…`
- Ai art should have just gotten a delayed released. But ai companies need the mon… — `ytc_UgwYustB1…`
- It's insanely disheartening when these AI prompt bros just happen to be my paren… — `ytc_UgyoQJyjz…`
- A decade ago, I was studying an extensive degree in a private university for Ana… — `ytc_UgwPtBqp7…`
Comment
How about using the money from AI to fund childcare, public schools, health care, senior assisted living, art and music, and things that are not seen as profitable but are necessary for the well-being of the society? Instead of trying to have AI teach our kids, why not train more teachers and give them higher salaries? Big companies replacing workers with AI should pay some AI taxes to fund these so that the society can benefit from the technology, not just a few billionaires. There should also be regulation about AI in art and music.
Source: youtube · Posted: 2025-08-25T13:4… · ♥ 7
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | contractualist |
| Policy | liability |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_UgyOr0Ef9puXbvr2HNl4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw_WgppNpMAvvDIT894AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugz9TItLQZbNr-De_IZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugyd_jhSPHFcpWedvLF4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"liability","emotion":"approval"},
{"id":"ytc_Ugzox6ZmbRhvSbDlxXJ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyVk2En78_AUneogv94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzZXROKT3we6SlUc4d4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwXOWHrEsMqCfoy2jF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzL5AbG9Jf6aCLg7L94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyoFUb0J2Yu8VfgDRh4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
```
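The lookup-by-comment-ID step can be sketched as parsing the raw model response (a JSON array of per-comment records, as shown above) and indexing it by `id`. This is a minimal illustration, not the tool's actual implementation; `index_by_id` is a hypothetical helper, and the two sample records are copied from the response above.

```python
import json

# Raw batch response from the coding model: a JSON array with one record
# per comment (format taken from the "Raw LLM Response" dump above).
raw_response = """[
 {"id":"ytc_Ugyd_jhSPHFcpWedvLF4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"liability","emotion":"approval"},
 {"id":"ytc_Ugzox6ZmbRhvSbDlxXJ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"}
]"""

def index_by_id(raw: str) -> dict:
    """Parse a raw model response and index the coded records by comment ID."""
    records = json.loads(raw)
    return {rec["id"]: rec for rec in records}

coded = index_by_id(raw_response)
# Fetch the coding result for one comment by its ID.
print(coded["ytc_Ugyd_jhSPHFcpWedvLF4AaABAg"]["policy"])  # liability
```

With such an index, rendering the "Coding Result" table for a clicked comment is a single dictionary lookup rather than a scan of the whole batch.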