Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "This is a really bad warning, imagine if someone can get a medical license using…" (ytc_Ugwd3eHMx…)
- "Stupidest thing is the software never said it was a 100% match, it said somethin…" (ytr_UgwmZALpX…)
- "Humans are to blame lmao. AI wouldnt get funny ideas if these damaging thoughts …" (ytc_Ugz7Jcb-W…)
- "It doesn't make it any easier to go to war, because the people who want more war…" (ytc_Ugy9m0mE6…)
- "That ai was programmed by a person so idk, how should it know when the ai has no…" (ytc_UgyMEISlx…)
- "Haha, that's a funny take! While Sophia may not need skincare like humans do, he…" (ytr_UgwzkmpD0…)
- "I fiddle with AI art on my own PC. I'll never post it publicly, but it has been …" (ytc_UgyPv2W2Q…)
- "the ai art is worse then false copyright claiming on somjeone's art poster becau…" (ytc_Ugwcp5YqX…)
Comment
Thinking AI replacing all jobs would be a social catastrophe because without job people can't afford living is american mindset in a nutshell. If AI replace all jobs, you have a functional society, food and shelter for everyone free to do wtf they want. Issue are greedy people that want more than other to feel good and won't accept everyone get the same share. Also how you distribute property as some are better than other. But human don't need to work to live, that's not the purpose of work.
youtube
AI Moral Status
2026-03-10T08:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | mixed |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_UgwcMhA2vly9pNH5W0B4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzYJM5lFLApdr8MAqR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgweJt91u6UxmoANhMR4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxBHEX0P9NxopVMbCN4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgyfHR2x2jlcKW_SwZ94AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgxVDR_xE3WXoSijKTN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxEpe7rFDzrMqL3n3l4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxdHidkzkLmhlE0O-V4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzuX3MZdNnyB_NPmx14AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwDLnHqp4BEG0osbOF4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"outrage"}
]
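The raw response above is a JSON array, one record per comment, with the four coding dimensions shown in the table. A minimal sketch of how such a batch could be parsed and validated before indexing by comment ID — the allowed value sets below are assumptions inferred from the values visible in this dump and may be incomplete:

```python
import json

# Allowed values per coding dimension (ASSUMED from values seen in this dump;
# the real codebook may define more categories).
SCHEMA = {
    "responsibility": {"ai_itself", "developer", "government", "distributed", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"regulate", "liability", "industry_self", "none", "unclear"},
    "emotion": {"outrage", "fear", "approval", "indifference", "mixed"},
}

def parse_batch(raw: str) -> dict:
    """Parse a raw LLM batch response and index the records by comment ID.

    Raises ValueError on a missing ID or an out-of-schema dimension value,
    so malformed model output fails loudly instead of polluting the dataset.
    """
    records = json.loads(raw)
    coded = {}
    for rec in records:
        cid = rec.get("id")
        if not cid:
            raise ValueError(f"record missing 'id': {rec!r}")
        for dim, allowed in SCHEMA.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{cid}: unexpected value {rec.get(dim)!r} for {dim!r}")
        coded[cid] = {dim: rec[dim] for dim in SCHEMA}
    return coded

# Example with one record in the same shape as the response above:
raw = ('[{"id":"ytc_X","responsibility":"distributed","reasoning":"mixed",'
       '"policy":"regulate","emotion":"outrage"}]')
coded = parse_batch(raw)
```

Indexing by ID is what makes the "Look up by comment ID" view cheap: each lookup is a single dict access rather than a scan over the batch.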