# Raw LLM Responses

Inspect the exact model output for any coded comment; records can be looked up by comment ID.
## Random samples

- `ytc_Ugxlf5AMF…`: "THE AI TRI TUE NHAN TAO THE OBAMA CARE CAU KET SCAM HACK THE 9 CHENE & WEB W…"
- `ytc_UgxskHHix…`: "Sounds more like a people problem than an ai problem. Should probably work on th…"
- `ytr_UgzAf8foa…`: "@TheUnlikelyProphet1If that break away Glacier , the size of Florida , melts , …"
- `ytc_UgzI-SG6E…`: "Yes exactly we shouldn’t call not even call them artist we should call them ai g…"
- `ytc_Ugj-XchBu…`: "Once robots are able to respect property rights, they automatically have the rig…"
- `ytr_Ugxq1QP0r…`: "@jpmor7327When everything got automation human will free from their work and do …"
- `ytr_Ugxg5lseU…`: "Universal Income (beyond basic) should be the goal with AI. I rather not work if…"
- `ytc_Ugz_ygx1y…`: "I say, let them hire exclusively “AI employees”. They’ll soon find out how badly…"
## Comment

> AI is just way too expensive to train. The ethical way would be a company like Disney would buy its own model then train it on their data set so no copyright infringement happens. But training AI takes forever with large data centers sucking up electricity and water for cooling. Then it also leads to stagnation and less experimentation. When a new style is developed, you can't just add that to the already existing set. You have to retrain it from zero. Its cheaper to just hire people.

| Field | Value |
|---|---|
| Source | youtube |
| Video | Viral AI Reaction |
| Posted | 2025-10-22T08:2… |
## Coding Result

| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | mixed |
| Coded at | 2026-04-27T06:26:44.938723 |
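The four dimensions above take values from a fixed codebook. A minimal validation sketch in Python, with the allowed categories inferred from the values visible on this page (the real codebook may contain more):

```python
# Allowed categories per dimension, inferred from values visible on this
# page -- treat these sets as assumptions, not the authoritative codebook.
CODEBOOK = {
    "responsibility": {"company", "user", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"regulate", "liability", "ban", "industry_self", "none", "unclear"},
    "emotion": {"outrage", "approval", "mixed", "indifference", "resignation"},
}

def invalid_fields(record: dict) -> list[str]:
    """Return the dimensions whose coded value is missing or not in the codebook."""
    return [dim for dim, allowed in CODEBOOK.items()
            if record.get(dim) not in allowed]

# The coding result shown in the table above:
coded = {"responsibility": "company", "reasoning": "consequentialist",
         "policy": "regulate", "emotion": "mixed"}
print(invalid_fields(coded))  # → []
```

Running this over every record before analysis catches model outputs that drift outside the coding scheme.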
## Raw LLM Response

```json
[
  {"id": "ytc_UgxJrIELUKG8wP4JK-Z4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgxKCcCi3wV5VhAzPRZ4AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugy_LlfOfy2Ka1essQF4AaABAg", "responsibility": "user", "reasoning": "virtue", "policy": "industry_self", "emotion": "approval"},
  {"id": "ytc_UgxCc0ZAz23OB2bWzlF4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "mixed"},
  {"id": "ytc_UgwIlTaFwsb0e7__dn54AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgzdUyEaRI4h9yRRoS94AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_Ugw2oxqgwYw99xAUce14AaABAg", "responsibility": "company", "reasoning": "unclear", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_Ugzn5aYt1C0Jbx-Cbg54AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgyetqZVUPKsA4pAZot4AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_Ugy7JIt_6MbymPi6dwV4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "ban", "emotion": "outrage"}
]
```