Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
If only there was no stack overflow and github and other sources for AI to learn…
ytc_UgwbHSO9L…
I really dont like "ai" it makes me. My friend actually tried to defend it ai ar…
ytc_Ugw1RYrIi…
Automation voice acting makes it much easier to do fun things as well, though, l…
rdc_lgxaocr
The speed of AI development is slower than our own nuclear programs in the 40s. …
ytc_Ugz4_SVCh…
Andrew Yang (2019 Democratic Presidential Candidate) has been sounding the alarm…
ytc_Ugxp-jVF8…
Without purpose there is no action, so we're safe for now. Yet to make true AI w…
ytc_UgzOFBU6w…
It is a fascinating dissonance to hear a 50-year veteran of AI express shock tha…
ytc_UgzncXg6m…
No truck driver, bad move, AI aren't road friendly, don't stop to offer help, do…
ytc_UgxGFlHvL…
Comment
Even if all Western companies started asking for consent to use images in training sets ... China would just steal them anyways, then people would use those better models over the crippled Western ones. There really is no solution here. We're kinda in a post AI world already for better or for worse. I personally think human made art will become even more valued though. Because the utter garbage that is AI art kind of elevates human art. You're seeing it already in games; people will happily accept a game with crap art over one with AI art
youtube · Viral AI Reaction · 2026-01-05T07:0… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_Ugze4yitfNdVPjurpF14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxQgsBZmcODKErFyyx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugy2yzL1Qze40AQ-IG14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwlbSJRs8VWskTQ3oN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugz8bI2xVE53e9YI7VR4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxOuTFQOCLY4C29tO94AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwnhDf4vt59wTe4Ch14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugxg_lrlz7PQG_NBDpR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgySh39NS1L3QJkdV2h4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgwN9WfJxNWpn7beJLh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"}
]
```
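The lookup-by-comment-ID feature described above can be sketched by parsing the raw model response and indexing it by `id`. This is a minimal illustration, not the tool's actual implementation: the field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) come from the JSON shown, but the `index_codings` helper and the key-validation rule are assumptions for this sketch.

```python
import json

# Abbreviated to three rows from the raw LLM response above, for the sketch.
raw = """
[
 {"id":"ytc_Ugze4yitfNdVPjurpF14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
 {"id":"ytc_Ugy2yzL1Qze40AQ-IG14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
 {"id":"ytc_UgwN9WfJxNWpn7beJLh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"}
]
"""

# Keys observed in every row of the response; rows missing any are skipped.
EXPECTED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def index_codings(raw_json: str) -> dict:
    """Parse the raw model output and index codings by comment ID."""
    rows = json.loads(raw_json)
    return {r["id"]: r for r in rows if EXPECTED_KEYS <= r.keys()}

codings = index_codings(raw)
print(codings["ytc_Ugy2yzL1Qze40AQ-IG14AaABAg"]["emotion"])  # -> resignation
```

Indexing by ID makes the "Look up by comment ID" path an O(1) dictionary access, and dropping malformed rows keeps one bad line in the model output from breaking the whole batch.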