Raw LLM Responses
Inspect the exact model output for any coded comment, either by looking it up by comment ID or by browsing the random samples below.
Random samples

- "Wow, incredible work — next, maybe try convincing Siri to confess her sins. This…" (ytc_UgxZVs8Ww…)
- "If you are insecure about "AI art" taking your job, then you were never making a…" (ytc_UgxPIAY5G…)
- "I think it's "funny" how some complain about feeling lonely and for having to go…" (ytc_UgwbyHKG6…)
- "At first you caught me, but then I quickly went to investigate 🙊😂😝, it's the AI,…" (ytr_UgzuwJm86…)
- "They will hire cheap labor in India. This is just a measure to replace one Ameri…" (rdc_ohuudfj)
- "Thank you I have been trying to put this into words. So many other countries wou…" (rdc_fwhfgm2)
- "Yeah you can blame racism all you want but China made a facial recognition AI an…" (ytc_Ugzu7CVKd…)
- "No way should a person be put on hold and have to listen to a robot recording in…" (ytc_UgxdUAV-X…)
Comment
To be fair, AI companies being forced to license copyrighted work to train their models on them IS regulation. It means AI can still exist, and artists can still profit (in fact, that's a built in easy and guaranteed way for artists to make money from their own works too, something that has NEVER existed for artists), no theft and anybody who doesn't consent to having their work train a model has nothing to worry about. If you ask me, Disney setting a precedent that AI companies should have to pay licenses to any copyright holder who's work has been used to train a model and therefore the model is able to create the same copyrighted work (with differences albeit) is arguably one of the best possible outcomes.
youtube · Viral AI Reaction · 2026-01-05T13:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[{"id":"ytc_Ugw6JK_6lCrZtSrIXsh4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
 {"id":"ytc_Ugx8ri5cUDt0DkxJZ9R4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"mixed"},
 {"id":"ytc_UgxFiifaYCcsJXvzkfV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
 {"id":"ytc_UgzDHOSQkvXaPxdweC54AaABAg","responsibility":"company","reasoning":"contractualist","policy":"liability","emotion":"outrage"},
 {"id":"ytc_UgwHHdT-zeG_f-Yh2_F4AaABAg","responsibility":"user","reasoning":"mixed","policy":"none","emotion":"mixed"},
 {"id":"ytc_UgywgF8xTTPP-ZKZUWV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
 {"id":"ytc_Ugzn4Jboi2guOEx-WEt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
 {"id":"ytc_UgxF2Jd9929vzhRwryR4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
 {"id":"ytc_UgzcW94iNslWuAt6pat4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"mixed"},
 {"id":"ytc_UgwzWgjXyUqSAwBplAd4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"regulate","emotion":"approval"}]
```
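The comment-ID lookup described above can be sketched in a few lines: parse the raw response as a JSON array and index the records by their `id` field. This is a minimal illustration, not the tool's actual implementation; the two embedded records are copied from the raw response above (note that the raw output as shown ends with a stray `)` where a `]` belongs, which a real loader would need to repair before parsing).

```python
import json

# Raw LLM coding response: a JSON array of per-comment codes
# (two records copied from the response above, for illustration).
raw_response = """
[{"id": "ytc_Ugw6JK_6lCrZtSrIXsh4AaABAg", "responsibility": "none",
  "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
 {"id": "ytc_UgzDHOSQkvXaPxdweC54AaABAg", "responsibility": "company",
  "reasoning": "contractualist", "policy": "liability", "emotion": "outrage"}]
"""

# Key every record by its comment ID for constant-time lookup.
codes_by_id = {rec["id"]: rec for rec in json.loads(raw_response)}

# Look up one coded comment by its ID.
rec = codes_by_id["ytc_UgzDHOSQkvXaPxdweC54AaABAg"]
print(rec["responsibility"], rec["emotion"])  # company outrage
```

A dict keyed by ID mirrors what the "Look up by comment ID" box does: one parse of the raw response, then direct access to any coded comment.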