Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "This makes so much sense and agrees with my experience as a software developer u…" (ytc_UgzvhFpBS…)
- "If anyone wants to see a possibility of full automation with no jobs for people.…" (ytc_UgwEfVEYR…)
- "Look I think Elon's example for bad AI logic isn't the best, but the point he's …" (ytc_Ugw2KNIeN…)
- "It infuriates me that people defend Ai for art. Companies already use it, client…" (ytc_Ugx49-YK4…)
- "King of global ai types is me, not your no show Elon Musk read em and weep? The …" (ytc_UgwQ57DSp…)
- "Honestly, for me it wasn't much about shady deals than it was being a poor produ…" (rdc_o7w0wjm)
- "When AI can identify the crosswalk in a series of nine bad pictures, it will be …" (ytc_UgxKb30ni…)
- "Can't remember the man's name, but he's called the godfather of A.I. In his inte…" (ytc_Ugz8y7g_d…)
Comment

> I think the big companies like Google should have an "opt in/out" of AI training option before you upload images to the internet that blocks AI models from learning from your work. Or AI blocking mechanisms could be built into images somehow. I don't want AI art to go away but I understand your frustration and wish it was more ethical.

Source: youtube · Viral AI Reaction · posted 2023-02-13T12:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_Ugz18XLetMPvwpG78tx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugz3TCUIFLO_xnRsReh4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyshUszGHXdovKrHox4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzH0M59-D4tv5jg-WR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"approval"},
{"id":"ytc_UgwcocFVoEfK-VMY6fp4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy99jsUOluOXbWcIc14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgwKKgUru94wWq80VB94AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
{"id":"ytc_UgxzpLoDcxJh9aScNOB4AaABAg","responsibility":"user","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyRYHeUqhwm68sPOPd4AaABAg","responsibility":"user","reasoning":"virtue","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgxSRmY3OMoty1PaoaR4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"}
]
```
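The raw response above is a JSON array of per-comment records, one object per coded comment, each carrying the four coding dimensions (responsibility, reasoning, policy, emotion). A minimal sketch of how such a response could be parsed and indexed by comment ID is below; the allowed value sets are inferred purely from the values visible in this sample, not from the tool's actual codebook, so treat them as assumptions.

```python
import json

# Allowed values per coding dimension, inferred from this one sample
# response (assumption: the real codebook may include other values).
ALLOWED = {
    "responsibility": {"company", "user", "ai_itself", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"regulate", "liability", "ban", "industry_self", "none", "unclear"},
    "emotion": {"outrage", "fear", "resignation", "approval", "indifference", "mixed", "unclear"},
}

def parse_raw_response(raw: str) -> dict:
    """Parse a raw LLM coding response and index records by comment ID.

    Raises ValueError if a record is missing a dimension or uses a
    value outside the inferred codebook above.
    """
    coded = {}
    for rec in json.loads(raw):
        cid = rec["id"]
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{cid}: unexpected {dim} value {rec.get(dim)!r}")
        coded[cid] = {dim: rec[dim] for dim in ALLOWED}
    return coded

# Usage: look up the dimensions for one coded comment by its ID.
raw = (
    '[{"id":"ytc_UgzH0M59-D4tv5jg-WR4AaABAg",'
    '"responsibility":"company","reasoning":"consequentialist",'
    '"policy":"liability","emotion":"approval"}]'
)
coded = parse_raw_response(raw)
record = coded["ytc_UgzH0M59-D4tv5jg-WR4AaABAg"]
# record["policy"] is "liability", record["emotion"] is "approval"
```

Indexing by ID mirrors the "Look up by comment ID" workflow shown above: once parsed, each comment's coded dimensions can be retrieved in one dictionary lookup.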