Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "What we need to do is not think about it, allow it to happen. Make the AI robots…" (ytc_UgytXOX3u…)
- "That's the most awful AI voice I've ever heard. Who programmed that? AI generate…" (ytc_UgyAWP9vp…)
- "Well, I'd say it has always been obvious that, when AI would've been invented, s…" (ytc_Ugxpow7uG…)
- "Unlock a world of possibilities for your business by harnessing the potential of…" (ytc_Ugy8AvSso…)
- "I'm confused. If you had an eidetic memory and read numerous copyrighted works …" (ytc_UgyRD668U…)
- "Fools. AI & AGI knows NOTHING about the occult lol. Occult means HIDDEN btw, and…" (ytc_UgwIB0rYL…)
- "In reality, many clients prioritize project completion speed within a limited bu…" (ytc_Ugzkvoh6m…)
- "People forget these ai are impersonating objective perspectives of both parties…" (ytc_Ugw8kN0LM…)
Comment
Social media platforms get 'free oil' on their users' data, so similarly, the AI Companies want the same freedom to train their models on existing creators without paying for it. I doubt even successful court cases will hold out long against these companies' financial and political clout. They'll challenge any reversal of influence and tie up any plaintiffs for decades in courts as their AI Tools achieve such capability that human artists become obsolete, employed to approve of what the machine mind has created, and little else. No expensive outlays to artists then, just Producers feeding in AI prompt words to generate any content they desire almost instantaneously. Against a machine like that, human artists look like the Luddites, English 19th-century textile workers outevolved by technology. The future is grim for any human agency trying to stop this shift in high technology.
Source: youtube | Posted: 2025-04-11T09:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_Ugyfyv0zCp1JTKEtDsR4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgwGoq8W-I43lo3qk-p4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugzmz79eUp95cVju1Bh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwCHAzL9T__2RicPLJ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugz0RfsmTAny8rn3J-t4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgwO99uFNfZV8K1oMOt4AaABAg","responsibility":"user","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxURZ9pzxQ3bdQU1354AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgyZ_sT921sjFUZKNHB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugyz8S7C_hFpjWsMTYx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwWiyVlXsqNSnl0ZBd4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"}
]
```
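The raw response is a JSON array of per-comment coding objects, each carrying the comment ID plus the four coded dimensions shown in the table above. As a minimal sketch of how such output could be parsed and looked up by comment ID (the `index_by_id` helper is a hypothetical illustration, not part of the actual tool; only the field names come from the response above):

```python
import json

# Two entries copied from the raw model output above; the real
# response contains one object per sampled comment.
raw = """[
 {"id":"ytc_Ugyfyv0zCp1JTKEtDsR4AaABAg","responsibility":"company",
  "reasoning":"deontological","policy":"ban","emotion":"outrage"},
 {"id":"ytc_UgwGoq8W-I43lo3qk-p4AaABAg","responsibility":"company",
  "reasoning":"deontological","policy":"liability","emotion":"fear"}
]"""

def index_by_id(raw_json: str) -> dict:
    """Parse model output and build an ID -> coded-dimensions lookup.

    Raises ValueError if the output is not a JSON array or an entry
    lacks an "id" field, so malformed responses fail loudly instead
    of being silently dropped.
    """
    entries = json.loads(raw_json)
    if not isinstance(entries, list):
        raise ValueError("expected a JSON array of coding objects")
    lookup = {}
    for entry in entries:
        comment_id = entry.get("id")
        if comment_id is None:
            raise ValueError(f"entry missing 'id': {entry!r}")
        # Keep every dimension except the ID itself.
        lookup[comment_id] = {k: v for k, v in entry.items() if k != "id"}
    return lookup

codes = index_by_id(raw)
print(codes["ytc_UgwGoq8W-I43lo3qk-p4AaABAg"]["emotion"])  # fear
```

Indexing by ID mirrors the "Look up by comment ID" affordance of the page: given a full comment ID, the coded dimensions for that comment can be retrieved directly.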