Raw LLM Responses
Inspect the exact model output for any coded comment; look a comment up by its ID, or pick one of the random samples below.
Random samples

- “I think the owner of the car should take all impact. This should just be the con…” (ytc_Ugg8HfmGm…)
- “Anyone who has programmed something has worked with Artificial Intelligence, whi…” (ytc_Ugz5rW3cz…)
- “Talking about how AI is gaslighting people and then promoting AI sponsored produ…” (ytc_UgzFrNQta…)
- “These CEOs are issueing "warnings" are doing so to overhype their own products a…” (ytc_UgzFRIZAO…)
- “I’m impressed by how BroadWatt’s vertical data centers use modular tech—scaling …” (ytc_UgwHZbYui…)
- “I have an AI that controls my smart home devices, it plays the roll of Jeeves fr…” (ytc_Ugz4H0e7h…)
- “I remember seeing this video on how two AI was set up with a prompts that they w…” (ytc_Ugw2nnMGu…)
- “Robot enthusiast here / Please don’t associate AI with the inevitable future of r…” (ytc_Ugybz-a2F…)
Comment
14:01 What hes arguing for is a slow death instead of a more sudden death for artists. He said that licensing will take time and slow you down but ultimately it will create as powerful models we have today but they will cost more to build.
That means that you protect the current artists but throws the upcoming artists under the bus since they wont get any licens money while competing with AI models, over time it will be harder and harder for future artists to make any money.
The argument would be more coherent if he was against AI, this talk felt like a lawyer, who represents artists, was talking
youtube · 2025-04-24T08:0… · ♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:53.388235 |
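Each coded comment carries the four dimensions shown in the table, each drawn from a closed set of categories. As a minimal sketch of how a record could be sanity-checked, the allowed values below are only those observed in this page's raw responses (the full codebook may define more), and `validate_record` is a hypothetical helper, not part of the tool:

```python
# Allowed values per dimension, as observed in the raw responses on this
# page. Assumption: the real codebook may contain additional categories.
ALLOWED = {
    "responsibility": {"company", "distributed", "ai_itself", "user", "none"},
    "reasoning": {"deontological", "consequentialist", "mixed"},
    "policy": {"liability", "industry_self", "regulate", "unclear", "none"},
    "emotion": {"outrage", "approval", "fear", "indifference", "mixed", "resignation"},
}

def validate_record(rec):
    """Return (dimension, bad_value) pairs; an empty list means the record is valid."""
    return [(dim, rec.get(dim)) for dim in ALLOWED
            if rec.get(dim) not in ALLOWED[dim]]

# The record matching the Coding Result table above.
rec = {"id": "ytc_UgwFxxJ5n0vaAx8vOK94AaABAg", "responsibility": "company",
       "reasoning": "consequentialist", "policy": "liability", "emotion": "mixed"}
print(validate_record(rec))  # [] -> every dimension holds an allowed value
```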
Raw LLM Response
```json
[
  {"id":"ytc_UgykwUlT5zj0Hm9vAp94AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgxqS7gztEwRC1DBmmp4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_Ugy0ryeza2oEFqlMcBJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwNm16tE9PETV1TGJp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugxhnfn_asJGkv6O8x54AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugzjt5tPW2GS0BqOR8Z4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwFxxJ5n0vaAx8vOK94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"mixed"},
  {"id":"ytc_UgxiHerhyi3lFqHrU3R4AaABAg","responsibility":"company","reasoning":"mixed","policy":"liability","emotion":"mixed"},
  {"id":"ytc_Ugz0HPUe4zx874lqnJl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzxJy5LbFo9tL_oqbZ4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"mixed"}
]
```
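The raw response is a JSON array with one record per comment, keyed by `id`. A look-up-by-ID like the one this page offers can be sketched as follows; this is a minimal illustration, the record shape is taken from the raw response above, but `lookup_code` is a hypothetical helper, not the tool's actual implementation:

```python
import json

# Two records copied from the raw response above; in practice this string
# would be the full model output.
raw_response = '''
[
  {"id": "ytc_UgwFxxJ5n0vaAx8vOK94AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "mixed"},
  {"id": "ytc_Ugz0HPUe4zx874lqnJl4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"}
]
'''

def lookup_code(records, comment_id):
    """Return the coding record for one comment ID, or None if absent."""
    by_id = {rec["id"]: rec for rec in records}
    return by_id.get(comment_id)

records = json.loads(raw_response)
code = lookup_code(records, "ytc_UgwFxxJ5n0vaAx8vOK94AaABAg")
print(code["policy"], code["emotion"])  # liability mixed
```

The ID used in the example is the record whose codes appear in the Coding Result table above.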