Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "I believe humans are not suited to share their existence with AI. We have our ow…" (ytc_Ugw9TXVwj…)
- "And before you know it no more human workers that's how you get put out of a job…" (ytc_Ugx5YVnaT…)
- "I don't use AI or cloud storage. I encourage those who care about our earth to d…" (ytc_Ugy_-0C9L…)
- "When your passwords make it to slack channels and then your stolen tech reverts …" (ytc_UgxgChD8N…)
- "If moral obligations governed anything in this contemporary society we would wak…" (ytc_UgxbBuxOv…)
- "Art and music should be left to humans and not AI. It's what makes us navigate t…" (ytc_Ugxd3QY59…)
- "The only protection fr9m the rich el8m8nating all employment is to set up laws n…" (ytc_UgyX2ybOK…)
- "AI is not a scam, it's just that we don't understand how consciousness works yet…" (ytc_UgzcHwIHl…)
Comment
A question that is even more relevant to me in terms of AI and copyright is the following thought: if AI is authorised by default (a new law in the UK) to learn from our creative musical content, including our lyrics and our voice, and I have to opt out to withhold that authorisation, then what happens if anyone else puts my music on the net (as copyright infringement)? Sure, the artists could ask to take it down if they see that it somehow bothers their marketing strategies. But what if the copyright infringer didn't opt out of that AI authorisation? That would mean AI would already use the content and be allowed to make a copy of it, in any of the variations we know AI is capable of, and yet legally!
Source: youtube · Posted: 2025-02-09T13:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | contractualist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_UgxtLcsdYsZA3ov3eZF4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzK2TtkrDfSiRtMA-Z4AaABAg","responsibility":"user","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxxE3ILyWeLPiVYzw14AaABAg","responsibility":"government","reasoning":"contractualist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugxyfffj49j4WP7iJ054AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_UgyD3B8pRTq5GRBxddJ4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugx53304JXboWRa-6bp4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugyj7hjGh1DxshbIUr14AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_Ugy3n8d0h0Zt03taASt4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugy_1yPQ1NgkCMAbO_Z4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_Ugxzev5nClotmMvF5S94AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"outrage"}
]
```
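The comment-ID lookup above can be sketched in a few lines: parse the model's JSON array, index entries by `id`, and read off the four coded dimensions. This is a minimal illustration, not the tool's actual implementation; the helper name `index_codes` is hypothetical, and the raw response is truncated here to three of the ten entries shown above.

```python
import json

# Three entries from the raw LLM response shown above (truncated for brevity).
raw_response = """
[
{"id":"ytc_UgxtLcsdYsZA3ov3eZF4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxxE3ILyWeLPiVYzw14AaABAg","responsibility":"government","reasoning":"contractualist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugy3n8d0h0Zt03taASt4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"}
]
"""

# The four coding dimensions used in the result table above.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codes(raw: str) -> dict:
    """Parse the model's JSON array and index the coded dimensions by comment ID."""
    codes = {}
    for entry in json.loads(raw):
        # Skip malformed entries rather than failing the whole batch.
        if "id" not in entry or not all(dim in entry for dim in DIMENSIONS):
            continue
        codes[entry["id"]] = {dim: entry[dim] for dim in DIMENSIONS}
    return codes

codes = index_codes(raw_response)
print(codes["ytc_UgxxE3ILyWeLPiVYzw14AaABAg"]["policy"])  # regulate
```

Looking up `ytc_UgxxE3ILyWeLPiVYzw14AaABAg` reproduces the Coding Result table above (government / contractualist / regulate / fear), which is how the dashboard ties a coded row back to the exact model output.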