Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:
- "AI should be used to fulfill humans shouldn’t or don’t want to do. Factory work.…" (ytc_UgzvLZaFh…)
- "Self-driving technology has to be either perfect or banned. Having a bad driver …" (ytc_UgzvT7QXB…)
- "All these people are not taking about AI at all. They are using it as a catch ph…" (ytc_Ugxnp-_gH…)
- "Just wait til people start to see Jesus in ChatGPT. Make it a religious thing…" (rdc_mlhlj2q)
- "I think vw do better when u dont put ur hand on the steering while it beeping it…" (ytc_UgzEL6ITK…)
- "Unless there's a way to stop the others that don't comply. An AI to slow AI? How…" (ytr_UgzNYd6Sc…)
- "the only way I would count them as artists is if they did everything from making…" (ytr_UgwYpwY4Y…)
- "I don‘t like AI. The first way is to tie AI to the cross with nails and bake it …" (ytc_UgxXjaj4S…)
Comment
No it's not fair use. For one, OpenAI is allowing their product, which is trained on the data, to be sold to people for a cost. Doesn't matter which way you put it, but open ai needs to come to an agreement with everything that has been scraped for the training data. They are using the data they scraped to sell back to companies for these "prompts." They use their system to come up with computer generated answers based on the training data.
youtube · AI Responsibility · 2026-04-11T20:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[{"id":"ytc_UgzHTelq0s2qpaCqHBx4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgwDwSTBUCwkOx1E2CZ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"indifference"},
{"id":"ytc_UgxcPlZ1GZSnR73ccMx4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"regulate","emotion":"indifference"},
{"id":"ytc_UgzRVNaXCC3N5fTSh9F4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgziyyxqyuladkQvZg94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwX38lg2ohoVfTNf3F4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzQt4Or285ozy6CWvd4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwmPaSxht3cKh_MJCh4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_UgwGNcuvLutoh24xvTh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"approval"},
{"id":"ytc_Ugzbh6dVdW5wgoaQAA94AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"}]
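The raw response above is a single JSON array with one object per coded comment, which is how a batch of codings gets matched back to individual comments by ID. A minimal sketch of parsing and validating such a batch, assuming the allowed values seen in the table and responses above form the full coding scheme (the `parse_batch` helper is hypothetical, not part of the tool):

```python
import json

# Allowed values per dimension, as observed in the coding table and raw
# responses above (assumed complete; the real scheme may have more values).
ALLOWED = {
    "responsibility": {"company", "distributed", "ai_itself", "none"},
    "reasoning": {"deontological", "consequentialist", "contractualist", "unclear"},
    "policy": {"regulate", "liability", "ban", "industry_self", "none"},
    "emotion": {"approval", "indifference", "outrage"},
}

def parse_batch(raw: str) -> dict:
    """Parse a raw LLM batch response into {comment_id: coding},
    dropping entries with a missing ID or out-of-scheme values."""
    coded = {}
    for entry in json.loads(raw):
        cid = entry.get("id")
        if not cid:
            continue  # no comment ID: cannot be matched back to a comment
        coding = {dim: entry.get(dim) for dim in ALLOWED}
        if all(coding[dim] in ALLOWED[dim] for dim in ALLOWED):
            coded[cid] = coding
    return coded

raw = ('[{"id":"ytc_UgzHTelq0s2qpaCqHBx4AaABAg","responsibility":"distributed",'
       '"reasoning":"deontological","policy":"regulate","emotion":"approval"}]')
print(parse_batch(raw)["ytc_UgzHTelq0s2qpaCqHBx4AaABAg"]["policy"])  # regulate
```

Validating against a closed vocabulary before storing is what makes a later "look up by comment ID" safe: every stored coding is guaranteed to carry exactly the four dimensions with in-scheme values.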