Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- "I do wonder how “performing” with AI will work. I find that a lot of what makes …" (ytr_UgxafUkGJ…)
- "Ubi while great in theory is a temporary solution as robots and a.i. dont pay ta…" (ytc_Ugw8qkj2P…)
- "All videos on youtube teaching from construction to cleaning, you are actually t…" (ytc_UgyKqa3e4…)
- "Glad you enjoyed that part of the video! Sophia's sense of humor definitely adds…" (ytr_UgxSCa8Jj…)
- "I find it funny that people complain about ai while breathing in smog. They worr…" (ytc_UgznZB_az…)
- "All I hear is blablabla........make billions of dollars....blablabla....investor…" (ytc_UgwIdCJUO…)
- ""Hmm I wonder why this corporate news agency is talking about real issues for on…" (ytc_UgweJjoxx…)
- "Looks like predictive policing is just code for domestic terrorism this reminded…" (ytc_UgwzNiXjr…)
Comment

> I love your channel. I'd like to ask you if you could help review what constitutes "legal" when sharing information to models for training.
> I don't usually read terms and conditions (🙈) but when sharing words of art, like books, or paintings with the creators of the AI, does that equal giving them rights? The model in the end uses the data we give to learn. But what if we don't own the data, or the AI creators don't own it. How does that work?
> Thanks for your videos 🙏🏽

youtube · AI Responsibility · 2023-10-09T12:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | contractualist |
| Policy | liability |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
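A coded record like the one in the table above can be modeled as a small typed structure. This is a minimal sketch, not the dashboard's actual implementation; the class name, field names, and the example values (taken from the matching entry in the raw response below) are illustrative assumptions.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class CodedComment:
    """One comment coded along the four dimensions shown in the table."""
    comment_id: str
    responsibility: str  # e.g. "company", "user", "ai_itself", "none"
    reasoning: str       # e.g. "contractualist", "consequentialist", "deontological"
    policy: str          # e.g. "liability", "regulate", "ban", "none"
    emotion: str         # e.g. "mixed", "outrage", "fear"

# Values copied from one entry of the raw LLM response on this page.
record = CodedComment(
    comment_id="ytc_UgwZkyu7CcQh4tiC10J4AaABAg",
    responsibility="company",
    reasoning="contractualist",
    policy="liability",
    emotion="mixed",
)
```

Freezing the dataclass keeps coded records immutable once they come out of the pipeline, which makes them safe to cache and index.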
Raw LLM Response
```json
[
  {"id":"ytc_Ugy6UbF7-QfYlToQ4rB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzxcpxG_oRt7PUVtUN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugz_doDTa-Qd_15ndO94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugw_rMOJaR-21jtY1714AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugy5nuFD8GqsUh4HhKx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgyMIoBceeG-wEiObPh4AaABAg","responsibility":"user","reasoning":"virtue","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ugy05jhdxuIK6wix65V4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"disapproval"},
  {"id":"ytc_UgwZkyu7CcQh4tiC10J4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"liability","emotion":"mixed"},
  {"id":"ytc_UgxaASXT_2gwA6BDdyR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugwe8d5QWYEv2sX8-_J4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
```
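The "look up by comment ID" feature above presumably works by indexing batch responses like this one by their `id` field. A minimal sketch, assuming the response is stored as a JSON array (the two entries below are copied from the batch above; the variable names are illustrative):

```python
import json

# Two entries copied verbatim from the raw LLM response shown on this page.
raw = '''[
  {"id":"ytc_UgwZkyu7CcQh4tiC10J4AaABAg","responsibility":"company",
   "reasoning":"contractualist","policy":"liability","emotion":"mixed"},
  {"id":"ytc_UgxaASXT_2gwA6BDdyR4AaABAg","responsibility":"company",
   "reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]'''

records = json.loads(raw)
# Build an id -> record index so any coded comment can be fetched in O(1).
by_id = {r["id"]: r for r in records}

print(by_id["ytc_UgwZkyu7CcQh4tiC10J4AaABAg"]["policy"])  # → liability
```

If duplicate IDs can appear across batches, the last write wins in this dict comprehension; a real pipeline would want to detect and log such collisions instead.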