Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:

- "But I've been making left turns followed by right turns without an AI computer a…" (ytc_Ugzbf_sYb…)
- "I bet there will be protests in the future to stop ai art because if deprives pe…" (ytc_UgzYxGS8M…)
- "It happened in San Francisco, open AI pretty much owns San Fran. I want justice …" (ytr_Ugxd3sBZj…)
- "Tesla optimus walks like a caveman 😂. Xpeng robot walks more natural and good bo…" (ytc_Ugz8SyOTX…)
- "If AI doesn't want to control us, why does it keep portraying us like puppets?…" (ytc_UgyUdHE5R…)
- "I can understand your concern! The interaction in the video highlights a crucial…" (ytr_Ugw7S5BwY…)
- "If you think dragging pedestrians underneath the car is just a robotaxi thing, y…" (ytc_UgwGNgUym…)
- "It was a brilliant interview. Unfortunately, she missed the last question about …" (ytc_UgzEVvVAR…)
Comment
The people who post their art consent to inspiration. If they do not consent you will see a written note in their description or caption which says something like do not use for inspiration, in which case you can make art inspired by their art but you can't post it online or sell it. Some allow it but require you to credit them for inspo when you post it. Others don't care. In the same way, artists also want a say about whether AI can use their art or not. If people who shared art publicly knew it could be scrapped by AI they would either specify that their work must not be used to train AI or not post it at all (at least not without an AI poison filter.) If art theft didn't exist no one would use watermarks before sharing their work online. How were they supposed to know that AI was coming to steal their art. If they knew they would have at least used the poison filters. Which also doesn't justify the theft because a watermark is just a means of proving the work is yours.
youtube · AI Responsibility · 2026-03-22T18:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | contractualist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugz7kc2faXNd6mlrTfF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ugxxd4N_ucGBex1EBuR4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyA8wvA4OdI9ekfedt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgyV6t6AwBsXnTKhIyl4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgyjSYfTeFHAspSlBdl4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwzUP2__1l_foLNddF4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxdPmzJ4kaOLUAzaol4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyNkdI1JnM1vhkVXzh4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxsG8cNi9eFLJZCbbt4AaABAg","responsibility":"none","reasoning":"contractualist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxXziK4VAANPuFYcT54AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"}
]
```
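A raw response like the one above can be parsed and indexed by comment ID before the per-dimension values are written into a coding result. The sketch below shows one way to do that, assuming the response is a JSON array of flat objects; the `ALLOWED` sets are inferred only from the values visible in this sample and the full codebook may contain more categories.

```python
import json

# Allowed values per dimension, inferred from the coded rows above
# (assumption: the real codebook likely defines more categories).
ALLOWED = {
    "responsibility": {"none", "ai_itself", "user"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "mixed", "unclear"},
    "policy": {"none", "ban"},
    "emotion": {"outrage", "indifference", "mixed", "resignation", "fear", "approval"},
}

def index_by_id(response_text: str) -> dict:
    """Parse a raw LLM response and index coded rows by comment ID,
    dropping any row whose values fall outside the codebook."""
    rows = json.loads(response_text)
    coded = {}
    for row in rows:
        if all(row.get(dim) in vals for dim, vals in ALLOWED.items()):
            coded[row["id"]] = row
    return coded

# Two rows copied from the sample response above.
raw = """[
  {"id":"ytc_Ugz7kc2faXNd6mlrTfF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgxsG8cNi9eFLJZCbbt4AaABAg","responsibility":"none","reasoning":"contractualist","policy":"none","emotion":"indifference"}
]"""

coded = index_by_id(raw)
print(coded["ytc_UgxsG8cNi9eFLJZCbbt4AaABAg"]["reasoning"])  # contractualist
```

Validating against a closed value set at parse time catches the common failure mode where the model invents an off-codebook label, so bad rows never reach the coding table.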