Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (click to inspect):

- Generative AI is like getting robots to play basketball. You can do it and it mi… (ytc_UgyKZRSKb…)
- Ummm the AI bros do know their rant of her being "arrogant" and "her art isn't s… (ytc_Ugzo7PVX9…)
- Robots and Ai will replace a lot of jobs. It’s inevitable. Andrew Yang tried to … (ytc_Ugxa6Ug8R…)
- Let’s stop purchasing technology. Technology companies have thrived because of o… (ytc_Ugw8IPbVH…)
- Regulation requiring a blatantly identifiable watermark on AI generated content,… (ytc_UgwMcy_CX…)
- Omg the world is going to be totally swamped with AI written trash books isn't i… (rdc_myjhxec)
- Steven Hawking's brilliance at his expertise didn't exactly translate into brill… (rdc_jifbqbw)
- Without proof although I wish I had it. I can tell you if they have access to yo… (ytc_UgwXaSZUi…)
Comment
For Artists it should be a choice of "Opting IN" NOT "Opting OUT" as in. If the artist chooses to allow their work to be assimilated by AI they can choose to do that ie. "Opt In". Not "OPTING OUT" meaning it's currently possible & even likely that when an artist uploads their work or creates an account they might forget or miss seeing the button to refuse AI database inclusion which is what is currently being used by several platforms I've seen. As an artist generally I know we are excited & nervous to share our work with the world but having regret & anxiety over accidentally feeding the AI machine shouldn't have to be part of that unless purposefully chosen by the artist.
youtube · AI Responsibility · 2023-12-17T00:2… · ♥ 284
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:53.388235 |
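Each coded record takes its values from a small closed set per dimension. As a minimal sketch, the check below validates a record against the value sets observed in the raw responses on this page; the exact codebook is an assumption and may include values not seen here.

```python
# Assumed codebook, reconstructed from the values visible in the raw
# responses on this page; extend the sets if the real codebook is larger.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "company", "developer",
                       "government", "unclear"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none", "liability", "regulate", "ban", "unclear"},
    "emotion": {"approval", "outrage", "indifference", "fear",
                "resignation", "mixed"},
}

def validate(record: dict) -> list[str]:
    """Return the dimensions whose value is missing or outside the codebook."""
    return [dim for dim, allowed in ALLOWED.items()
            if record.get(dim) not in allowed]

# The coding result shown in the table above:
example = {"responsibility": "company", "reasoning": "deontological",
           "policy": "regulate", "emotion": "approval"}
print(validate(example))  # → []
```

A record with a misspelled or missing dimension shows up immediately in the returned list, which makes this a cheap sanity check before writing codings to storage.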
Raw LLM Response
```json
[
  {"id":"ytc_Ugzt5-UvSyO5E0HyFUp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxMLkvlge943hVPVix4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzjkUg_Y3Chs3DEwid4AaABAg","responsibility":"company","reasoning":"mixed","policy":"liability","emotion":"indifference"},
  {"id":"ytc_Ugxu9qPBKqXvuaYFxXB4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxbYoUaQWDVM0InxMN4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"approval"},
  {"id":"ytc_Ugzb7kdmG9qtdsun4up4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyNVccEFRafUNKo-Tl4AaABAg","responsibility":"company","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_Ugz_tXyktcM3xPjc0jV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugyw01zZjs1NmPJ85Wl4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgwUZECe-XRg0OP5enp4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"}
]
```
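The raw response is a JSON array with one coding object per comment, each carrying an `id`. A minimal sketch of the comment-ID lookup this page performs, assuming that array shape (the two records here are copied from the response above):

```python
import json

# Two records from the raw LLM response shown above, used as sample input.
raw = '''[
  {"id":"ytc_Ugzt5-UvSyO5E0HyFUp4AaABAg","responsibility":"none",
   "reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxbYoUaQWDVM0InxMN4AaABAg","responsibility":"company",
   "reasoning":"deontological","policy":"regulate","emotion":"approval"}
]'''

# Index the batch by comment ID so each lookup is a single dict access.
by_id = {rec["id"]: rec for rec in json.loads(raw)}

rec = by_id["ytc_UgxbYoUaQWDVM0InxMN4AaABAg"]
print(rec["policy"])  # → regulate
```

Building the index once per batch keeps repeated inspections cheap, and a missing ID surfaces as a `KeyError` rather than a silent empty result.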