Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "@olivercharles2930 1. using a reference and taking this reference (such as trac…" (ytr_UgzMhqKdY…)
- "After listening to many of the podcasts on AI safety / existential risk I think …" (ytc_Ugy5pVyQG…)
- "There is a small percentage of humans who have understood and talked about AI ( …" (ytc_UgzmoNDDv…)
- "Not anymore! Now the brilliant ethics nerss at OpenAI have Dall-E 3 set to popul…" (ytc_UgwXWlw96…)
- "AI is already schmurding people and no one is talking about it that way. This wa…" (ytc_Ugzz0Jhhb…)
- "One of my friends used an AI to make a Studio Ghibli-esque rendition of a group …" (ytc_UgwNutAqQ…)
- "Whoever authored this clip is talking complete nonsense when at the end its clai…" (ytc_UgzgzSqet…)
- "This is what’s being missed , the IR6 contract relevant here is AWS/Palantir clo…" (rdc_o8168xb)
Comment
The reason they dont have an opt in system is because 99% of people wouldnt opt in and then they wouldnt have the AI Art. The only feasible and ethical business model would be to offer money to the artists per instance of the AI using the work and then having some kind of subscription service for the consumer, kinda the same boat as apple music or spotify to continue the comparison between audio and visual art.
youtube · Viral AI Reaction · 2023-01-02T22:0… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | contractualist |
| Policy | liability |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_UgzMmxOTSWHh_AbybPx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugxao4Tpgon3BwzxnyZ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyKirazB4FLM4qabHd4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"liability","emotion":"approval"},
{"id":"ytc_Ugzgp6lbZTzvYyzP0eF4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyDFENbi0UNHELQRU54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxS6Is_S7X8gbzS7Pp4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyVEoaQKNOqf0kCCYp4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugz27OZIXPovKQOlLZ54AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgykolYl50Bl5tpyPRF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgyzVYTzW0EXoSyHRNJ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"unclear","emotion":"fear"}
]
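The lookup-by-comment-ID view above can be sketched in a few lines: parse the raw model response (a JSON array of per-comment codings with the fields shown: `id`, `responsibility`, `reasoning`, `policy`, `emotion`) and index it by `id` for constant-time lookup. This is a minimal illustration, not the dashboard's actual implementation; `index_codings` and the two inlined sample records are hypothetical, with field values copied from the response above.

```python
import json

# Two sample records, copied from the raw response shown above
# (truncated to two entries for brevity).
raw_response = '''[
  {"id": "ytc_UgzMmxOTSWHh_AbybPx4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgyKirazB4FLM4qabHd4AaABAg", "responsibility": "company",
   "reasoning": "contractualist", "policy": "liability", "emotion": "approval"}
]'''

def index_codings(raw: str) -> dict:
    """Parse a raw coding response and map comment ID -> coding dict."""
    return {item["id"]: item for item in json.loads(raw)}

codings = index_codings(raw_response)
coding = codings["ytc_UgyKirazB4FLM4qabHd4AaABAg"]
print(coding["responsibility"], coding["emotion"])  # company approval
```

With an index like this, rendering the "Coding Result" table for any sampled comment is a single dictionary lookup rather than a scan of the raw response.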