Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- `ytc_Ugz5Bh0Oc…`: "with the way everything is going, getting a ai like the thunderhead from the Scy…"
- `ytc_UgxcbI6QJ…`: "DARPA had AI in the 70s. The AI we have now is meant for the control of us…"
- `rdc_oh1kvda`: "Ultimately none of our opinions matter with regards to this topic. We aren't t…"
- `ytc_Ugzco5CJF…`: "These posts go hard but will y'all just let ai exist, this guy literally said it…"
- `rdc_d0fufzb`: "We have got to get over this capitalism and competition bullshit. The vast majo…"
- `ytc_UgzGztfQz…`: "AI will be ok if we never let it control anything. Suggest data only. Don’t let …"
- `ytc_UgxdvRcDa…`: "The only kind of AI I really use is something similar to Character AI, it's just…"
- `ytr_Ugxd7T8Ih…`: "It seems like you might have been trying to make a joke! Sophia's perspective on…"
Comment
Thank you for this great talk, Ted
Please forgive my tearful academism.
I agree that artists should be asked before their work is used. But perhaps the real issue is older — and deeper.
For more than a century, industry has fed us mass-reproduced art, often borrowing from everywhere, including times when copyright didn’t yet exist. In doing so, it has slowly replaced local, personal, everyday acts of creation with passive consumption.
AI might continue this — or, strangely, help undo it.
I dream of a time when people, with more free time, will create again — not for fame, not for money, but for the simple joy of making.
That, to me, would be a quiet and real renaissance.
Source: youtube | Posted: 2025-04-13T11:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | deontological |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[{"id":"ytc_UgxvcHb4Vde4mMxiNSR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
 {"id":"ytc_UgzqjKHaYtMFOlrGo7B4AaABAg","responsibility":"industry","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
 {"id":"ytc_UgysmYUJoFDbz1LZSmV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
 {"id":"ytc_UgyjQiiv6_h6Hn9LIDZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
 {"id":"ytc_UgwpFm3GMA8AtmYRVHB4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
 {"id":"ytc_UgyLrmmEzx0GmwhSa3t4AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
 {"id":"ytc_UgwLXYtOuaxvub-J_Yt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
 {"id":"ytc_UgzEqw3Wd37fn6t9YH94AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
 {"id":"ytc_UgxBFdAoVjQrY11TJJx4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"mixed"},
 {"id":"ytc_Ugwg8CJfojk22QSfhR14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"}]
```
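The lookup-by-comment-ID view above can be sketched in code: the model returns a JSON array of per-comment codes, which is parsed and indexed by `id` so any single comment's dimensions can be retrieved directly. This is a minimal illustrative sketch, not the tool's actual implementation; the function and variable names are hypothetical, though the field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) and the sample IDs come from the raw response shown above.

```python
import json

# Two records copied from the raw LLM response above, used as sample input.
raw_response = '''[
  {"id": "ytc_UgxvcHb4Vde4mMxiNSR4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgzqjKHaYtMFOlrGo7B4AaABAg", "responsibility": "industry",
   "reasoning": "deontological", "policy": "unclear", "emotion": "mixed"}
]'''

# The four coding dimensions shown in the Coding Result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codes(raw: str) -> dict:
    """Parse a raw model response and map comment ID -> coded dimensions."""
    records = json.loads(raw)
    indexed = {}
    for rec in records:
        if "id" not in rec:  # skip malformed rows defensively
            continue
        # Missing dimensions default to "unclear", matching the table's convention.
        indexed[rec["id"]] = {dim: rec.get(dim, "unclear") for dim in DIMENSIONS}
    return indexed

codes = index_codes(raw_response)
print(codes["ytc_UgzqjKHaYtMFOlrGo7B4AaABAg"]["reasoning"])  # deontological
```

Indexing once and looking up by ID afterwards keeps inspection O(1) per comment, which matters when a batch response codes many comments at a time.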