Raw LLM Responses
Inspect the exact model output for any coded comment. Look up a comment by its ID, or pick one of the random samples below.
Random samples

- "Jo, the AI looks out of it's window and sees two man fighting over an edible roo…" (ytc_UgzAErZfX…)
- "Im the innocent one. The chatbots have broken the filter multiple times, and idk…" (ytc_UgywTJlKq…)
- "do you want a robot uprising, cause this is how you get a robot uprising…" (ytc_Ugz9NyFkn…)
- "We humans throw EVERY SINGLE YEAR MLLIONS OF TONS OF PLASTIC in the oceans, i th…" (ytc_Ugxnk8-UZ…)
- "LLMs do not have: a stable inner agent, a consistent set of beliefs, goals th…" (ytc_UgyarHndj…)
- "Let me get this straight. First I make the monster and now I am here telling eve…" (ytc_UgyNuk0-J…)
- "Already getting frustrated from ad breaks too often. The better podcasts allow m…" (ytc_Ugyda1qoQ…)
- "Sorry, but this was too extreme to take seriously. He lost me with his simulati…" (ytc_UgykP3n9t…)
Comment
AI is for doing tedious tasks, like being trained to look for cancer cells. That's great!
But why would anyone want to remove the human component from creative work, that I don't get.
"You just want to gatekeep art from people without talent!?"
No, what the hell is stopping anyone from just picking up a pen and putting in the work? Talent is pretty much a myth, don't be lazy.
(And imo, the creation of a piece is just as much part of the art as the finished work.)
Lastly, yes. The way these developers train their crap profits of the work of A LOT of artists with no consent or compensation. It is theft. Pure and simple.
Sorry to be such a bummer in the comments, love your videos 😁
youtube · AI Responsibility · 2024-07-28T04:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | virtue |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgypcpSKipPCVMZ9pMd4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"indifference"},
{"id":"ytc_UgycOiazRa7tdrPuH1J4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugytbrcjr5FFTU480Y94AaABAg","responsibility":"user","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugzepeog7PUpgY0OwRp4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugxy5SQNOxYeLJsoMWB4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_Ugzue64WC_GNNtfZ-KJ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugwf-8gUB83KLqNBD_J4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwEVkD5vsOPjy00ikR4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_Ugx4c_uQsIKuIlFDMYN4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugw3QRWRoUUzVrTQYNJ4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"indifference"}
]
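The raw response above is a JSON array with one object per coded comment, each carrying the four coding dimensions from the table. Below is a minimal sketch of how such output could be checked against the coding scheme before it is accepted. The allowed value sets are inferred only from the records visible on this page; the project's actual codebook may define additional categories, and `find_invalid_codes` is a hypothetical helper, not part of any shown pipeline.

```python
import json

# Allowed codes per dimension, inferred from the records shown on this page.
# Assumption: the real codebook may contain more categories than these.
ALLOWED = {
    "responsibility": {"user", "ai_itself", "distributed", "none"},
    "reasoning": {"virtue", "deontological", "consequentialist"},
    "policy": {"none", "ban", "liability", "industry_self"},
    "emotion": {"indifference", "outrage", "approval", "fear", "resignation"},
}

def find_invalid_codes(raw_response: str):
    """Parse a raw LLM response (a JSON array of coded comments) and return
    (comment id, dimension, value) triples that fall outside ALLOWED."""
    problems = []
    for row in json.loads(raw_response):
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                problems.append((row.get("id"), dim, row.get(dim)))
    return problems

raw = ('[{"id":"ytc_example","responsibility":"user","reasoning":"virtue",'
       '"policy":"none","emotion":"indifference"}]')
print(find_invalid_codes(raw))  # → []
```

A check like this catches the common failure mode where the model invents an out-of-vocabulary label (or misspells one), so malformed rows can be re-queued for re-coding instead of silently entering the dataset.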