Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "Honestly as long as AI is using real art without informed consent, I don't even …" (ytc_UgwHhL_u_…)
- "The hypothetical scenarios where we keep getting headlines like "AI does X" as F…" (ytc_Ugz7FWYfb…)
- "Assuming that the AI kind of recreates something it had been trained on, chances…" (ytc_UgxXz_8Zy…)
- "If AI was being developed by only the most trustworthy, noble people, with the b…" (ytr_UgySnjkGV…)
- "As a digital AND traditional artist, they both take pretty much the same amount …" (ytc_Ugw5Gbu8O…)
- "I never even thought of that when thanking the AI 😅 I write politely out of hab…" (ytc_UgzUU44KN…)
- "@whilpin…you just compared an AI “artist” to someone COMMISSIONING art.…" (ytr_Ugx8fCyMf…)
- "If AI gets to the top of the stairs and wonders why it's there, then we're all i…" (ytc_UgwIYGh6-…)
Comment
The term 'AI' is simply a buzzword.
If we are to ask ourselves the question in a different way we might ask
"Do you think 'technology' will supersede humans?" or "Have we lost control of 'technology'?"
The answer is of course that we already have.
People are addicted to mobile phones, TV, gaming, tech developed synthetic drugs, data consuming etc etc.
Humans lost control over technology long ago and are paying the price today with poor health, mental health, poor living conditions and habits, terrible resource management leading to extreme environmental degradation.
You think AI may have taken over... How much time, how much of your life, do you freely donate to the modern algorithmic dance that challenges our survival on this beautiful, living, world?
Source: youtube · 2025-11-05T20:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | mixed |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgwxaJZEm3juoZaNupV4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzK7qsLJb6kzjQWrrl4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzOeiD0a0EEq7ar6W14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx35nPulgDLVdrX8fd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgyijxX2cSWZBN6wDcF4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgzZONlSC3kMOhH381V4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyX1tEn9VYp9PgfGzp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzI6Uxz8ri1be0xYhl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzZXdKNeRQp3M7QDvl4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwhM1fYuARgHUmIVBd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}
]
```
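The raw response above is a JSON array in which each element carries a comment `id` plus the four coding dimensions (responsibility, reasoning, policy, emotion). A minimal sketch of how a lookup by comment ID could work, assuming the raw response parses as standard JSON (the two-element `raw_response` string here is a shortened stand-in for a full batch):

```python
import json

# Shortened stand-in for a full raw batch response (real batches hold ~10 rows).
raw_response = """
[
  {"id": "ytc_UgzZXdKNeRQp3M7QDvl4AaABAg", "responsibility": "distributed",
   "reasoning": "mixed", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgwxaJZEm3juoZaNupV4AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "none", "emotion": "resignation"}
]
"""

# Index the batch by comment ID for O(1) lookup.
codings = {row["id"]: row for row in json.loads(raw_response)}

# Fetch the coding for one comment and read off its dimensions.
coding = codings["ytc_UgzZXdKNeRQp3M7QDvl4AaABAg"]
print(coding["responsibility"])  # distributed
print(coding["emotion"])         # resignation
```

In a real pipeline the same indexing step would also be the natural place to check that every comment ID sent in the batch came back coded, since LLMs occasionally drop or duplicate rows.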