Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- Wait until he realizes that AI had to learn for years to make “good” “art”… (ytc_UgwYO_cuK…)
- Me too lol, my dad’s like “what if you’re not good at art and you wanna make som… (ytr_UgxtFhUCI…)
- …private equity are using the tools of gentrification and overzealous capital to… (ytc_Ugwff0q6I…)
- Watch this 10 minute documentary on YouTube called “humans need not apply” it’s … (rdc_glihr0q)
- It is an abomination. You don't need a replicant, to have AI helping to human en… (ytc_UgxnitwC6…)
- What happened you is horrible the the ai community is simply embarrassing, so un… (ytc_UgziHjVnL…)
- Okay, now this is getting outta hand, they're taking the jobs of millions now th… (ytc_UgwoI4f89…)
- Man i don't like unethical ai art, even more so ai "artists" i get it if it's fo… (ytc_Ugy7q8vuV…)
Comment
Listening to the video it sounds like A.I is fantastic, but it can still suffer from a very human flaw. Once there's a mistake, a deviation then that mistake can be if not inevitable lead to increased and bigger mistakes. It's not perfect. So humans with appropriate knowledge and wisdom of the subject matter need to be looking over whatever the AI was instructed to do to make sure unintended consequences don't get inserted into the process and end results. While A.I has awesome and fantastic benefits it has the potential and already proven to have disastrous consequences as well. People still have to be apart of these processes.
Platform: youtube · Topic: AI Jobs · Published: 2026-01-10T18:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
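A coding result like the table above amounts to a mapping from a comment ID to one value per analytic dimension, which supports the "look up by comment ID" view. A minimal sketch of that structure, assuming the field names mirror the table (the full comment ID here is illustrative, not a real one from the dataset):

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class CodingResult:
    """One coded comment: one value per analytic dimension."""
    comment_id: str
    responsibility: str
    reasoning: str
    policy: str
    emotion: str
    coded_at: str


# Values mirror the result table above; the comment ID is a placeholder.
result = CodingResult(
    comment_id="ytc_example",
    responsibility="user",
    reasoning="consequentialist",
    policy="liability",
    emotion="fear",
    coded_at="2026-04-27T06:24:59.937377",
)

# "Look up by comment ID" is then a dictionary keyed on the ID.
index = {result.comment_id: result}
print(index["ytc_example"].emotion)  # fear
```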
Raw LLM Response
```json
[
{"id":"ytc_Ugx5Svz1QIcCnfLwVM54AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzbKmRgaz-eGTRStEZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgwjcmjZQadIwzXkXm94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugz0dYg0tbVDLWrPJuZ4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgyFg_1TTaWmjnDDqLV4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxcEU2LLtz2SAZhveZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgyMbCiZgVlKM1RzlWp4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxxDyFryE81IAtw6Wd4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyHFb7v77zAOMFMdj94AaABAg","responsibility":"industry_self","reasoning":"virtue","policy":"none","emotion":"disapproval"},
{"id":"ytc_Ugy9nv6JMj1s4pwMh4J4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"}
]
```
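A raw response like the one above can be parsed and sanity-checked before its codes are stored. The allowed value sets below are inferred only from the samples visible in this excerpt; the real codebook may include additional values.

```python
import json
from collections import Counter

# Allowed values per dimension, inferred from the samples shown here;
# the full codebook may contain values not seen in this excerpt.
ALLOWED = {
    "responsibility": {"none", "user", "company", "ai_itself", "industry_self"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"none", "ban", "regulate", "liability"},
    "emotion": {"indifference", "fear", "approval", "outrage", "disapproval"},
}


def parse_codes(raw: str) -> list[dict]:
    """Parse a raw LLM response; keep only rows whose dimensions validate."""
    rows = json.loads(raw)
    return [
        row for row in rows
        if all(row.get(dim) in vals for dim, vals in ALLOWED.items())
    ]


# A one-row example in the same shape as the raw response above;
# the ID is a placeholder, not a real comment ID.
raw = ('[{"id":"ytc_x","responsibility":"user","reasoning":"consequentialist",'
       '"policy":"liability","emotion":"fear"}]')
codes = parse_codes(raw)
emotions = Counter(row["emotion"] for row in codes)
print(emotions["fear"])  # 1
```

Rows with an unrecognized value in any dimension are dropped rather than stored, so a malformed model response cannot silently pollute the coded dataset.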