Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (click a row to inspect):

- "I wonder to what extent the things important to human like love, friendship, com…" (`ytc_Ugz8TKA8O…`)
- "DONT LET CHRIS AND CORY NEAR THE TIFA AI OR DAVE NEAR THE MIKU AI…" (`ytc_UgzWo6mRw…`)
- "I think the timeline is off. As of today AI cannot reason and depending on who y…" (`ytc_UgwabpL54…`)
- "I just don't understand why barely anyone thought about using AI as a search eng…" (`ytc_UgyRq1o67…`)
- "We appreciate your feedback. In our live broadcasts on AITube, we showcase cutti…" (`ytr_UgyOX3HPt…`)
- "honestly i think only transhumanism is the solution to ai deserving rights and h…" (`ytc_UgiXwA6zw…`)
- "Code autocomplete isn't that great but AI Agents are *way* better. They can catc…" (`ytc_UgxaEPrTx…`)
- "You know hot take ai art should be put into some peoples process i mean when i e…" (`ytc_Ugxipbr1c…`)
Comment
> I don't believe that the technology itself is inherently unethical, but I do believe that greedy CEOs and corporations insist on taking unethical and even illegal shortcuts in order to make more money. You could train an AI using only public domain art, or art that has been commissioned for that purpose. But they want to scrape anything and everything in order to make the models "large" enough to make flashy, highly rendered looking results. The reason (if my own research is accurate) the previous versions of these AIs made such bad looking images was that they didn't have a big enough data set.
> If some big tech CEO wants to start paying artists to create work to train an actually ethical AI system, then sign me up, I could use the cash. Until that day, more lawsuits while the market goes from bad to worse.
Source: youtube | Video: Viral AI Reaction | Posted: 2024-09-16T14:4… | ♥ 116
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_Ugx9W7EnI8N8XYLI9ix4AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyQ-SCnhksejLqhLS94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzipBEmrSC8CAEGK5R4AaABAg","responsibility":"user","reasoning":"deontological","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgyFmpOrUHz6BEtC1jl4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyQtLDdZLqKNtz1EDl4AaABAg","responsibility":"none","reasoning":"contractualist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxqfMcvYDDpxiVZekh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzrTQP36J5Ke4QtAnh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxFfTC9pwjZYE30iOl4AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugzq_EDr82OabjVd_b94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugwvuwd-Dckf7VA-Ph14AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"}
]
```
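The lookup described above (finding the coded record for a given comment ID inside a raw batch response) can be sketched in Python. The `ALLOWED` vocabulary below is reconstructed only from the values visible in this sample batch, not from the project's actual codebook, and `index_batch` is a hypothetical helper name:

```python
import json

# Allowed values per coding dimension, inferred from this sample batch only;
# the real codebook may define additional categories (assumption).
ALLOWED = {
    "responsibility": {"company", "user", "none", "ai_itself"},
    "reasoning": {"deontological", "consequentialist", "virtue",
                  "contractualist", "unclear"},
    "policy": {"regulate", "industry_self", "none"},
    "emotion": {"outrage", "indifference", "approval", "fear", "resignation"},
}

def index_batch(raw: str) -> dict:
    """Parse one raw batch response and index its records by comment ID,
    skipping any record whose value falls outside the allowed vocabulary."""
    by_id = {}
    for rec in json.loads(raw):
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            by_id[rec["id"]] = rec
    return by_id

# One record copied verbatim from the batch above.
raw = ('[{"id":"ytc_Ugx9W7EnI8N8XYLI9ix4AaABAg","responsibility":"company",'
       '"reasoning":"virtue","policy":"regulate","emotion":"outrage"}]')
coded = index_batch(raw)
print(coded["ytc_Ugx9W7EnI8N8XYLI9ix4AaABAg"]["emotion"])  # outrage
```

Filtering out-of-vocabulary records at parse time is one way to surface coding errors early instead of letting malformed labels flow into the analysis.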