Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples:

- ytr_Ugy2Hmdmv… : "u dont have to use ai, you can just practice or use tutorials from other artists…"
- ytc_UgykK8ccW… : "The AI will take over the world and it's due to greed from a selective group of …"
- ytc_Ugj4vS6AR… : "Man, fuck robots, Robots literally have no empathy, it's all algorithm with a fi…"
- ytc_UgxbeT-Il… : "Don’t think he actually understands A.I. if he believes what he’s saying. It’s l…"
- ytc_UgwYs8it3… : "Rule 1 / Answer with only one word / Rule 2 / Be simple and direct / Rule 3 / Hold nothing…"
- ytc_Ugwne1LaG… : "Bro how can you even call yourself an “ai artist” you did nothing but enter a pr…"
- ytr_UgzHM9tsm… : "Thanks for your comment! It's interesting to see how quickly technology evolves.…"
- ytc_UgxmKzan0… : "Another AI doomer video. sigh. AI isn’t that great right now. But it’s also not …"
Comment

> We already have AI in the form of a huge set of social rules, regulations and algorithms, which is a real technology that turns the behavior of people in society into one thoughtless mechanism. A chatbot will add little to this power of artificial social intelligence, except to introduce even more organized chaos and unfair decisions into it. And the field of free thinking for people will narrow even more.

Source: youtube · AI Governance · 2023-04-27T08:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | mixed |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyD0kjcSazMyZFF-ed4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugxeufd7gMXKY7PQcE54AaABAg","responsibility":"none","reasoning":"mixed","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgxLkTsA6HU4qGBoxBt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugxzkdo1EMWzh5Mdu7J4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxepxPFKY02MAc3Vsl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugw9PE4CZRyV8gRIzch4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugz2Uy23V-Z_KtnkA2J4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgzcQpMfzXcVtRv_iL94AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwBgjmftahMH1MuMmV4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugz_pcU0hUDed1GOkvR4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"approval"}
]
```
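Before a raw batch response like the one above is stored as a coding result, it can be parsed and checked mechanically. A minimal sketch in Python, assuming the allowed value sets below (inferred only from the codes visible on this page; the full codebook may define more categories):

```python
import json

# Allowed values per dimension, as observed in this batch.
# Assumption: these sets are illustrative, not the full codebook.
ALLOWED = {
    "responsibility": {"developer", "none", "ai_itself", "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"regulate", "industry_self", "ban", "none", "unclear"},
    "emotion": {"fear", "approval", "mixed", "indifference"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM batch response and check every record's codes."""
    records = json.loads(raw)
    for rec in records:
        if "id" not in rec:
            raise ValueError(f"record missing id: {rec}")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim}={rec.get(dim)!r}")
    return records

# Hypothetical single-record batch, shaped like the response above.
raw = ('[{"id":"ytc_example","responsibility":"distributed",'
       '"reasoning":"mixed","policy":"regulate","emotion":"fear"}]')
records = validate_batch(raw)
```

A record with a value outside the allowed set (or a missing `id`) raises `ValueError` with the offending comment ID, so a bad batch can be rejected before it reaches the coding table.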