Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
Paul Virilio said that when we invent a technology we invent the related disaster - we never stop to think about that, we lunge forward regardless. We know there will be a disaster but for some reason we wait for it to appear before we do anything about it and then cry that we didn't know. He also said in the 80s that the internet would have profound impacts on society, economy and democracy, which is turning out to be true - a highly complex disaster which it seems only he thought about. And now we have goons training LLMs on the web. What could possibly go wrong?
youtube
AI Governance
2026-01-11T07:4…
♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzesVC2fNbzorJKwEZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugy3xQz8zKD8ApbXqrB4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugyx48xRodopeGKrvFh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugzqugwni3NHFQBVHFJ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugz_giddtszVAZIAyeV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugwo8TctWPQa3lB_whx4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugy6vSx83zfa_kPliLF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxqStuTdkR57OWbGiB4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugz0PS3HBIl8EDk2F354AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugz7bkNKsymd12Igu4B4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
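A raw response like the one above is only usable if every record sticks to the codebook. The sketch below shows one way to parse and screen such a batch; it is a minimal illustration, and the `ALLOWED` sets are inferred solely from the values visible in this sample (the real codebook may permit other values), while `validate` and `raw` are hypothetical names, not part of the tool shown here.

```python
import json

# Allowed codes per dimension, inferred from the sample response above;
# the actual codebook may include additional values not seen here.
ALLOWED = {
    "responsibility": {"company", "government", "user", "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "indifference"},
}

def validate(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only records whose
    every dimension carries an in-schema value."""
    records = json.loads(raw)
    return [
        rec for rec in records
        if all(rec.get(dim) in codes for dim, codes in ALLOWED.items())
    ]

# Hypothetical one-record batch in the same shape as the response above.
raw = ('[{"id":"ytc_example","responsibility":"distributed",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]')
print(len(validate(raw)))  # 1
```

Records that fail the check can then be logged by `id` and re-queued for coding rather than silently written into the results table.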