Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "Maybe with cameras you could make a self driving car equally safe as Humans. But…" — `ytc_UgyP0c1rc…`
- "I don't trust humans or AI to make decisions for me. That should be MY choice. t…" — `ytr_Ugxc2Rep3…`
- "Another side, if an AI truly is concious we'll have already committed genocide t…" — `ytc_UgwoyQYkc…`
- "Just an FYI. Geoffrey Hinton still holds all his stock options in Google and suc…" — `ytr_UgwoBoRVb…`
- "Although it might seen like it our brain is vastly more complex and processes ev…" — `ytr_Ugw_pC8U9…`
- "The way in which you speak about AI in learning seems to lead to the idea that i…" — `ytc_UgzkUl5X6…`
- "AI is a cheap, lazy, unsatisfying way to make art, and I cannot contain my hatre…" — `ytc_Ugz4K3N14…`
- "If you want to protect your trade use physical mediums. Digital art is lazy and …" — `ytc_UgyDlgPF4…`
Comment

> I work in a VFX studio and people are using AI for storyboards. Which means directors dont have to think about what shots they want anymore. Which just means the AI is telling the humans how to do their job.
> Expect movies and series to become hollow and lifeless (not all but most).

youtube · Viral AI Reaction · 2025-02-20T15:3… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_UgxShiq0PNzGWtGg9gZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugw2-wtuWfDLuOjS8WZ4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwXJu7XpNyygVKnM7N4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyUmLzoDhld5lHhhQN4AaABAg","responsibility":"user","reasoning":"virtue","policy":"industry_self","emotion":"approval"},
{"id":"ytc_Ugw7cKmETC8zBQhXijh4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"approval"},
{"id":"ytc_Ugz_RHy99xhQZmlrjDZ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugx3bU_WLWoz0hOYGWJ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgylqAIfjuP1VBCLkWt4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyzdXe2ei5y3WFqQr54AaABAg","responsibility":"user","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzcOHnHAlwAXAV0gbl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
```
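A response in this shape can be parsed and checked before the codes are stored. The sketch below is a minimal example, not the tool's actual code: the dimension names come from the response above, but the sets of allowed values are assumptions inferred only from the values observed in this one batch.

```python
import json

# Allowed values per dimension -- ASSUMED, inferred from the sample
# response above; the real codebook may define more values.
SCHEMA = {
    "responsibility": {"ai_itself", "distributed", "none", "user", "company", "unclear"},
    "reasoning": {"consequentialist", "mixed", "unclear", "virtue", "deontological"},
    "policy": {"none", "regulate", "industry_self", "liability", "unclear"},
    "emotion": {"fear", "outrage", "indifference", "approval", "mixed"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response (a JSON array of records) and
    validate that every record has an id and an in-schema value for
    each dimension. Raises ValueError on any violation."""
    records = json.loads(raw)
    for rec in records:
        if "id" not in rec:
            raise ValueError(f"record missing 'id': {rec!r}")
        for dim, allowed in SCHEMA.items():
            value = rec.get(dim)
            if value not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim!r} value {value!r}")
    return records

# Example: one well-formed record passes validation.
raw = '[{"id":"ytc_x","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}]'
print(parse_coding_response(raw)[0]["emotion"])  # fear
```

Rejecting out-of-schema values at parse time is what keeps a hallucinated code (e.g. an emotion the codebook never defined) from silently entering the coded dataset.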