Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
I have always said that AI is basically an animal. We can train it to do things we want, but we don't necessarily understand how they think. That's amplified on a new level with AI as they don't have organic brains.
In Avengers Age of Ultron (Ultron was iron man's rogue ai), Ultron was tasked with creating "peace" but since he had access to the entire Internet, he saw that we humans were creating that conflict. He did what was asked, but the task wasn't specific enough.
Source: youtube · AI Moral Status · 2025-12-13T23:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_Ugy3Nsx3wS4QbUGJV_V4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugw7ZwvTw8q1zZJAAO14AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgyQfogh3AGFmEwCSGV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgzU0fV1NeUvr-61sot4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugx4xQSNteWbWns79CV4AaABAg","responsibility":"user","reasoning":"virtue","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwY9zb_WhLwERi30yd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugwzu0QsOm94rOruY9V4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugzo025LHan_gI4yES94AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugw7SrVTIBAWrTO06_d4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"industry_self","emotion":"outrage"},
{"id":"ytc_Ugwky67vzo41qcXNS514AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
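As a minimal sketch of how a raw response like the one above could be turned into coding records: the snippet below parses the JSON array and keeps only rows whose values fall within the categories visible in this page's table and response. The `ALLOWED` sets are inferred from the examples shown here, not from the real codebook, which may define more categories.

```python
import json

# Allowed values per coding dimension. NOTE: inferred from the rows shown
# on this page; the actual codebook may include additional categories.
ALLOWED = {
    "responsibility": {"developer", "user", "company", "ai_itself", "distributed", "none"},
    "reasoning": {"deontological", "consequentialist", "virtue"},
    "policy": {"regulate", "ban", "liability", "industry_self", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "mixed", "indifference", "unclear"},
}

def parse_coding(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only rows with valid codes."""
    rows = json.loads(raw)
    return [
        row for row in rows
        if all(row.get(dim) in values for dim, values in ALLOWED.items())
    ]

# Example with one valid row (hypothetical ID):
raw = '[{"id":"ytc_x","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"mixed"}]'
print(len(parse_coding(raw)))  # 1
```

Rows with an unknown value in any dimension are dropped rather than coerced, so malformed model output never reaches the coded table silently.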