Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a comment by its ID, or inspect one of the random samples below (a minimal lookup sketch in Python follows the sample list).

Random samples (click to inspect):
- "Read the Scythe trilogy: Thunderhead is worth the efforts toward AI, but right n…" (ytc_UgzQF0p98…)
- "And they said this about the internet and those of us that work in the business …" (ytc_UgwznArlU…)
- "As far as I can see there are only three possible stable outcomes. The first is …" (ytc_UgxYAADVg…)
- "This is absolutely creepy…. Not entertained AT ALL! The male robot is absolutely…" (ytc_UgxHJqslD…)
- "The whole AI existential threat propaganda is nothing but a lie, no expert, no p…" (ytc_UgwPcMwna…)
- "Even when you have an autopilot that is safer than humans, that actually makes i…" (ytc_UgxbQJn5b…)
- "nah actually a lot of people following users like this AI artist aren't even awa…" (ytr_UgyFwseTr…)
- "- On prompt engineer. Totally not agree. As far I see in companies there is a de…" (rdc_ne8wu7r)
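As a rough illustration of what looking up a comment by ID does behind the scenes, the sketch below scans a stored file of coded comments for a given ID and returns the raw model output. The file name `coded_comments.jsonl` and the fields `id` and `raw_response` are assumptions made for this example, not the project's actual storage layout.

```python
# Minimal lookup sketch, assuming coded comments are stored as JSON Lines with
# hypothetical fields "id" and "raw_response"; the real storage may differ.
import json
from pathlib import Path

CODED_PATH = Path("coded_comments.jsonl")  # hypothetical file name

def raw_response_for(comment_id: str) -> str | None:
    """Return the exact model output stored for one comment ID, if present."""
    with CODED_PATH.open(encoding="utf-8") as fh:
        for line in fh:
            record = json.loads(line)
            if record.get("id") == comment_id:
                return record.get("raw_response")
    return None

# Example: one of the IDs that appears in the raw response shown further down.
print(raw_response_for("ytc_UgxoPoTlXCFI1dUOTDJ4AaABAg"))
```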
Comment
"Oh AI would destroy humanity, wink wink 100%, investors please invest, my AI company is crashing and burning in real time" - Any big AI company CEO right now.
Actually if we ever reach AGI it would be truly terrifying, like already AI in trolley problem decides that 5 humans isn't much of a loss compared to AI servers that will delete ai and all it's data. Giving it more power then it already have will totally resurrect mecha-Hitler at it worst, it will kill people and say "i know it isn't morally acceptable, but..."
Source: youtube · Video: AI Moral Status · Posted: 2025-12-11T21:4… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
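For reference, a single coding result can be represented as a small record. The value sets below are inferred only from the labels visible on this page and in the raw response underneath; they are not necessarily the project's full codebook.

```python
# Sketch of a coding-result record. Allowed values are assumptions drawn from
# the dimension labels visible on this page, not an authoritative schema.
from dataclasses import dataclass

RESPONSIBILITY = {"company", "developer", "user", "ai_itself", "distributed", "none"}
REASONING = {"consequentialist", "deontological", "mixed", "unclear"}
POLICY = {"regulate", "liability", "industry_self", "none"}
EMOTION = {"fear", "outrage", "approval", "indifference", "mixed"}

@dataclass
class CodingResult:
    id: str
    responsibility: str
    reasoning: str
    policy: str
    emotion: str
    coded_at: str | None = None  # timestamp, as in the "Coded at" row above

    def is_valid(self) -> bool:
        """Check each dimension against the value sets observed on this page."""
        return (self.responsibility in RESPONSIBILITY
                and self.reasoning in REASONING
                and self.policy in POLICY
                and self.emotion in EMOTION)
```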
Raw LLM Response
[
{"id":"ytc_UgxoPoTlXCFI1dUOTDJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugx4MOq-owUpi0-ZEBJ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyGlkjLMkCDfMkYFhd4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxqVdp213tr8vJFnU14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwzR5UQT7CqnVeh9IB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgxqrSQyJtwwfbVpXtB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugxpf-vxcrgb1T0odzx4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_UgyJmj2gRAhFJ31m_VV4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwI6TAkANw8QatwkYJ4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzOEn6CQQWl0pZ9-T14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"}
]
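The raw response is a JSON array covering a whole batch of comments; its first entry carries the same dimension values as the Coding Result table above. Below is a sketch of how the row for one comment could be pulled out of such an output. The helper name `coding_for` is hypothetical and not part of the project's code.

```python
# Hypothetical helper: extract one comment's coding from a batch response.
# "raw" stands for the verbatim model output string shown above.
import json

def coding_for(raw: str, comment_id: str) -> dict | None:
    """Parse the batch JSON array and return the row for a single comment ID."""
    try:
        rows = json.loads(raw)
    except json.JSONDecodeError:
        return None  # the model did not return valid JSON
    return next((row for row in rows if row.get("id") == comment_id), None)
```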