Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect:

- "lightening the load of animators". More like, we will fire 90% of our animators… (ytc_Ugwbn6UTG…)
- AI does not exist. A machine learning model will never be able to think, so it w… (ytc_UgwDyBwOF…)
- Sooner or later they will update this robot with a modern LLM, like everything e… (ytc_UgwyNBe85…)
- one day the US president will also be an artificial intelligence entity. mark my… (ytc_UgjpHbD1c…)
- Actually, AI is already being used. AI is in yhe expansion phase now. There are … (ytc_UggRJIcZV…)
- Talk to a friend today when talking about movies mentioned the writers/actor str… (ytc_UgxDzVo_u…)
- Because "AI" is often just a short-hand for using huge amounts of computational … (rdc_fakmnp3)
- I don’t understand why people fear ai doing this stuff, people would do the same… (ytc_UgxK-Og8i…)
Comment

> Super intelligence is oversold; it’s not even conclusive it’s possible. However, what is overlooked is that you don’t need AGI to destroy the world — just sufficiently efficient AI that is poorly or malignantly aligned. When the next Kazcynski decides to weaponize AI to take down the banking and finance industry, the world as we know it is effectively over. We are much closer to that reality than to super intelligence.

Source: youtube · AI Moral Status · 2025-12-07T19:1… · ♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response

```json
[
  {"id":"ytc_Ugyuk1hBtKCsoVIMlGV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxCMM_nHx7vx3CUi4B4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzVveglUuOdEOPDz0Z4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxX7TxDaYQ34a1_0RB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxtOHpYkiOjd13ruUR4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzysJ0DzXsAajmE7B54AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgwAYhjcm3oJXCmDcaR4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxqlRre80WFcsF3yyF4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgzNGyLesXo-3GaVwj94AaABAg","responsibility":"user","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwRHE4F_qhgUGRtXdJ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
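Each raw response is a JSON array of records with an `id` plus the four coded dimensions (`responsibility`, `reasoning`, `policy`, `emotion`). A minimal validation sketch is shown below; the allowed values are only those observed in the responses on this page, so the real codebook may define additional categories:

```python
import json

# Dimension vocabularies observed in the raw responses above.
# Assumption: the actual codebook may include further categories.
ALLOWED = {
    "responsibility": {"company", "developer", "user", "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "mixed"},
    "policy": {"regulate", "liability", "none"},
    "emotion": {"fear", "outrage", "approval", "resignation", "indifference", "mixed"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse one raw LLM response and keep only records that have an id
    and an in-vocabulary value for every coded dimension."""
    clean = []
    for row in json.loads(raw):
        if not isinstance(row, dict) or "id" not in row:
            continue  # malformed record
        if all(row.get(dim) in vals for dim, vals in ALLOWED.items()):
            clean.append(row)
    return clean

# Hypothetical two-record response: the second row uses an
# out-of-vocabulary responsibility value and is dropped.
raw = (
    '[{"id":"ytc_x","responsibility":"company","reasoning":"mixed",'
    '"policy":"none","emotion":"fear"},'
    '{"id":"ytc_y","responsibility":"society","reasoning":"mixed",'
    '"policy":"none","emotion":"fear"}]'
)
print([r["id"] for r in validate_batch(raw)])  # → ['ytc_x']
```

Dropping out-of-vocabulary rows rather than raising keeps a single malformed record from invalidating the whole batch, which matters when responses are coded ten comments at a time as above.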