Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a comment by ID, or inspect one of the random samples below:

- "Well we actually do have some A.I. that has, technically, passes the Turing Test…" (`ytc_Ugjmo31Jk…`)
- "The eyes are often hard to read, they have improved on the hands but can still b…" (`ytc_Ugwp-ZVcB…`)
- "I think we need to stop using human terms to describe what ai does. No, ai does …" (`ytc_UgzIz6Q7x…`)
- "It’s not. China would love to know how much power Americans consume so they coul…" (`ytr_UgyzjNFe_…`)
- "@BrendanDell No, I don't think LLMs are going to get us to AGI, what I do think …" (`ytr_UgxBmPS9e…`)
- "AI Guru: AI could be an existential threat to humanity. Us: So are you going to …" (`ytc_Ugzxy1_C3…`)
- "Musk warning against AI and at the same time developing it . We're in trouble. …" (`ytc_Ugxpj6MJG…`)
- "simple, make a most favorable employee rule, to tax and cap the profit of pure A…" (`ytc_Ugw4g2d8i…`)
Comment

> Could we use AI to make ourselves better than AI? But then AI wouldn’t want to render itself useless… Would we then give the AI incentive to enhance humans if it insured its own survival? Are we going to meld humans with AI so as to keep humans relevant? We could become one with AI and then it would be a win win situation right?

Source: youtube · AI Moral Status · 2026-04-14T03:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | contractualist |
| Policy | regulate |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response

```json
[
  {"id":"ytc_Ugx5OnZa-cOzNQd4W9d4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwE16oMSt2kP5xk-yx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgyDfiu76bJDjgsY8yJ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwnA6he2Se3olS1Y0B4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwUIYGzi0H4-cDj6rB4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgxggAXl5fT5ynj0a_54AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_Ugxvg52CT7jUuTbKtkp4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugx0cXzadS64JCVy_7B4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgzS7LYb1R7VfBCs4DN4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzZOu2zBAUxO1ZgVtN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"}
]
```
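The coding result shown above is one row pulled out of this raw JSON array. A minimal sketch of how such a response might be parsed and indexed by comment ID, assuming the model returns a valid JSON array of objects with the four coding dimensions (the `index_codings` helper and the truncated two-entry sample below are illustrative, not the project's actual code):

```python
import json

# Hypothetical raw model response, truncated to two entries for illustration.
raw_response = """
[
  {"id":"ytc_Ugx0cXzadS64JCVy_7B4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgzS7LYb1R7VfBCs4DN4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
"""

DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codings(raw: str) -> dict:
    """Parse the model's JSON array and index codings by comment ID."""
    entries = json.loads(raw)
    coded = {}
    for entry in entries:
        # Keep only the four coding dimensions; default missing ones to "unclear".
        coded[entry["id"]] = {d: entry.get(d, "unclear") for d in DIMENSIONS}
    return coded

codings = index_codings(raw_response)
print(codings["ytc_Ugx0cXzadS64JCVy_7B4AaABAg"]["policy"])  # regulate
```

Indexing by ID this way makes the lookup-by-comment-ID view a single dictionary access, and defaulting absent dimensions to "unclear" keeps a partially malformed entry from breaking the whole batch.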