Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "This Person scares me, on the one hand he is talking about, we can't stop it, on…" (ytc_UgyNH6e0z…)
- "Right, I feel like there has been a creeping trend to remove the bottom rungs of…" (rdc_lm0prml)
- "It doesnt work like that. If it does end up happening that AI will take over mo…" (ytr_Ugz6P_Uy0…)
- "So wait we are the most intelligent beings on earth yet we want to make/have mad…" (ytc_UgxeRMY7t…)
- "@angelsegarra1135 . That statement is true. But I don't see that applying to a…" (ytr_UgwcHqgDZ…)
- "@GoogleVideoMan You think that's great? Think of surveillance footage implicatin…" (ytr_Ugz0qMRIS…)
- "Humanity is overrated .if we are stupid enough to create a AI that will wipe us…" (ytc_UgwDeGlpP…)
- "2:04 in 40 years a robot is about to nuke every country in the world saying: 'Ok…" (ytc_UgxtNpbxg…)
Comment
even seeing AI just as a multiplier for a single humans power to influence the world, this will lead to single humans using the AI for bad things. the multiplier will be higher with higher intelligence of the AI. no need for the AI itself to have bad thoughts or deciding the fate of humanity
youtube · AI Moral Status · 2025-11-02T18:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgyjZyTJQdV33bw0vop4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwCMEtyTtZwynwkXrV4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugxh3riF0-4UK4etQ0d4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"industry_self","emotion":"resignation"},
{"id":"ytc_Ugw7UPSqMIu1xFiIUSl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugzrp8HbL5oyccS7tDh4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyJZ5WYBWtWhye6KXN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugw7_T-EMPRxzTRgF_N4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugy2ds2xE56wcAnbRrZ4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgwRM1UtUh06iVVjG654AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugw22W8hz_3dOr8fC7h4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```