Raw LLM Responses
Inspect the exact model output for any coded comment. Look up a comment by its ID, or click one of the random samples below to inspect it.
Random samples
- "basically, we are compared to a dog and cat or less to a ai, wonder why we are l…" (ytc_Ugwhmv3Xy…)
- "AI has no real money 💵(people have that), so it can't afford it, so ultimately t…" (ytc_Ugx-SlaD8…)
- "Everyone who works with AI knows it’s potentially very dangerous, so it could be…" (ytr_UgzdLHG_m…)
- "why does he say \"i'm not an expert on any of this stuff\" (12.20 min) ? He 's the…" (ytc_UgwSNadbm…)
- "Space communism post-scarcity is the goal. AI will either kill capitalism or cr…" (ytc_UgyGq2wHp…)
- "They need to figure this out because the only thing thats gonna happen is you're…" (ytc_UgzvXW0W1…)
- "The \"utopia\" is to dismiss workers for automated systems and robots, \"so the wor…" (ytc_UgyEpkFFv…)
- "In a culture where people can be easily conditioned, Rights and Freedoms are fan…" (ytc_Uggck2sYd…)
Comment

> Does a knife, or a gun, or a bomb have a conscience? No. AI is no different, it will simply do whatever it is programmed to do.

youtube · AI Moral Status · 2023-07-10T12:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgydJbJKo8ufkdRqvHt4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwDP2WR_DVLf3zx1Xd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyQi0_XwyQtqgtTHRF4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugxyj102BVmQ568kHpp4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxPXDEYCmsmHFzN2ch4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzwLN2NWGJK0A6-_8l4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugy4eLPd6-BNvp0rolJ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugz_SNDRUYog9B8RTtd4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxBP-dKR1Wgs1gR5hl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxVG39wDqRguoDnW_14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
```
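A minimal sketch of how a raw response like the one above could be parsed and sanity-checked before the codes are stored. The allowed value sets below are assumptions inferred from the labels visible on this page (they may not be the full codebook), and `parse_coding_response` is a hypothetical helper, not part of any tool shown here.

```python
import json

# Dimension values observed in the coding results on this page.
# ASSUMPTION: inferred from visible labels, not an exhaustive schema.
ALLOWED = {
    "responsibility": {"developer", "user", "ai_itself", "distributed"},
    "reasoning": {"deontological", "consequentialist", "unclear"},
    "policy": {"ban", "none", "unclear"},
    "emotion": {"indifference", "fear", "outrage", "approval", "mixed"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response, keeping only well-formed records."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        # A record needs a comment ID and an allowed value on every dimension.
        if "id" not in rec:
            continue
        if all(rec.get(dim) in values for dim, values in ALLOWED.items()):
            valid.append(rec)
    return valid

# Usage: the second record carries an out-of-vocabulary "responsibility"
# value, so it is dropped rather than stored.
raw = '''[
 {"id":"ytc_UgydJbJKo8ufkdRqvHt4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
 {"id":"ytc_example_bad","responsibility":"nobody","reasoning":"unclear","policy":"unclear","emotion":"fear"}
]'''
print(len(parse_coding_response(raw)))  # → 1
```

Filtering rather than raising keeps one malformed record from discarding an entire batch; rejected records can be logged and re-coded separately.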