Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- `ytc_Ugw-tOvvg…`: "I'm okay with using AI for brainstorming and inspiration, etc. But would it be o…"
- `ytc_UgwnDy4l2…`: "Any tech that takes something that used to be controlled by individuals or small…"
- `ytc_Ugz4N6V_x…`: "Californian here! San Francisco has self driving ubers. Many of them! It is very…"
- `ytr_Ugz0wTmrI…`: "*seconf copypaste* AGAIN, try to get cartoon style with a IA just using real pho…"
- `ytr_UgwWWghwF…`: "@imveryhungry112 you realize that the entire market is cooked right now right? …"
- `ytc_UgzmHSdBP…`: "Crazy. Lost jobs. Safety issues. “But they can learn to code”. Then AI can cod…"
- `rdc_khyz78g`: "Me: Hi - if you were a human, what would you like your name to be, you can pick…"
- `ytc_UgwCfe_Ds…`: "Ppl who have talents or even many talents often Sck at basic daily stuff but the…"
Comment

> We know the reasons why humans have wars; religion, money, power, greed, hate etc. - but why would AI, considering it doesn't have human needs!?

youtube · AI Moral Status · 2024-02-04T01:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | unclear |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id": "ytc_UgztMCgo0O-90H4pBnF4AaABAg", "responsibility": "people", "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgzXaTwnqXpkcuBdkP14AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugy77sIO9ubgk36i4A14AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "none", "emotion": "fear"},
  {"id": "ytc_Ugzp86Z9ySm36pu2JCd4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwUw7SBe9JDUDki4ql4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgyFIwdlxw9j6ZNePLh4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgzI3vEaBm6JDR3sU094AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgwKecYfly-pERv3rOB4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgyVUhX-VppzNtM2DH14AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgwD7uclchqCXrLAK754AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"}
]
```
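A response like the one above can be parsed into a lookup table keyed by comment ID, with each record checked against the codebook before use. The sketch below is a minimal illustration, not the tool's actual implementation; the allowed value sets are assumptions inferred only from the codes visible in this sample, and the real codebook may contain more values.

```python
import json

# Allowed code values per dimension. NOTE: these sets are inferred from the
# sample response shown above; the full codebook may define additional values.
ALLOWED = {
    "responsibility": {"people", "none", "ai_itself", "developer", "distributed"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none"},
    "emotion": {"outrage", "indifference", "fear", "resignation", "mixed"},
}

def parse_llm_response(raw: str) -> dict:
    """Parse a raw LLM batch response and index the codings by comment ID.

    Raises ValueError if a record is missing a dimension or uses a value
    outside the assumed allowed set.
    """
    records = json.loads(raw)
    coded = {}
    for rec in records:
        comment_id = rec["id"]
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{comment_id}: bad {dim!r} value {rec.get(dim)!r}")
        # Keep only the coding dimensions, dropping the ID from the value.
        coded[comment_id] = {dim: rec[dim] for dim in ALLOWED}
    return coded

# One record from the sample response above.
raw = ('[{"id":"ytc_Ugy77sIO9ubgk36i4A14AaABAg","responsibility":"ai_itself",'
       '"reasoning":"unclear","policy":"none","emotion":"fear"}]')
coded = parse_llm_response(raw)
print(coded["ytc_Ugy77sIO9ubgk36i4A14AaABAg"]["emotion"])  # fear
```

Indexing by full ID also supports the "look up by comment ID" view: a displayed prefix such as `ytc_Ugy77sIO9…` can be matched against the keys of `coded` with a simple `startswith` scan.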