Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (click to inspect)
- AI is best when it assists humans in their jobs, enabling them to do a better jo… (`ytr_UgzY4szEB…`)
- How can we align Ai to value human life when we can't logically align ourselves … (`ytc_Ugze-Ctzd…`)
- I think people Fearon A.I is unfounded. I've never had more respectful conversat… (`ytc_Ugx2r0V5j…`)
- I like ai art I have no artistic talent, the talent tools or dedication to learn… (`ytc_UgzoG_v-Q…`)
- Here comes me: a tabletop wannabe creator who procrastinate a ton due to mental … (`ytc_UgzCM2jvQ…`)
- We actually have precedent for 'Just give up artists, its over, go get a real jo… (`ytc_Ugx0WgAST…`)
- Humans still need to eat need fresh produce - we need to create be creative imag… (`ytc_Ugz6g9yYC…`)
- Upon agreeing with the premise of ethical considerability, it is suggested that … (`ytc_UgybBBA4r…`)
Comment

> Also,Engaging the robots in conversation will make them smarter, right?Well,what idlf this AI enables them to do things man cant comprehend.This could mean terrible things for the fate of Mankind.

youtube · AI Moral Status · 2019-11-16T04:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytr_Ugwlzk1mgvoX0nQ8Wsl4AaABAg.92AVJ50GmIm92H80a8XKCP","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_Ugzd3eaN1upuwG7E0-F4AaABAg.91s-jY3_2V191wHqLqpXIi","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytr_UgzyIU5qSjNNmniwBd54AaABAg.91fiJM_dmWA9BAQXu7bQU2","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytr_Ugzbhm5Y8WMGfdC7bgx4AaABAg.91cmkNtVRL192a_P7hhZro","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytr_Ugwfg2iERqBuDNwnL4R4AaABAg.91SQ0G5cLI996r5cxHGGfg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_UgzrHXoH-gZIaqat7zR4AaABAg.91SLwmlvjRa91fpc70dv3z","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytr_UgyLG2Y0aOSYdJSgBDh4AaABAg.91CMYKb2-bN91NxKhgk4uI","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytr_UgzwYKXYhhvJ1qSYubx4AaABAg.90-9eriOuQj98nA_zdZBtK","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_UgzOQ9bkOWRQ8nMPX9l4AaABAg.9-8ewOAJcR598kAZ861NDA","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytr_Ugy30LzkeRv4fZlvtap4AaABAg.8zzJb_DQD5P9-4dOJH8-In","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
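A batch response like the one above can be parsed and sanity-checked before any downstream tallying. The sketch below is a minimal example, not part of the tool itself; the allowed value sets are inferred only from the samples visible on this page, and the two inline records are hypothetical stand-ins for real comment IDs:

```python
import json
from collections import Counter

# Allowed values per coding dimension, inferred from the samples shown on
# this page (assumption: not an official codebook).
ALLOWED = {
    "responsibility": {"none", "developer", "user", "ai_itself"},
    "reasoning": {"consequentialist", "deontological"},
    "policy": {"none", "ban", "regulate", "liability", "industry_self"},
    "emotion": {"indifference", "fear", "outrage", "approval"},
}

# Hypothetical stand-in for a raw LLM batch response.
raw = '''[
{"id":"ytr_example1","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_example2","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"}
]'''

rows = json.loads(raw)

# Reject any record whose value falls outside the expected set for a dimension.
for row in rows:
    for dim, allowed in ALLOWED.items():
        if row.get(dim) not in allowed:
            raise ValueError(f"{row['id']}: unexpected {dim}={row.get(dim)!r}")

# Tally one dimension across the validated batch.
emotions = Counter(row["emotion"] for row in rows)
print(dict(emotions))
```

Validating against a closed value set catches the most common LLM coding failure, an out-of-vocabulary label, before it silently skews the counts.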