Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a comment directly by its ID.
Random samples

- "If you’re trying to cause ChatGPT to experience a moral dilemma, you will surely…" — ytc_UgyOrsnF_…
- "This is the kind of compassion I feel like we need to change the world. When we'…" — ytc_UgwjsDuI_…
- "As a god father of u couldn't realise A.I can be dangerous then u are dumb…" — ytc_UgyZXzNvt…
- "I understand why a lot of artists, designers, and creatives are upset about AI a…" — ytc_Ugx3riGwC…
- "The economist discusses how rapid AI-driven automation could displace large swat…" — ytc_UgwHCYW0q…
- "I hope people realize that you can’t drink data or AI. Hopefully before it’s too…" — ytc_UgwrLtlZl…
- "It's best to just go offline at this point with the way AI is going…" — ytc_UgzVZwIFx…
- "On a social level one thing that I think is absolutely needed right now is laws …" — ytc_Ugw0Ep9q0…
Comment

> The interesting part is if we need sentient AI. We are making interaction trees that are so complex they mimic real human actions and reactions. At this point, the bot is not a sentient AI, but a very very good mirror. Is that all we need? Does that just codify all human flaws in the logical matrix?

Source: youtube · Video: AI Moral Status · Posted: 2022-07-03T16:0… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytr_UgzUVP-HLdpSRpOpGoh4AaABAg.9cxxpa_YNcL9d15rZcTfTC","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
{"id":"ytr_Ugzm29eb5irZwTGdRb54AaABAg.9cxQNj2vvwr9cxQdOD2E5c","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytr_UgzWCLP_MGeYEvo-ZSZ4AaABAg.9cwlKWYB1CZ9cwuv66hLml","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytr_Ugwwo9fg5F91LZEsrpt4AaABAg.9cwcZ32_YYx9cwywqgpuWo","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytr_UgwXa9BdHEos_MSaGr94AaABAg.9cw_LKpsvPM9d0o-I6uAH4","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"resignation"},
{"id":"ytr_UgwEe3LqAhE3L6uEYox4AaABAg.9cwNK7uwXpI9cwwFKaA1JJ","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytr_UgxipCxuHgHmnIwQ5i94AaABAg.9cwMWXiaRRA9cwzWJHtgTT","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytr_UgzozbJAr_Arpi4lBh94AaABAg.9cv_e56zKP99cwC1R7uGH5","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"fear"},
{"id":"ytr_Ugxwr3HE9tHNcLZWV5p4AaABAg.9cv_Gf1Nz-b9cxcCl95H3t","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytr_UgxLrHC8r5SAY0SFq7l4AaABAg.9cvHiA0BK6g9d7mYjKMZxY","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}
]
```
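The raw response above is a JSON array with one record per comment, each carrying the four coding dimensions from the table (responsibility, reasoning, policy, emotion). A minimal sketch of how such a batch could be parsed and sanity-checked is below; the allowed value sets are inferred from the visible records and are an assumption, not the project's official codebook.

```python
import json

# Two records copied from the raw LLM batch response shown above.
raw = """
[
{"id":"ytr_UgzUVP-HLdpSRpOpGoh4AaABAg.9cxxpa_YNcL9d15rZcTfTC","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
{"id":"ytr_UgwEe3LqAhE3L6uEYox4AaABAg.9cwNK7uwXpI9cwwFKaA1JJ","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
"""

# Assumed value sets per dimension, inferred only from the sample output.
SCHEMA = {
    "responsibility": {"none", "unclear", "developer", "company", "ai_itself"},
    "reasoning": {"mixed", "unclear", "consequentialist", "deontological"},
    "policy": {"unclear", "regulate"},
    "emotion": {"indifference", "mixed", "resignation", "fear", "outrage"},
}

def validate(records):
    """Map comment ID -> coded values, rejecting malformed records."""
    coded = {}
    for rec in records:
        cid = rec["id"]
        for dim, allowed in SCHEMA.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{cid}: unexpected {dim}={rec.get(dim)!r}")
        coded[cid] = {dim: rec[dim] for dim in SCHEMA}
    return coded

coded = validate(json.loads(raw))
print(len(coded))  # 2 records pass validation
```

Keying the result by comment ID mirrors the "look up by comment ID" view: once validated, the coding for any comment can be fetched directly from `coded`.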