Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
Bro don‘t hate the ai they just do what they are supposed to do hate the one tha…
ytc_UgzLyD73Z…
AI is LITERALLY NOT EMPATHETIC. That implies the ability to FEEL and AI DOES NOT…
ytc_UgyKB11R8…
AI looks like a landfill compared to digital, i mean digital is basically easier…
ytc_Ugyn10Emf…
Not sure I fully agree. In top 15 we have the U.S, Japan, India, Brazil, Canada,…
rdc_emprz95
No one talking about how they want to control the worlds population with this te…
ytc_Ugym4qjEU…
I was thinking "oh shes pretty" right before they took her face off, and then I …
ytc_UgzqWgXZG…
I’m just an everyday man and when you say most people have only been looking at …
ytc_UgyOkZ-8g…
AI has a long way to go before we need to do something as it is still in the chi…
ytc_UgwT7RrXD…
Comment
Don't give Robots emotions . Emotions can play both a positive and negative impact on their thinking affecting their decisions because if you give robot emotion like love he may love one person more and second person less he may do more things for one whom he love and less things for unloving person so they are just acting exactly same like humans . Don't give them any emotions .
youtube
AI Moral Status
2021-04-27T03:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[{"id":"ytc_UgzBSdO-QmX3hTsLH3Z4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwEh7ZIvAetcHvKD4d4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwdwpJBv13tHIU0Dvp4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyuPM_DWuMRhuZKi414AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgxnG919xDfwG7AlM5l4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxaWKQoOiXKnQpxCWp4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugy4DgMzv3LNXzIwZUF4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgxHVnpkiK1tJBAo44d4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyL8-IaW13dE8l_Eit4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugw-pVhQwn8HCJLYEmt4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"}]
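A minimal sketch of how a raw response like the one above can be parsed to power the look-up-by-comment-ID view. It assumes the model returned a valid JSON array with the four coding dimensions shown in the result table; the function name `parse_codes` and the fallback to `"unclear"` for missing dimensions are illustrative choices, not the tool's actual implementation.

```python
import json

# Two records excerpted from the raw LLM response above.
raw = (
    '[{"id":"ytc_UgzBSdO-QmX3hTsLH3Z4AaABAg","responsibility":"unclear",'
    '"reasoning":"mixed","policy":"unclear","emotion":"mixed"},'
    '{"id":"ytc_UgwdwpJBv13tHIU0Dvp4AaABAg","responsibility":"developer",'
    '"reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]'
)

# The four coding dimensions from the result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def parse_codes(raw_text):
    """Parse a raw model response into {comment_id: {dimension: value}}.

    Dimensions the model omitted are filled with "unclear" so every
    record has the same shape (an assumption, not the tool's rule).
    """
    codes = {}
    for record in json.loads(raw_text):
        codes[record["id"]] = {
            dim: record.get(dim, "unclear") for dim in DIMENSIONS
        }
    return codes

codes = parse_codes(raw)
# Look up by comment ID, as in the panel above.
print(codes["ytc_UgwdwpJBv13tHIU0Dvp4AaABAg"])
```

For the second ID this reproduces the "Coding Result" table: responsibility `developer`, reasoning `consequentialist`, policy `regulate`, emotion `fear`.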