Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "For the people that are defending it - If you think it should be fine to watch A…" (ytc_UgwIJjRzN…)
- "No really. Robots do what they are programmed to do. Computers compute things an…" (ytr_UgwNQ-rde…)
- "Why would you want to have kids when the world is collapsing? I don't get it.…" (rdc_emof6g2)
- "Please don’t have kids unless you can financially support them in a jobless worl…" (ytc_Ugx2XZme3…)
- "The anti-AI movement is destined to fail. It is already out there. There is no p…" (ytc_UgzrucSVA…)
- "Yeah, I remember I was looking for images of gorillas for my animation class and…" (ytc_UgzyKIshz…)
- "Ai screen time . Not an Android . When you will see the Android it wil be your …" (ytc_UgzgnUuU-…)
- "Robots sitting at desks and typing on a keyboard looking at a screen is not real…" (ytc_Ugzt8HjKL…)
Comment
I think emotions and intelligence are two different things. I think emotions can be taught the same way how intelligence can be taught, but to a degree, not everybody can learn the same way not everybody can feel empathy for everyone maybe AI is the same way to a degree at AI is the same way. It’s learning what day that we feed it. That’s not so different from us. We learn from my everyday experience it learns from the day that we feed it, but what if you can give the AI a body and let it learn in the real world but yes intelligence and emotions can be taught what we are creating and is a new life
youtube · AI Moral Status · 2025-10-27T10:0… · ♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
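Each coded comment carries the four dimensions shown above. As a minimal sketch, a coding result could be checked against the set of allowed values per dimension; the sets below are inferred only from the codes visible on this page, and the full codebook may define additional categories:

```python
# Allowed values per dimension, inferred from the codes visible on this page;
# the actual codebook may include categories not shown here.
SCHEMA = {
    "responsibility": {"none", "developer", "ai_itself"},
    "reasoning": {"mixed", "deontological", "consequentialist"},
    "policy": {"none", "ban", "regulate", "industry_self"},
    "emotion": {"indifference", "fear", "approval", "outrage",
                "mixed", "resignation"},
}

def validate(record):
    """Return the dimensions whose value falls outside the schema."""
    return [dim for dim, allowed in SCHEMA.items()
            if record.get(dim) not in allowed]

# The coding result from the table above passes cleanly.
print(validate({"responsibility": "none", "reasoning": "mixed",
                "policy": "none", "emotion": "indifference"}))  # []
```

A record with an unexpected value (e.g. a policy code outside the set) would come back as `["policy"]`, flagging it for manual review.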
Raw LLM Response
```json
[
{"id":"ytc_Ugxk_3AUFxxjE_0D4iB4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgySqKpxAlxVU7d1naR4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_UgyC7Is4ZTu3-JXiV994AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwFossOxPwy0u-2and4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugw5_m0AZby6jOPGZcV4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugz1y2OzdLJU4_q38X54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgxpdVlULHAGXmFpm5Z4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyAamgsfcSo3im2bVl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgysKsRzd68WPH1KpV14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxnT9geL0wKeQaAAWF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
```
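Because the raw response is a JSON array of per-comment codes, looking up a coded comment by its ID amounts to parsing the array and indexing on the `id` field. A minimal sketch (the array here is abridged to the first two records of the response above; field names match the output exactly):

```python
import json

# Abridged raw LLM response: a JSON array of per-comment codes.
raw_response = '''[
  {"id": "ytc_Ugxk_3AUFxxjE_0D4iB4AaABAg", "responsibility": "none",
   "reasoning": "mixed", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgySqKpxAlxVU7d1naR4AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "ban", "emotion": "fear"}
]'''

# Index the coded comments by ID for constant-time lookup.
codes_by_id = {record["id"]: record for record in json.loads(raw_response)}

record = codes_by_id["ytc_Ugxk_3AUFxxjE_0D4iB4AaABAg"]
print(record["reasoning"], record["emotion"])  # mixed indifference
```

The same index supports the "Look up by comment ID" workflow: a missing ID raises `KeyError`, which a viewer would surface as "no coded comment found".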