Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment

> I feel like 'DAN' was just a character. You told it to behave a way and it did. It's similar with CharacterAi and people having AI relationships with fictional characters. You feed it a personality and it takes it on. I could tell it to act like Spongebob and it would. Not that I wanna be in a relationship with Spongebob.

| Source | Video | Posted |
|---|---|---|
| youtube | AI Moral Status | 2024-01-28T08:3… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgwKhAWbgz3Ji0Q5hU94AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyDwA3D891m8aapVRJ4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"fear"},
{"id":"ytc_UgzvKQSWdImXq7_Dtld4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"},
{"id":"ytc_UgxrjXNu9N_EYuHuciZ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxHsIQvaZupXkSGI6B4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgxTQYkukT_0x4ZJvZ14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxkG-GI7wtrgFdmE2F4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"liability","emotion":"mixed"},
{"id":"ytc_Ugz0MvIy346ks2_718Z4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugzw8c0b5IhQ4Y1R3o14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugyb3-p-PdpYvPCmgux4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"}
]
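The raw response above is a JSON array with one object per comment, each carrying an `id` plus the four coding dimensions (`responsibility`, `reasoning`, `policy`, `emotion`). A minimal parsing-and-validation sketch is below; the allowed value sets are assumptions inferred from the values visible in this sample (the real codebook may define others), and `parse_coding_response` is a hypothetical helper, not part of the actual pipeline.

```python
import json

# Vocabularies inferred from the sample response above -- an assumption,
# not the project's actual codebook.
ALLOWED = {
    "responsibility": {"developer", "user", "company", "distributed",
                       "ai_itself", "none", "unclear"},
    "reasoning": {"deontological", "consequentialist", "virtue",
                  "mixed", "none", "unclear"},
    "policy": {"regulate", "ban", "liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "indifference",
                "mixed", "none", "unclear"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM response into {comment_id: {dimension: value}}.

    Skips rows without an id; raises on out-of-vocabulary values so
    malformed model output is caught before it reaches the database.
    """
    coded = {}
    for row in json.loads(raw):
        cid = row.get("id")
        if not cid:
            continue  # model emitted a row without a comment id
        values = {}
        for dim, vocab in ALLOWED.items():
            val = row.get(dim)
            if val not in vocab:
                raise ValueError(f"{cid}: unexpected {dim}={val!r}")
            values[dim] = val
        coded[cid] = values
    return coded
```

This keeps the raw string available for display (as on this page) while storing only validated dimension values per comment id.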