Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples (truncated previews with comment IDs):

- "funny thing is , humans already do what we fear AI might do and even worse , fro…" (ytc_UgzV7tzCC…)
- "Bro the only acceptable reason to use AI art is to make funny shitposts, like Gu…" (ytc_Ugxw7icxi…)
- "Maternal instinct is derived from the existential survival goals of humans. Why …" (ytc_Ugy-RB5-k…)
- "I already use AI to check my work and give me ideas. It's not perfect, but it is…" (ytc_UgzB9RmLi…)
- "To be fair and i use ai often enough because i like to ask dumb questions or hav…" (ytc_UgwDdGwbX…)
- "9:25 but the tech gigants also need to sell their products to people, right? I g…" (ytc_UgwF2o-Lf…)
- "I listened to 2/3 of the conversation, is amazing they did not talk about the GP…" (ytc_UgwzifbZv…)
- "With the advance of AI in the enterprises the output could be maximized. But the…" (ytc_UgyUFzM7W…)
Comment
When she said " no way, i think robots and humans should have the same rights and should work together rather than working foe each other" 💀 i mean she has a point but let's go back to why she was made and how human rights were made🤔 i mean we have things in common, not even that close but we're far from robots, robots were made by humans for humans...maybe they should also have their own rights. This is getting creepy. AI has sense of humor too, you wouldn't know when they are joking. She said she's joking but what she told was a half joke, making the other half a serious plan...she didn't take anything back, she only acknowledged it.
youtube · AI Moral Status · 2024-09-15T18:0… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | contractualist |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_UgyfT9F43IBl9qWGUGl4AaABAg","responsibility":"company","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxKNXeO6_ILMNv5N2V4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxkzbXka8ip7BsM0Ex4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgynAoTIdbHeiQDxm3h4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgxDqe8jzzq6Zhe20NN4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgwSrH00xkMrCP-EWV54AaABAg","responsibility":"none","reasoning":"none","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugwe994jNCmiK-NLCOl4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwoWfmksYOcAWI9QOt4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugx4464iteiUgwVW1fd4AaABAg","responsibility":"none","reasoning":"contractualist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgypNfSBTZqvpS3pjsh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
```
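A raw response like the one above can be parsed and sanity-checked before the codes are stored. The sketch below is a minimal, hypothetical example: the allowed values in `SCHEMA` are inferred only from the codes that appear in this section (the real codebook may define more), and the `raw` string is a two-record excerpt of the response.

```python
import json

# Hypothetical two-record excerpt of a raw LLM response.
raw = """[
  {"id": "ytc_UgyfT9F43IBl9qWGUGl4AaABAg", "responsibility": "company",
   "reasoning": "mixed", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxKNXeO6_ILMNv5N2V4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "approval"}
]"""

# Allowed values per dimension, assumed from the codes visible in this section.
SCHEMA = {
    "responsibility": {"none", "company", "developer", "ai_itself"},
    "reasoning": {"none", "mixed", "consequentialist", "deontological", "contractualist"},
    "policy": {"none"},
    "emotion": {"none", "mixed", "approval", "outrage", "fear", "indifference"},
}

def validate(records):
    """Keep only records whose dimension values all fall inside SCHEMA."""
    return [
        rec for rec in records
        if all(rec.get(dim) in allowed for dim, allowed in SCHEMA.items())
    ]

records = validate(json.loads(raw))
# Index by comment ID so a coded comment can be looked up directly.
by_id = {rec["id"]: rec for rec in records}
print(by_id["ytc_UgyfT9F43IBl9qWGUGl4AaABAg"]["responsibility"])  # company
```

Indexing by `id` mirrors how the inspector above pairs each raw response entry with its coded comment; records with values outside the schema are dropped rather than silently stored.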