Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a response by comment ID.
Random samples (click to inspect):

| Comment ID | Preview |
|---|---|
| ytc_UgwDrP_0u… | The idea if autonomous weapons is horrific, I know enough about vision systems a… |
| ytr_Ugzm9P9Zx… | @CoryClarkPhotography sounds like a fun job! 😀 Are these updates will be applied… |
| ytc_Ugzb1qO4T… | sophia isnt a true ai it was a pre recorded words and facial and hand movements… |
| ytc_UgzG_Ozh3… | Why make them look like a poster girl? Men take women's beauty and make it look … |
| ytc_UgyxFo0UY… | In today's digital era, ensuring security and efficiency is critical across vari… |
| ytc_UgxhF7R_n… | How is AI going to "take control"? Until someone can sufficiently answer that qu… |
| ytc_UgzrLFh4Z… | The way he decribed that was like we are talking to an animal that can speak our… |
| ytc_UgzCq0S2t… | As a voice actor & editor... AI is slowly replacing us. Even though the Art & Wr… |
Comment

> If Robots developed Consciousness then goody goody. If the robot is worthy for the same rights as Humans have great. The question to ask yourself is: Why was the Robot built in the First Place. Robots are tools for Humans to use first and foremost. We as Humans are not going to let a Robot out maneuver us in any way. To do so will invite the destruction of Humanity. There has to be a Symbiotic Relationship with the future Robots that will be as smart as Us by 2050. I know Society like to divide and conquer but I plead to you my fellow Humans to find this middle ground.

| Field | Value |
|---|---|
| Platform | youtube |
| Video | AI Moral Status |
| Posted | 2021-01-31T08:3… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugyc7iVFUnlvOXBGU6l4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugx7HbBXeenVQQCcub94AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxeA7-O_aOfbcsmgP14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzRs2CZ8ylrG3el-SB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzV4RRzZTZ5CQnumu54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugwbt_rTc6HqdFD4FDx4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgyNLABXwwMCSejWi-94AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugyi6JkMBrfnY-LLLa14AaABAg","responsibility":"none","reasoning":"deontological","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgxLDWG0S8en94uHLQZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugw85IDHzW72BMbuInx4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
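The lookup-by-comment-ID view above can be sketched in Python: parse the raw LLM response (a JSON array of per-comment codings, as in the batch shown) and index it by the `id` field. This is a minimal sketch, assuming the response is valid JSON with the field names shown in the coding-result table; the two sample rows are copied from the batch above.

```python
import json

# Raw LLM response: a JSON array of per-comment codings
# (two rows copied from the batch shown above).
raw_response = """
[
  {"id": "ytc_UgzV4RRzZTZ5CQnumu54AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_Ugw85IDHzW72BMbuInx4AaABAg", "responsibility": "user",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]
"""

def index_by_comment_id(response_text: str) -> dict:
    """Parse a raw LLM response and index each coding row by its comment ID."""
    codings = json.loads(response_text)
    return {row["id"]: row for row in codings}

lookup = index_by_comment_id(raw_response)
coding = lookup["ytc_Ugw85IDHzW72BMbuInx4AaABAg"]
print(coding["policy"])   # → regulate
print(coding["emotion"])  # → fear
```

In practice the dictionary would be built once per batch, so repeated lookups (like the inspector's "look up by comment ID" box) cost O(1) each.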