Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- ytc_Ugy86Y_9C…: "Haha because it uses the guidelines set by your guiding principles? Ppl are fun…"
- ytc_UgxGDy1tY…: "This is so insulting. There is no such thing as thinking machines. There is no…"
- ytr_UgzGqNeYT…: "I agree with your conclusion, but for a different reason. Biological life is as …"
- rdc_n0safju: "I have to talk to a chatbot to apply for a job now? BURN IT ALL DOWN.…"
- ytc_UgzGXkygl…: "Aha,, ha,, ha,, ha, look at the feet of the robot it's not touching the ground.…"
- ytc_Ugy_X3EKV…: "*OK... I HAVE BEEN SAYING THIS FOR A LONG TIME TO MY FAMILY AND FRIENDS... HUMAN…"
- rdc_gx6xu14: "*China’s per capita emissions in 2019 also reached 10.1 tons, nearly tripling ov…"
- ytc_UgwSSBWsx…: "The monster of AI is the energy expenditure, its economical impact, the fact tha…"
Comment
"If a robot becomes self aware, and was close to human, I personally think we should give it rights. But we still need to be careful, because if we get to that point, we need to acknowledge the fact that the AI will probably be much stronger and smarter than us."
Platform: youtube | Topic: AI Moral Status | Posted: 2020-12-01T18:1… | ♥ 4
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | mixed |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_Ugx-nFWyiUy9ouWq90l4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgxxFgWyTVsMvWaYufB4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugy8MhKdh7yhSELlMgV4AaABAg","responsibility":"none","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgwaycaKcv0pFQz5JVl4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgzZGfM2kCpfpzs2t3J4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyhXbB9hO43-9wA0ed4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgxaDRqykz1UJgDXCqV4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwZHW5NyvTqzkFlbPR4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugy1aJjcOly2eQPKXat4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyxmnkfgEH85wywmol4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"fear"}
]
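The "look up by comment ID" view above amounts to parsing the raw JSON array returned by the model and indexing its rows by the `id` field, after checking that each row carries all four coding dimensions. A minimal Python sketch of that step, assuming the response is a plain JSON array like the one shown (the `index_codings` helper and the truncated two-row `raw_response` are illustrative, not part of the actual tool):

```python
import json

# Illustrative raw model output, truncated to two rows from the array above.
raw_response = '''
[
  {"id": "ytc_Ugx-nFWyiUy9ouWq90l4AaABAg", "responsibility": "ai_itself",
   "reasoning": "deontological", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgxxFgWyTVsMvWaYufB4AaABAg", "responsibility": "distributed",
   "reasoning": "mixed", "policy": "liability", "emotion": "fear"}
]
'''

# Every coding row must carry the comment ID plus the four dimensions.
EXPECTED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def index_codings(raw: str) -> dict:
    """Parse a raw LLM coding response and index the rows by comment ID."""
    rows = json.loads(raw)
    by_id = {}
    for row in rows:
        # Guard against malformed model output: refuse rows missing a dimension.
        missing = EXPECTED_KEYS - row.keys()
        if missing:
            raise ValueError(f"row {row.get('id', '?')} is missing {missing}")
        by_id[row["id"]] = {k: row[k] for k in EXPECTED_KEYS if k != "id"}
    return by_id

codings = index_codings(raw_response)
print(codings["ytc_UgxxFgWyTVsMvWaYufB4AaABAg"]["emotion"])  # fear
```

In a real pipeline the validation step matters more than the lookup: LLMs occasionally drop a field or emit an unexpected label, and failing loudly on parse keeps bad rows out of the coded dataset.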