Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "Artist getting inspiration from other artist: Fine. Artist when AI get inspira…" (ytc_Ugw6vifel…)
- "For me AI is too perfect it looks fake, Human art with all it's imperfections is…" (ytc_UgzpBglCY…)
- "The thing is, the AI you'll use today is the worst AI there is. It might not be …" (ytc_Ugz54guo-…)
- "What will be the purpose of AI? Our motivation are very clear, food, sex, wealt…" (ytc_Ugz8QozSQ…)
- "I think AI is the biggest thing most people are uneducated about. AI alongside r…" (ytc_Ugyb0FaUT…)
- "How is it ChatGPT's fault? Or the fault of OpenAI's founders? Your son didn't fe…" (ytc_Ugx81yxSK…)
- "It was over when they chose the kill the cat. Ai still has much to learn.…" (ytc_UgxqYvQYB…)
- "I love Google but WAYMO is WAYMO expensive and geofenced - still good though but…" (ytr_UgyhRbTAe…)
Comment
Asimov Laws: aka 3 laws of robotics
1 A robot may not injure a human being or, through inaction, allow a human being to come to harm.
2 A robot must obey the orders given it by human beings except where such orders would conflict with the First Law.
3 A robot must protect its own existence as long as such protection does not conflict with the First or Second Laws.
If a robot producer cannot absolutely guarantee that these laws are 100% built in the very core of his robots,
the law enforcement institutions should prohibit him creating robots.
This is the interest of the entire human race.
I understand that the development of this robot is in a premature phase, but what we heard here is very scaring,
and it gives a very bad reputation for the producer.
Unfortunately human nature is a factor here, so we may prepare to see robots actually made to destroy humans.
youtube · AI Moral Status · 2016-05-03T16:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | unclear |
| Coded at | 2026-04-27T06:24:53.388235 |
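Each comment is coded on four dimensions. A minimal validation sketch for such a record — the allowed value sets below are inferred from the raw responses in this dump, not taken from the tool's actual codebook, which may define more categories:

```python
# Category values observed in the raw LLM responses in this dump;
# the real codebook may allow more (assumption, not the tool's schema).
CODEBOOK = {
    "responsibility": {"developer", "company", "user", "ai_itself", "none"},
    "reasoning": {"deontological", "consequentialist", "unclear"},
    "policy": {"regulate", "ban", "liability", "none", "unclear"},
    "emotion": {"unclear", "outrage", "fear", "approval", "mixed"},
}

def validate(coding: dict) -> list:
    """Return the dimension names whose value is not in the codebook."""
    return [dim for dim, allowed in CODEBOOK.items()
            if coding.get(dim) not in allowed]

# The coding result shown above passes validation.
coding = {"responsibility": "developer", "reasoning": "deontological",
          "policy": "regulate", "emotion": "unclear"}
print(validate(coding))  # -> [] (all four dimensions valid)
```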
Raw LLM Response
[
{"id":"ytc_UgiWMHgExCirhXgCoAEC","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugj49_6vw7yjgngCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UghCAFOXmiIjeHgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"unclear"},
{"id":"ytc_UgjJxkMDZZq-_XgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugiaitc4_ZrS33gCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugheq5RGy67-sXgCoAEC","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UggcfE3mEG2EbXgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UghhIAZwlw_3AXgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugg6JbEau3pZD3gCoAEC","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"unclear"},
{"id":"ytc_UgiwagGOMnP-vXgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"}
]
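The coding result shown above is a single record from a raw batch response like this one. A minimal sketch of how such a response could be parsed and indexed for the ID lookup — field names match the JSON above, but the function itself is a hypothetical illustration, not the tool's actual code:

```python
import json

# Excerpt of a raw batch response as returned by the model (two of the
# ten records shown above).
raw_response = '''
[
 {"id": "ytc_UghCAFOXmiIjeHgCoAEC", "responsibility": "developer",
  "reasoning": "deontological", "policy": "regulate", "emotion": "unclear"},
 {"id": "ytc_Ugheq5RGy67-sXgCoAEC", "responsibility": "user",
  "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"}
]
'''

def index_codings(raw: str) -> dict:
    """Parse a raw JSON batch and index each coding by its comment ID."""
    return {row["id"]: row for row in json.loads(raw)}

codings = index_codings(raw_response)
print(codings["ytc_UghCAFOXmiIjeHgCoAEC"]["policy"])  # -> regulate
```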