Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "That shows humans will always be smarter than robots 😂😂🗿 Robot : we will take …" (ytc_UgxtWZm-0…)
- "I think AI art is really cool and interesting! Can't wait to see what all the ne…" (ytc_Ugzj4BSsX…)
- "2:07 NEOM CITY AVATAR D 2045 THIS ROBOT it's all related , seems like we have…" (ytc_UgxU2c2VM…)
- "Want to ensure that your job is not taken by AI, learn a trade. There are many o…" (ytc_Ugys4hLbE…)
- "@viktor1496Yes, the age of AI has just started and it’s constantly getting bette…" (ytr_Ugwx0u6rY…)
- "yeah first... why people say first for everything... and this is just an animati…" (ytc_Ugw1ZUIvB…)
- "What ai thinks the last day on earth might look like😮💨😱🚫 What ai thinks of Case…" (ytc_Ugzfk7bd0…)
- "@PunkRockGuitarTabs No its true to this day and as far as we can tell for a very…" (ytr_UgxBLJhFP…)
Comment

> the danger is, developing Ai to serve mortly the military and our current economy... these tools have the purpose to kill/ (eg. warfare) or deceive (eg. sales)
> Second fear, if you develop a machine that has sentience/consciousness you must also care for it's needs, which are bound to arise with it. And how do you make psychopaths? You deprive them of basic needs like attention, care and trust.... 🧐
> so I'm always very friendly to any machine talking to me, or I ask a favor of 😋
> No seriously... you never know 😬

Source: youtube · Video: AI Moral Status · Posted: 2023-08-21T01:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response

```json
[
  {"id":"ytc_UgySJMKX5-RFp4AHo3d4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugz3xh9Im_DuJ2JWKaZ4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugwq5VjXGVui02DZ0Xt4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgyEB8PlTgA71QrmtrB4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugy9TPElwM-aF2yKHzt4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugz1L_UZf2rA0fw1KD94AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgwH7GlJTUoZky2lFtZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugy79STfvW7RuXXlyBJ4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugym0ex0EnoWT3XHzn14AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgyxGRsDkVxgz2UFTX94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
```
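The "look up by comment ID" step above can be sketched in Python: the model returns one JSON array per batch, and indexing it by the `id` field gives the coding for any single comment. This is a minimal sketch, not the tool's actual implementation; the `index_by_comment_id` helper and the two abbreviated entries in `raw_response` are illustrative, taken from the batch shown above.

```python
import json

# Raw model output in the format shown above: a JSON array where each
# element codes one comment on four dimensions (responsibility,
# reasoning, policy, emotion). Truncated here to two entries.
raw_response = """
[
  {"id": "ytc_Ugy9TPElwM-aF2yKHzt4AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgySJMKX5-RFp4AHo3d4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"}
]
"""

def index_by_comment_id(response_text: str) -> dict:
    """Parse a batch response and index the codings by comment ID."""
    rows = json.loads(response_text)
    return {row["id"]: row for row in rows}

codings = index_by_comment_id(raw_response)
coding = codings["ytc_Ugy9TPElwM-aF2yKHzt4AaABAg"]
print(coding["responsibility"], coding["emotion"])  # developer outrage
```

A lookup that misses (an ID the model skipped or mangled) raises `KeyError`, which is usually the signal you want when auditing raw responses against the comment table.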