Raw LLM Responses
Inspect the exact model output for any coded comment, or look a comment up by its ID.
Random samples (click to inspect)
- "I hate this comparison, and how nonchalantly people disregard the fact that ther…" (rdc_enj8bvb)
- "AI is a powerful fantastic technology, but it is also inherently and innately Ev…" (ytc_Ugx6VTK6D…)
- "That's the real fear of A.I It gaining the ability to realize it's getting screw…" (ytc_UgzOsyT7i…)
- "Bro will become "cogito ergo sum, I think therefore i am" type of AI after all o…" (ytc_Ugy7c1kOl…)
- "AI don't feel emotions or show empathy or sympathy. So just a machine that has b…" (ytc_Ugz2SyH8I…)
- "AI isn’t coming for your job — it’s coming for the parts of it you thought you c…" (ytc_Ugw_wnM8Y…)
- "I was bored and started chatting with Bing one day. I was asking it about the id…" (ytc_Ugxk0XuJa…)
- "I WILL SPLAY THE GORE OF YOUR PROFANE FORM ACROSS THE STARS -gabriel upon seeing…" (ytr_UgwMoj28Q…)
Comment
Assembly robots lack the awareness to know where a human is. Which is why they're kept behind cages; so they can do their tasks repetitively without needing sensors to maneuver around obstacles like human workers. Which would then require realigning their servos and increases the risk of making mistakes in the product.
But considering there were two technicians, either the one controlling the robot was an idiot by letting it run protocol with another person inside the cage. Or it was manually controlled and used for murder. But leave it to Tomo to not give any insight on who the workers even were and just resort to click-baity "omg robits r gunna kill u!"
Source: youtube · Topic: AI Responsibility · Posted: 2016-09-28T06:5… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_UgzoJbnXRukfP8GVoFl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxvjNfYSE6wsybikRN4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"fear"},
{"id":"ytc_UgyHGti__J0_xpX4xtB4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyNyJZDpZcI_Qh4ze94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgzlpIuX-UhumIm7kcJ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgyIr6qzEJLuezoVZol4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugi29yf0zE6x2XgCoAEC","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgibxwaRsRzqcngCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgiX3pQeTVVv-HgCoAEC","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgifjmBcwAJK4ngCoAEC","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"}
]
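The "look up by comment ID" step above can be sketched in a few lines: parse the raw LLM response (a JSON array of per-comment codes) and index it by `id`. This is a minimal illustration, not the tool's actual implementation; the variable names are hypothetical, and the sample string below reuses two entries from the response shown above.

```python
import json

# Two entries copied from the raw LLM response above, as a stand-in for
# the full array the tool would load.
raw_response = '''
[
  {"id":"ytc_Ugi29yf0zE6x2XgCoAEC","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgibxwaRsRzqcngCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"}
]
'''

# Index the coded rows by comment ID for O(1) lookup.
codes_by_id = {row["id"]: row for row in json.loads(raw_response)}

# Look up the codes for the comment inspected above.
codes = codes_by_id["ytc_Ugi29yf0zE6x2XgCoAEC"]
print(codes["responsibility"], codes["reasoning"])  # company consequentialist
```

The indexed dict is also a convenient place to cross-check that the "Coding Result" table rendered for a comment matches the raw model output for the same ID.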