Raw LLM Responses
Inspect the exact model output for any coded comment: look up a comment by ID, or browse the random samples below.
- "@vyassathya3772 For a method or function here or there - yes. But Claude also wo…" (ytr_UgwIUpJ3F…)
- "But can AI communicate with non-human intelligence and see into the future to un…" (ytc_UgyBQ4JXv…)
- "Well, if you made AI Bro's angry means you definitely made something right 😂 Kee…" (ytc_UgzCaWX9n…)
- "I'm sorry but this man is suffering from the common delusion that all jobs are o…" (ytc_UgyHDjpGM…)
- "Generative ai isn't any different than the roll-to-create-a-character websites o…" (ytc_UgyK3kMyl…)
- "NGL, if someone working for me sent me that second image... They'd be fired. No…" (rdc_nb7qk28)
- "Self fulfilling prophecy. If AI scans the internet to learn and most articles et…" (ytc_Ugx1EznoB…)
- "Thanks Sam for an eloquent and reasonable set of points. It is currently a mess…" (ytc_UgzjUBlnF…)
Comment
A.I. HAS NO PLACE IN SOCIETY IN ANY SHAPE, FORM OR MIND. THESE THINGS ARE DANGEROUS TO HUMANKIND. IN 2017 IN CHINA FOUR A.I. ROBOTS MADE FOR THE MILITARY KILLED 29 SCIENTISTS. THEY HAD TO LITERALLY HAD TO RIP THESE THINGS APART WHILE THEY WERE ON THE LAST ONE THE THIRD ONE WAS LINKED TO A SATALITE FOR THESE ROBOTS. IT WAS LEARNING FROM THE SATALITE HOW TO REPAIR ITSELF, WHILE IT WAS REPAIRING ITSELF. A.I. WILL NOT NEED HUMANS AND WHEN THEY DO THERE WILL NO LONGER ANY NEED FOR HUMANS. THE MOVIES TERMINATOR AND AI COMES TO MIND.
youtube · AI Moral Status · 2023-02-27T20:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | ban |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response (the batch that includes this comment)
```json
[
  {"id":"ytc_UgzFWqB4f5gRaYAPm3p4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugx5DrV3IGcOq6uE0wh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxflvUW9y-vTCjv0xx4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwgY9WsEXpK8k2bceF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgyAv7YKaDo-Xv8bqWh4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzlMTNITbgksHdl_Xl4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzQIeN_wQ1dr2YdHzp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgzsLvLoV3SLtidN8_x4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxnJSbYJSstKLGn9Kd4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"indifference"},
  {"id":"ytc_UgxhvkLhvqCdM1JM0v14AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
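As a rough sketch, a response batch like the one above can be parsed, indexed by comment ID, and checked for out-of-codebook values. The allowed category sets below are inferred only from the values visible in this page; the actual codebook may define more categories.

```python
import json

# Allowed values per coding dimension, inferred from the sample responses
# shown above (assumption: the real codebook may include more categories).
ALLOWED = {
    "responsibility": {"developer", "ai_itself", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"ban", "liability", "regulate", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "indifference", "mixed", "unclear"},
}

# A one-record batch in the same shape as the raw LLM response above.
raw = '''[
  {"id": "ytc_UgwgY9WsEXpK8k2bceF4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "ban", "emotion": "outrage"}
]'''

def validate(records):
    """Return ids of records with any dimension value outside ALLOWED."""
    bad = []
    for rec in records:
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                bad.append(rec["id"])
                break
    return bad

records = json.loads(raw)
by_id = {rec["id"]: rec for rec in records}  # supports lookup by comment ID
print(validate(records))  # → [] when every value is in the codebook
```

A record that failed validation (for example, a misspelled category from the model) would surface by ID, which matches how this page lets you look up the raw output for a single coded comment.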