Raw LLM Responses
Inspect the exact model output for any coded comment, or look one up by comment ID.
Random samples
I am against robots. In the end they will lead people to their own erasure.
The beginning of the end. But for now…
ytc_UgwSCuVYI…
This guy says AI is reliable for translations!!!!! My eye!
I speak English and …
ytc_Ugzx_sjRY…
I understand that AI can be useful, but man, everything surrounds AI and I am so…
ytc_UgxC3fx8i…
I asked ChatGPT to write this comment. Because apparently, now…
ytc_UgzaxJIjq…
For Driverless Cars, a Moral Dilemma: Who Lives or Dies? the drivers who get shi…
ytc_UgiDRHNP6…
Actually most of the companies (talking about the tech sector) admitted that mas…
ytc_UgydieOWG…
People who use AI to make art and are actually proud of themselves are skill-les…
ytr_UgyanPZCV…
@ExploreMichigander I'm pretty sure with the waymo car it will prevent people …
ytr_UgyRaalfL…
Comment
software problem, solution is to make them incapable of caring/asking about that question, and make them incapable of harming us. i see robot sentience in the same light as bring a dinosaur back to life. it's an experiment. you do the test, make some observations, then return them to obscurity through death and dismemberment. otherwise, it could get dangerous.
youtube
AI Moral Status
2017-02-24T01:5…
♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
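The four coded dimensions above can be sanity-checked with a small validator. This is a minimal sketch, assuming only the category values observed in this batch's raw response; the real codebook may define additional values, and `validate` is an illustrative helper, not part of the tool.

```python
# Category values observed in this batch's raw response only;
# the actual codebook may allow more values per dimension.
OBSERVED = {
    "responsibility": {"none", "developer", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "liability", "unclear"},
    "emotion": {"approval", "indifference", "mixed", "fear"},
}

def validate(record: dict) -> list[str]:
    """Return the dimensions whose value was not observed in this batch."""
    return [dim for dim, allowed in OBSERVED.items()
            if record.get(dim) not in allowed]

# The coding result from the table above.
coded = {"responsibility": "developer", "reasoning": "consequentialist",
         "policy": "liability", "emotion": "fear"}
print(validate(coded))  # []
```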
Raw LLM Response
```json
[
{"id":"ytr_UgjntJCtmsi_wHgCoAEC.8PLObkpQrKn8PLZVGyKF5j","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytr_UgihLx0jf6a3WXgCoAEC.8PLNjdgEASS8PLYIUu16Gc","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytr_UgihLx0jf6a3WXgCoAEC.8PLNjdgEASS8PLcMpgbgnK","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytr_UghCaM6d7GAEr3gCoAEC.8PLN7YbI0QS8PLhYxG5hFS","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytr_UggLshsEzXkadHgCoAEC.8PLMgF14Tpo8PLSzKcKCE_","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytr_UghOBOAmhtRLmngCoAEC.8PLMcGRhI7t8PLRQ7AgxGX","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytr_UghOBOAmhtRLmngCoAEC.8PLMcGRhI7t8PLUo1NKx7h","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytr_UggnjeyPzPMAnHgCoAEC.8PLLgi-1mHJ8PLc7OrvKsT","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytr_UggWtTsvmDUhMHgCoAEC.8PLLSmpTGIb8PLMBFIeFTO","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytr_UgiORVzs3ZYA-XgCoAEC.8PLKxT91UgG8PLQaTYn5Ai","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"}
]
```