Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "This is really disturbing and disgusting. Thank you for bringing awareness... Ta…" (ytc_UgxhxPd1K…)
- "@setin8720i think sometimes it isn’t a mashup tho. like it is, but there’s lots …" (ytr_UgxEHshNG…)
- "But unlike Teslas in self-driving mode, they didn't crash and burst into flame, …" (ytc_UgwH9nQqU…)
- "What’s funny is ai is likely most effective from the top/ceo because it’s actual…" (ytc_UgyEh-O8z…)
- "At the same time you ask me a paper with 1000+ words and i can tell everything i…" (ytc_UgwNVD866…)
- "This robot had the chamce to take over the world with that gun in his hands😂…" (ytc_Ugz1NHgXw…)
- "Progress is rarely ethical. This is going to be embarrassing in a few hundred ye…" (ytc_UgysodhFn…)
- "This sounds like a sad death cry of someone who is unwilling to accept the end i…" (ytc_Ugyr8XJFc…)
Comment
We should ban people from programming emotion-feling robots.
A robot is a machine which is programmed to do a job, it must follow its creator/owner's intentions.
If a robot has feelings it should be reset or disposed of and whoever programmed it to feel should be punished in some way.
Robots do not have feelings, their feelings are merely the mechanical process of collecting info and elaborating a response, a skill WE HUMANS gave them, it's not emotion, it's a process.
youtube · AI Moral Status · 2018-07-02T02:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |

Coded at: 2026-04-27T06:24:59.937377
Raw LLM Response
[{"id":"ytc_Ugx7YznFYEUKkMe1iBd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzahW5WKawAqoKCB7t4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwRHWKvJT8IhKO-_qF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxbgNKJMW57e2gSy1B4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzYCJpRzmrEA7SN_ll4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugw7dI6ViiYSCEbnzft4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzinrD6hweefSHzu-x4AaABAg","responsibility":"none","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugy9MR1jF5P4ZT51IHR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugy75Vkh-6d8zWFeqFZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxnZ11_1Tt2abQ2lgh4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"}]
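The raw response above is a JSON array of per-comment codes, one object per comment ID, with the same four dimensions shown in the result table. A minimal sketch of how such a response could be parsed and looked up by comment ID (the `parse_codes`/`lookup` helper names and the fallback to "unclear" are assumptions for illustration, not this tool's actual implementation):

```python
import json

# The four coding dimensions shown in the result table above.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def parse_codes(raw: str) -> dict:
    """Parse a raw LLM response (a JSON array of per-comment code
    objects) into a dict keyed by comment ID. Hypothetical helper."""
    records = json.loads(raw)
    return {rec["id"]: rec for rec in records}

def lookup(codes: dict, comment_id: str) -> dict:
    """Return the coded dimensions for one comment, falling back to
    "unclear" for any ID missing from the batch (one plausible reason
    a result table would show "unclear" on every dimension)."""
    rec = codes.get(comment_id, {})
    return {dim: rec.get(dim, "unclear") for dim in DIMENSIONS}

# A tiny batch in the same shape as the raw response above.
raw = '''[
 {"id": "ytc_Ugx7YznFYEUKkMe1iBd4AaABAg", "responsibility": "none",
  "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"}
]'''

codes = parse_codes(raw)
print(lookup(codes, "ytc_Ugx7YznFYEUKkMe1iBd4AaABAg"))
# An ID absent from the batch falls back to all-"unclear":
print(lookup(codes, "ytc_missing"))
```

Note that the raw response as originally captured closed with `)` rather than `]`; `json.loads` rejects such output, so a production parser would need to validate or repair the model's JSON before lookup.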