Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "Structural unsound for a few CEO/elites to control tech companies, but that is w…" (ytc_UgxHQsh4K…)
- "Artificial intelligence is the future, not only for Russia, but for all humankin…" (ytc_UgyjSDeWz…)
- ">AI looks good on paper to executives, This right here is the crux of all of…" (rdc_l9x28g1)
- "Some of those responses don't seem like real responses, they seem like common th…" (ytc_UgxWABXiX…)
- "I been trying to study Samuel Altman very closely and there is something off abo…" (ytc_Ugz4r0xUv…)
- "not because of AI, it's an excuse to fire high salaries and contract cheap immig…" (ytc_Ugzi6nNke…)
- "Hahaha that AI is definitely smarter than you. May not be sentient but can sure …" (ytc_UgyCj1otY…)
- "Dissenting opinion: generating AI images, even without any intent to make profit…" (ytc_Ugy_aoFpV…)
Comment
Let's just refrain from making these robots so we don't face the problem of them being conscious and uprising when they don't have rights OR having rights and becoming more powerful than us (since they can outlive us) ... I also don't understand why so many people are pro-robo rights.... they're consciousness is artificial. They would be replicating emotions we introduce to them... you can remove consciousness from a robot and it will still work it's functions... you can't REMOVE it from a human. And unconsciousness affects our functions...
Source: youtube · Topic: AI Moral Status · Posted: 2017-02-25T13:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | ban |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_UghKoK55MKjPi3gCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_UgjLk4dwj6E7c3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgifpkGSnco6Q3gCoAEC","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugh3oGA9UWKfbXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ughlf5QYg265ZngCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugjui8lyYzSrvHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgiuXt-nUv5jbngCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UggvBcByL6n803gCoAEC","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"approval"},
  {"id":"ytc_UgjWed3DMfpEnHgCoAEC","responsibility":"none","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UghAqkiQfyzw4HgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"outrage"}
]
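
A raw batch like the one above can be checked for well-formedness before its codes are stored. Below is a minimal validation sketch in Python; the allowed-value sets are assembled only from values visible in this sample output, not from the full codebook, so treat them as assumptions.

```python
import json

# Allowed values per coding dimension, inferred from the sample batch above.
# The real codebook may define additional categories.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "company", "developer"},
    "reasoning": {"deontological", "consequentialist", "mixed", "unclear"},
    "policy": {"none", "ban", "regulate", "liability"},
    "emotion": {"approval", "fear", "indifference", "resignation", "mixed", "outrage"},
}

def validate_batch(raw: str) -> list:
    """Parse a raw LLM response and reject rows with missing IDs or unknown codes."""
    rows = json.loads(raw)
    for row in rows:
        if "id" not in row:
            raise ValueError(f"row missing id: {row}")
        for dim, allowed in ALLOWED.items():
            value = row.get(dim)
            if value not in allowed:
                raise ValueError(f"{row['id']}: bad {dim}={value!r}")
    return rows

# Hypothetical one-row batch for illustration.
sample = '[{"id":"ytc_x","responsibility":"none","reasoning":"mixed","policy":"ban","emotion":"fear"}]'
print(len(validate_batch(sample)))  # → 1
```

Rejecting the whole batch on a single bad row is a deliberate choice here: coded dimensions feed downstream counts, so a silently dropped or miscoded row would skew them.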