Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect:

- "Is any of this taking into account who wields the AI systems' loyalty? It is al…" (ytc_UgxQf95JE…)
- "The world's moving forward AI will effectively impact all workers in allllllllll…" (ytc_UgxKTEsNN…)
- ""God" is a concept that holds different meanings across cultures and religions. …" (ytr_UgyU-X1Gs…)
- "I have tried CGPT, Claude and Gemini for coding. I find CGPT is better at its to…" (ytc_UgwOOPEU8…)
- "I just don't get it. What is the purpose of making artificial people? This is …" (ytc_Ugz68xXoE…)
- "How about attending a national alliance on mental illness meeting and you'll see…" (ytc_UgzUNl8BE…)
- "Very well said in your arguments that mankind should and must go forward with ad…" (ytc_Ugg3o3NV1…)
- "As Elon stated there most likely will be a tax on automation and a UBI to make u…" (ytc_UgzdDPNG-…)
Comment

> Robots or A.I. in the future probably wont demand its rights it would probably just take it like humans do. And since probably everything we humans do in an advanced civilization would be somehow tied to A.I. to live there would be little we could do to stop it. After all we either give in or go back to living like we did in the 1800's as an alternative.

youtube · AI Moral Status · 2017-02-25T14:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_Ugh_lhwycoYkl3gCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UggdcRoBuxL9z3gCoAEC","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_Uggf3fSG7XhI2ngCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UggNUjIVjFsz73gCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UggzsyIZT_K1A3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugh32Vghx0DeXHgCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_Ugh7NSvs2yysSXgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"indifference"},
{"id":"ytc_Ugiv_A9GlQMe1XgCoAEC","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"fear"},
{"id":"ytc_Ugh4RafM8_B_dXgCoAEC","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugjj2FpuF6iXDngCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
```
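Raw model output like the batch above should be checked against the codebook before the rows are stored, since an LLM can emit labels outside the schema. Below is a minimal validation sketch, assuming the allowed values are exactly those seen in this sample (`none`/`developer`/`ai_itself`, `deontological`/`consequentialist`/`mixed`, `none`/`regulate`/`ban`, and the four emotions); the real codebook may define more categories, and the `ALLOWED` mapping and `validate_rows` helper are illustrative names, not part of the tool.

```python
import json

# Allowed values inferred from this sample batch; the full codebook
# likely defines additional categories (assumption, not the tool's spec).
ALLOWED = {
    "responsibility": {"none", "developer", "ai_itself"},
    "reasoning": {"deontological", "consequentialist", "mixed"},
    "policy": {"none", "regulate", "ban"},
    "emotion": {"indifference", "approval", "fear", "resignation"},
}

def validate_rows(raw: str) -> list[str]:
    """Parse a raw LLM response and report one error per off-schema value."""
    errors = []
    for row in json.loads(raw):
        rid = row.get("id", "<missing id>")
        for dim, allowed in ALLOWED.items():
            value = row.get(dim)
            if value not in allowed:
                errors.append(f"{rid}: {dim}={value!r} not in codebook")
    return errors

raw = ('[{"id":"ytc_example","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"none","emotion":"resignation"}]')
print(validate_rows(raw))  # → []
```

Rows that fail validation can then be routed back for re-coding rather than silently written with an unknown label.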