Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
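Besides the interactive lookup, the same raw record can be pulled up in code. The following is a minimal sketch, assuming the raw responses are exported as a JSON Lines file in which each line is one object keyed by `id`; the path `raw_llm_responses.jsonl` and the loader below are hypothetical, not part of this tool.

```python
import json
from pathlib import Path


def load_raw_responses(path: str) -> dict:
    """Index raw LLM coding records by comment ID (assumed one JSON object per line)."""
    index = {}
    with Path(path).open(encoding="utf-8") as fh:
        for line in fh:
            line = line.strip()
            if not line:
                continue
            record = json.loads(line)
            index[record["id"]] = record
    return index


# Hypothetical usage, with an ID taken from the raw response shown further down.
responses = load_raw_responses("raw_llm_responses.jsonl")
print(responses.get("ytr_UghOZvHHUFgJ13gCoAEC.8PKpyAJodvg8PKv-x5RGGb"))
```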
Random samples
| Comment ID | Preview |
|---|---|
| rdc_fvw4m82 | Just one point to note: >researchers conducted an analysis that covered all… |
| ytc_Ugye8P7l1… | I often feel disturbed by how people conduct themselves during AI conversations.… |
| ytc_UgyfB9d6w… | There is a fundamental flaw in AI staffing replacement. Get rid of employees ce… |
| ytc_UgyXGc4Gr… | Can’t threaten me with something that doesn’t exist. I use a different AI chara… |
| ytc_UgyjnQvhw… | I know I'm gonna' get a lot of flack for this one, but: AI does not steal art. I… |
| ytc_UgxziCgBd… | All I know is humanity will have collapsed long before humanity ever sees an AI … |
| ytc_UgzMAePL0… | Nah, sorry but if you take your code and break it down into chunks, feed it into… |
| ytc_UgyN_ASAw… | Doesn't matter anyway. The democrats almost got control of America with Republic… |
Comment
Jet Black I hate google's interface in replying to specific people lol. My reply was to the OP, as yours wasn't shown on my screen yet.
That being said, I do disagree that current robots have the potential for consciousness. I mean potential as in that entity will obtain it, and no toasters made today will ever obtain it. I am more talking about A.I. that requires a couple years of training data to really start understanding things, much like how babies mature or how unconscious people still retain sentience so long as they can wake up again.
A bit of a tangent, but the abortion argument parallels this aspect very closely. Once, if ever, that gets squared away, we'll have a good idea of where A.I. rights will start.
youtube · AI Moral Status · 2017-02-23T19:2… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
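A result like this is convenient to carry around as a small record type. The dataclass below is only an assumed sketch of that structure, not the project's actual code; the value lists in the comments are the ones observed on this page, not necessarily the full codebook. The example instance uses the record in the batch below whose values match this table.

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class CodingResult:
    """One coded comment, mirroring the dimensions in the table above."""
    comment_id: str
    responsibility: str  # values seen here: none, developer, user, ai_itself
    reasoning: str       # values seen here: none, consequentialist, deontological, mixed
    policy: str          # values seen here: none, liability, regulate
    emotion: str         # values seen here: indifference, fear, outrage, approval, mixed
    coded_at: Optional[str] = None  # ISO timestamp, e.g. "2026-04-27T06:24:59.937377"


# Example built from the Coding Result table above.
example = CodingResult(
    comment_id="ytr_UghOZvHHUFgJ13gCoAEC.8PKpyAJodvg8PKv-x5RGGb",
    responsibility="none",
    reasoning="mixed",
    policy="none",
    emotion="indifference",
    coded_at="2026-04-27T06:24:59.937377",
)
```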
Raw LLM Response
```json
[
{"id":"ytr_UggAeEyGOWLwAngCoAEC.8PKrFrKHVKP8PL23LM5COY","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytr_UghOZvHHUFgJ13gCoAEC.8PKpyAJodvg8PKv-x5RGGb","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytr_UghOZvHHUFgJ13gCoAEC.8PKpyAJodvg8PKwpTqLs6F","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytr_UgilbiiByK6t2XgCoAEC.8PKopNCybwI8PKsE8YAn-m","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytr_UgjSV_17zxqnKXgCoAEC.8PKo7A43RufAAt3SmiS0oL","responsibility":"none","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytr_Ugg3Gq3iNadmtngCoAEC.8PKnw7QktlE8PKyLS3rwGm","responsibility":"user","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytr_Ugjj8IHpZfnxuXgCoAEC.8PKnTtN5daI8PKxLoX1UPh","responsibility":"none","reasoning":"none","policy":"none","emotion":"approval"},
{"id":"ytr_Ugj55TnUfAY1t3gCoAEC.8PKnOLstUyW8PKuyC3D287","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytr_Ugj55TnUfAY1t3gCoAEC.8PKnOLstUyW8PKvOoweAW3","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytr_UggIk_cdpwuOu3gCoAEC.8PKnLp4C2128PKvHvqzlOi","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
```
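Because the model returns the whole batch as a single JSON array, downstream code typically parses it defensively before filling in the per-comment result shown above. The sketch below is a minimal illustration under that assumption; `parse_batch` and `REQUIRED_KEYS` are hypothetical names, and `raw_text` stands for the response string.

```python
import json

# Fields every coding record is expected to carry, per the batch format above.
REQUIRED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}


def parse_batch(raw_text: str) -> list:
    """Parse a raw LLM batch response and keep only complete coding records."""
    try:
        records = json.loads(raw_text)
    except json.JSONDecodeError as exc:
        raise ValueError(f"Model output is not valid JSON: {exc}") from exc
    if not isinstance(records, list):
        raise ValueError("Expected a JSON array of coding records")

    valid = []
    for rec in records:
        if isinstance(rec, dict) and REQUIRED_KEYS <= rec.keys():
            valid.append(rec)
    return valid


# Hypothetical usage: raw_text holds the JSON array shown above.
# results = parse_batch(raw_text)
# by_id = {r["id"]: r for r in results}
```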