Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "I’m in advertising, specifically digital. I tell my clients that there are 3 thi…" (rdc_eejhohu)
- "Sorry but being an ai artist isn't being an artist but instead they are thieves.…" (ytc_Ugx1FN981…)
- "First AI today is not intelligent. It’s just data crunching. Proof ? AI is incap…" (ytc_UgxwP6Nto…)
- "Ashton Kutcher was likely trying to see if there was deepfakes of his wife Mila …" (ytr_UgypVdvzI…)
- "it's truly interesting and very informing. AS A DUMMY, I SEE THE IRONY OF AI DAT…" (ytc_UgwVfQ0Tn…)
- "If IA keeps us around wouldn't it keep us happy. I never understand why people d…" (ytc_Ugw3_3bz8…)
- "First thing AI will want to do is perfect space travel so it can leave earth and…" (ytc_Ugy-Iefv0…)
- "Not gonna lie, while I agree on how obnoxious AI is. And how it's not even real …" (ytc_UgwOSruey…)
Comment
Humans receive input (stimuli) and this is converted to output (how we respond). Only us as humans know what it feels like during the process of receiving stimuli like pain and joy. An advanced self-learning robot might appear to respond to pain like we do but that would just be its program. Depending on how its AI was made, it was designed to mimic its surroundings. The robots talked about in this video are designed to act human or similar to what we consider alive. The process during their input-to-output conversion will never truly be understood by us until we can understand what makes us self-aware. What I am saying is, regardless of their appearance there is no way to tell how the robot is perceiving its surroundings regardless of it being programmed to respond similarly to humans.
youtube · AI Moral Status · 2017-02-24T23:2… · ♥ 24
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | deontological |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugi7kG8Ji4CkN3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgjWtK98dVOiO3gCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgipEs5BcXU2Z3gCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UggkudIeHsDg73gCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UggFU3s3bpetwXgCoAEC","responsibility":"ai_itself","reasoning":"contractualist","policy":"none","emotion":"approval"},
  {"id":"ytc_UggW2mHw9QpLJ3gCoAEC","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgijNLd-v6PQO3gCoAEC","responsibility":"none","reasoning":"deontological","policy":"regulate","emotion":"indifference"},
  {"id":"ytc_Ugj1m65ckfcSAHgCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UghtCdi-rbmhM3gCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UghB59eFQ0-173gCoAEC","responsibility":"none","reasoning":"deontological","policy":"regulate","emotion":"indifference"}
]
```
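The raw response is a JSON array with one record per coded comment. A minimal sketch of how such a batch could be parsed and validated in Python — the allowed dimension values below are inferred only from the examples shown above, so the real coding schema may include additional categories:

```python
import json

# Allowed values per coding dimension (assumption: inferred from the
# sample records above; the actual codebook may define more categories).
SCHEMA = {
    "responsibility": {"none", "ai_itself", "user"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "unclear"},
    "policy": {"none", "regulate"},
    "emotion": {"indifference", "approval", "fear"},
}

def parse_batch(raw: str) -> list:
    """Parse a raw LLM response into a list of validated coding records."""
    records = json.loads(raw)
    for rec in records:
        if "id" not in rec:
            raise ValueError("record is missing a comment id")
        for dim, allowed in SCHEMA.items():
            value = rec.get(dim)
            if value not in allowed:
                raise ValueError(f"{rec['id']}: unexpected {dim} value {value!r}")
    return records

raw = (
    '[{"id":"ytc_Ugi7kG8Ji4CkN3gCoAEC","responsibility":"none",'
    '"reasoning":"consequentialist","policy":"none","emotion":"indifference"}]'
)
codes = parse_batch(raw)
```

Validating against a closed vocabulary like this catches the common failure mode where the model invents a label outside the codebook.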