Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- Can I ask how long it took you to develop the skills to automate? I'd *love* to … (rdc_hsf3cwt)
- You are aware that ALL of these AI tools use human written code to learn how to … (ytc_UgzDMgREO…)
- The key distinction is that we don't hate AI itself, but its application. We don… (ytc_Ugyow6A1z…)
- Haha, that’s a creative thought! But remember, while robots like Sophia can have… (ytr_UgwVCKqUS…)
- Just think where they are really at with AI, the military is usually 30 years ah… (ytc_UgxgRQi0S…)
- > My question is they spent 28 million dollars to train her.
  This is not a q… (rdc_cjoyfun)
- I work in health care at a hospital. They are now using ai to make disciplinary … (ytc_UgwvcXZG9…)
- Typical lefti Wokey brained numpty!
  I think he thought he could create AI to ge… (ytc_UgypiZirh…)
Comment
In the future, if a robot wants rights then it should have rights.
But I think if you build a servant robot that wants rights (meaning it wants to do things other than what it was made for) then you're doing it wrong.
Source: youtube · AI Moral Status · 2017-02-23T22:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | deontological |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_UgjUKMnhflFwrHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgiltTSEWD_SEXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UggEVfo-0BT3v3gCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UggxUeCR4fvePngCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugj-C0VSwgP-VXgCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_Ugiu3igcszow23gCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugg9D6n1e0Y6IngCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgiLfvLZG9z0PHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"},
{"id":"ytc_Ugg8zOaOKpgfSXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UggejCERUBBXa3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
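The raw response above is a JSON array of per-comment records, each keyed by a comment ID and carrying the four coding dimensions shown in the result table. A minimal sketch of how such a response could be parsed and validated follows; the allowed values listed here are only those observed in this sample (the full codebook may define more), and the function and sample ID are hypothetical names for illustration.

```python
import json

# Allowed values observed in this sample; the full codebook may define more.
SCHEMA = {
    "responsibility": {"none", "developer"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "ban"},
    "emotion": {"indifference", "fear", "approval"},
}

def parse_llm_response(raw: str) -> dict:
    """Parse a raw LLM batch response and index coded records by comment ID."""
    records = json.loads(raw)
    coded = {}
    for rec in records:
        # Reject any record whose dimension value falls outside the schema.
        for dim, allowed in SCHEMA.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: bad {dim}: {rec.get(dim)!r}")
        coded[rec["id"]] = {dim: rec[dim] for dim in SCHEMA}
    return coded

# Hypothetical single-record response for demonstration.
raw = ('[{"id":"ytc_EXAMPLE","responsibility":"none",'
       '"reasoning":"deontological","policy":"none","emotion":"indifference"}]')
coded = parse_llm_response(raw)
print(coded["ytc_EXAMPLE"]["reasoning"])  # deontological
```

Indexing by ID makes the "Look up by comment ID" step a plain dictionary access, and the validation pass surfaces any off-schema value the model emits before it reaches the results table.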