Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "It doesn't seem hard to me that whoever drives the car is liable for the acciden…" (ytc_UgwVm3rQ3…)
- "How do you test an AI for potential biases? In order to measure for bias, you wo…" (ytr_UgyrNE5t8…)
- "As much as I respect Sir Penrose, truly a brilliant man, and as much as I agree …" (ytc_Ugyd6RnYM…)
- "Doomed? No people are always going to notice if it's AI or not, the eyes in the …" (ytc_UgzRcPws3…)
- "It’s more likely that they just adopted the tech slightly too early, you’ll like…" (ytr_UgzBQ6t9a…)
- "I request guys if u done a single mistake with AI I will literally do what they …" (ytc_UgzIGmDW4…)
- "Hi, Sam? I might no me as Gabriel's wife... you really have no clue who ChatGPT-…" (ytc_UgxMjcASu…)
- "Replacing humans with AI to cut expenses because they weren't "efficient" enough…" (ytc_UgxLdgaIR…)
Comment
If we develop something that attains sentience why would we use those systems to do involuntary labor? We could still use non-sentient AI systems to do labor for us. It's not like if machines become conscious then all of a sudden every AI system becomes conscious.
youtube · AI Moral Status · 2017-02-24T01:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | industry_self |
| Emotion | approval |
| Coded at | 2026-04-27T06:26:44.938723 |
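Each coded record carries four dimensions (responsibility, reasoning, policy, emotion). A small sanity check can catch malformed model output before it reaches the table above. This is a minimal sketch: the `ALLOWED` value sets are inferred only from the responses shown on this page, and the real codebook may permit additional values.

```python
# Value sets inferred from the responses shown on this page;
# the actual codebook may allow additional values.
ALLOWED = {
    "responsibility": {"developer", "ai_itself", "none"},
    "reasoning": {"deontological", "consequentialist", "unclear"},
    "policy": {"industry_self", "regulate", "liability", "ban", "none"},
    "emotion": {"approval", "outrage", "fear", "indifference",
                "resignation", "mixed"},
}

def validate(coding: dict) -> list:
    """Return a list of problems with one coded record (empty if valid)."""
    problems = []
    for dim, allowed in ALLOWED.items():
        if dim not in coding:
            problems.append(f"missing dimension: {dim}")
        elif coding[dim] not in allowed:
            problems.append(f"unexpected {dim} value: {coding[dim]!r}")
    return problems

example = {"responsibility": "developer", "reasoning": "deontological",
           "policy": "industry_self", "emotion": "approval"}
print(validate(example))  # []
```

A record that fails validation can be flagged for re-coding rather than silently stored.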
Raw LLM Response
[{"id":"ytc_Ughnexcsmb3x6XgCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UggvblKpw1_kgXgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgjVEzS6w8goNXgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UggfO1G2FHfPI3gCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"resignation"},
{"id":"ytc_UgjcsyISv-nG-HgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugio6zncMOKloXgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgiG4UILVf0E13gCoAEC","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_UggRzJZbiuIY0XgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"mixed"},
{"id":"ytc_UgiAg_hJ4iN9Z3gCoAEC","responsibility":"developer","reasoning":"deontological","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgjOBOe5JgVz5XgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"fear"}]