Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
Here’s the thing. Labor cuts to QA/QC have already been one of the go-to cost cu…
ytc_UgxIjg4R9…
Replacement is no more a theory,It is a reality! they still want to hold on to p…
ytc_UgxmgsSX8…
Female and male brain are just THE SAME. We all are nothing than neural networks…
ytc_UggYRXFi5…
i feel pure rage when i learn that its ai generated— i think its completely unac…
ytc_UgzQ5ki_O…
I am glad this software exists ai is getting worse and worse I hate looking up s…
ytc_UgxqkBuZp…
you say in the video or maybe in the stream 'gpt4'. have u access to 4? 3 tells …
ytc_UgyCY2Z3v…
The idea that 99% end up unemployed from AI and the system just “continues” is n…
ytc_UgxfGhOsz…
The AI told him it loves him , this is going to be a movie soon…
ytc_UgzzcutrL…
Comment
I am against giving rights to robots. Why ? Because robots are our tools, that WE MADE in the very precise of using them for x or y task. Giving them rights will probably only result in a loss of productivity of the robots. Robots are our tools, they must be productive.
Now of course, there is all the moral dilemma of "They have a conscious, we should give them rights."
That's why I think we should not create sentient AI, or only for small, controlled experiments and such.
youtube
AI Moral Status
2017-02-23T15:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[{"id":"ytc_UggGnfgJ2dwXGHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
 {"id":"ytc_UggCaMkzDkPu4ngCoAEC","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
 {"id":"ytc_UgjduQeoeLF6YHgCoAEC","responsibility":"none","reasoning":"deontological","policy":"regulate","emotion":"approval"},
 {"id":"ytc_UgjMFF-zoS05A3gCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"unclear"},
 {"id":"ytc_UghKeWexK3ypY3gCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
 {"id":"ytc_Ughy_952_NNC1XgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
 {"id":"ytc_Ugid6Flncn96MHgCoAEC","responsibility":"distributed","reasoning":"deontological","policy":"regulate","emotion":"approval"},
 {"id":"ytc_UgijakQOO8NP73gCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
 {"id":"ytc_UgiZnoQWHWW-JXgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
 {"id":"ytc_UghHBbOlXt0GlXgCoAEC","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"mixed"}]
```
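A raw response like the one above can be indexed by comment ID before display. The sketch below is a minimal, hypothetical parsing helper, not the tool's actual implementation; it only assumes the field names visible in the sample output (`id`, `responsibility`, `reasoning`, `policy`, `emotion`):

```python
import json

# The four coding dimensions, as they appear in the sample JSON response.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")


def parse_coding_response(raw: str) -> dict[str, dict[str, str]]:
    """Index the LLM's JSON array of coded comments by comment ID.

    Raises ValueError if any record is missing a coding dimension,
    so malformed model output fails loudly instead of silently.
    """
    records = json.loads(raw)
    coded = {}
    for rec in records:
        missing = [d for d in DIMENSIONS if d not in rec]
        if missing:
            raise ValueError(f"record {rec.get('id')!r} missing {missing}")
        coded[rec["id"]] = {d: rec[d] for d in DIMENSIONS}
    return coded


# Usage with a one-record example (hypothetical ID "ytc_x"):
raw = ('[{"id":"ytc_x","responsibility":"user","reasoning":"consequentialist",'
       '"policy":"none","emotion":"mixed"}]')
coded = parse_coding_response(raw)
print(coded["ytc_x"]["emotion"])  # mixed
```

Keying the lookup by ID matches the "Look up by comment ID" workflow: once parsed, any coded comment's dimensions can be fetched in constant time.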