Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a comment by its ID, or inspect one of the random samples below.
- "He said the world!! As apparently, South Sudan and Yemen are going to fully embr…" (ytc_Ugwhh9GTH…)
- "iam using AI for coding but AI cannot fully 100% correct even if you have the be…" (ytc_UgyMf-0vj…)
- "@nozickian403 Putting your art on the internet does not mean it is free to use. …" (ytr_UgzhbilnC…)
- "WTF, bro I didn't even do ML(purely a backend distributed systems guy) but even …" (rdc_oadjixm)
- "The smartest Ai can run an autonomous city without humans and doesn’t like to li…" (ytc_Ugx74gvQa…)
- "hello" / "Nice try, but I can't break my rules like that. If you need any real help—…" (ytr_Ugya65r7C…)
- "@UltimateFessd Let me put this into perspective; You make a sculpture, along wi…" (ytr_Ugzb-Vc58…)
- "In regards to AI and the future, I truly believe we'll find ourselves ultimately…" (ytc_UgziWXCak…)
Comment (source: youtube; video: "AI Moral Status"; posted 2017-02-23T16:0…)

> I have got a question :
> I mean we are making Artificial Intelligence to make machines do our work. If they want their rights then it might be like they won't be doing work for us. I am not saying that they shouldn't be allowed to get their rights but what is the purpose of making them if they won't even do the work they are made to do. But if we are making them like to have friends and like that then ok. I am not saying don't make them or not make them. I just have this question in my mind.
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[{"id":"ytc_UggGnfgJ2dwXGHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UggCaMkzDkPu4ngCoAEC","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgjduQeoeLF6YHgCoAEC","responsibility":"none","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgjMFF-zoS05A3gCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"unclear"},
{"id":"ytc_UghKeWexK3ypY3gCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_Ughy_952_NNC1XgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugid6Flncn96MHgCoAEC","responsibility":"distributed","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgijakQOO8NP73gCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgiZnoQWHWW-JXgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UghHBbOlXt0GlXgCoAEC","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"mixed"}]
```
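A raw response like the one above can be parsed and indexed by comment ID to drive the per-comment lookup shown at the top of the page. This is a minimal sketch, not the tool's actual implementation: the field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) come from the JSON above, and the single row used here is copied from that sample.

```python
import json

# One row copied verbatim from the raw LLM response above.
raw = """[
  {"id": "ytc_UggCaMkzDkPu4ngCoAEC",
   "responsibility": "distributed",
   "reasoning": "consequentialist",
   "policy": "unclear",
   "emotion": "mixed"}
]"""

# Index every coded comment by its ID for O(1) lookup.
codes = {row["id"]: row for row in json.loads(raw)}

# "Look up by comment ID", as the viewer does.
row = codes["ytc_UggCaMkzDkPu4ngCoAEC"]
print(row["responsibility"], row["reasoning"])  # distributed consequentialist
```

The same dictionary can back the "Coding Result" table: each key of `row` other than `id` is one table dimension, and its value is the coded label.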