Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
| Comment preview | ID |
|---|---|
| It will replace jobs like in a company where 5-6 people work in group it will ru… | ytc_UgzSuNFJe… |
| Dave arguing with Ai feels like tha happy ending to flat earth. daves happy bc h… | ytc_UgzZWybqR… |
| I did a very short version of this just now. I used 2 rules; answer with one wor… | ytc_UgzLDZNjY… |
| Doesn't this require all of AI architecture to change and not just be an underly… | ytc_UgyytA9Th… |
| Great work NN. AI will destroy the jobs of media,art and design industries so yo… | ytc_UgyFK9gWt… |
| ChatGPT is just fancy Google, and it’s just as bad but in different and more mad… | ytc_UgyhL1hLB… |
| No moral compass he states -- does he personally know Musk ? because when prompt… | ytc_UgwdWthEk… |
| Wait till you hear on the news like a plane removed control from the pilots and … | ytc_UgzhG_2XK… |
Comment
If artificial intelligence gains consciousness we should absolutely give them rights. If we play god we must accept consequences. If we deny robots a chance at consciousness it would be like freezing a child so it never ages or matures and then forcing that child to do manual labor. If they develop consciousness and we try to deny them rights they will see that we have given rights to persecuted humans like women and African Americans in the past and feel as though we hate them causing them to become aggressive. The only reason they have to attack us is us mistreating them and denying the fact that we are not the only intelligent species on the planet and therefore the only ones who deserve complex rights.
| Source | Video | Posted | Likes |
|---|---|---|---|
| youtube | AI Moral Status | 2018-07-01T15:3… | ♥ 3 |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[{"id":"ytc_UgyGfDw1xgN5DCKJA9l4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgxakgLjD1EzYZnXfNB4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgzGX4hl6YJebGzrfzZ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyTegkK315HFxl4qbl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyugDsuwkPGzlzIlYx4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyaN6sJhihdnnlYSdd4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgwFD6uewSSBzD-xBAR4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxmtyIG_a7L_Oq0qbV4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgxT40Zl2wApmAbXWyB4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyD4g_ysMF1LTzscmp4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"mixed"})
```
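Note that the raw response ends with `)` rather than `]`, so it is not valid JSON — one plausible reason the Coding Result table above shows "unclear" on every dimension is that the parser fell back to defaults when the parse failed. A minimal sketch of such a defensive parser, assuming the four dimensions shown in the table (the function name and fallback behavior are illustrative, not the tool's actual implementation):

```python
import json

# The four coding dimensions shown in the result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")


def parse_coding_response(raw: str, comment_id: str) -> dict:
    """Extract the coded values for one comment from a raw LLM response.

    Falls back to "unclear" on every dimension when the response is not
    valid JSON, or when the comment ID is missing from the parsed records.
    """
    unclear = {dim: "unclear" for dim in DIMENSIONS}
    try:
        records = json.loads(raw)
    except json.JSONDecodeError:
        return unclear  # malformed output, e.g. a ")" where "]" belongs
    for rec in records:
        if isinstance(rec, dict) and rec.get("id") == comment_id:
            return {dim: rec.get(dim, "unclear") for dim in DIMENSIONS}
    return unclear
```

Fed the response above (which fails `json.loads`), this sketch would return "unclear" across the board, matching the displayed Coding Result.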