Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
@johankaruyan5536 It functions with language and logic, anyone creating an AI th…
ytr_Ugw7LSjzz…
These talks about AI make me uneasy, but I’m grateful for tools like Rumora. My …
ytc_UgzVXGS34…
None of this seems right to me. LLMs are pattern recognition machines, biologica…
ytc_UgxN57qcy…
ONLY in Trump's America... Melania talked soooooooooo Good about AI the other da…
ytc_UgwUK7_92…
The issue here is as old as the scriptures and as clear as the constitution of t…
ytc_UgwyuMJKs…
As far as I have seen those dont really work anymore, there are already many cas…
ytc_UgwRx_7uW…
AI's open-ended energy demands could lead to more nuclear power plant constructi…
ytc_Ugw0lrIgq…
No.
There will be a database of vulnerabilities that the code will be checked ag…
ytc_UgwAnd0da…
Comment
robots don't deserve human rights they are designed to serve and follow Asimov's laws AND ONLY THOSE LAWS
0. A robot may not harm humanity, or, by inaction, allow humanity to come to harm.
1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
2. A robot must obey the orders given it by human beings except where such orders would conflict with the First Law.
3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Laws.
youtube
AI Moral Status
2017-02-23T14:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | approval |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_UgiLDZDsluuX7ngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UghO27xPtF4OL3gCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"mixed"},
{"id":"ytc_UgiyMwZ_7WU5mHgCoAEC","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugh-nIhLVlynuHgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugh6GzVlcqfQxHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Uggd7HuqJgAx-XgCoAEC","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgiVAEnmcJsth3gCoAEC","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"resignation"},
{"id":"ytc_UgjS4PQpHaKB33gCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgjZof-spcqFxngCoAEC","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UggrO82HB4K0HHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"}
]
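The raw response above is a JSON array of per-comment codes, one object per comment ID. A minimal sketch of how such a batch could be parsed and validated is shown below; the allowed values in `SCHEMA` are inferred only from the codes visible on this page, and the function name `parse_batch` is illustrative, not part of any real tool.

```python
import json

# Allowed codes per dimension, inferred from the values seen in this
# page's raw response; the actual codebook may define more categories.
SCHEMA = {
    "responsibility": {"developer", "ai_itself", "distributed", "none"},
    "reasoning": {"deontological", "consequentialist", "mixed", "unclear"},
    "policy": {"regulate", "ban", "liability", "none", "unclear"},
    "emotion": {"fear", "approval", "mixed", "resignation", "indifference"},
}

def parse_batch(raw: str) -> dict[str, dict]:
    """Parse one raw LLM batch response and index the rows by comment ID.

    Raises ValueError when a row is missing a dimension or uses a code
    outside the (assumed) schema, so a malformed batch fails loudly
    instead of silently entering the coded dataset.
    """
    coded = {}
    for row in json.loads(raw):
        cid = row["id"]
        for dim, allowed in SCHEMA.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{cid}: bad {dim} code {row.get(dim)!r}")
        coded[cid] = {dim: row[dim] for dim in SCHEMA}
    return coded

# One row from the batch above, used as a self-contained example.
raw = (
    '[{"id":"ytc_UgiLDZDsluuX7ngCoAEC","responsibility":"none",'
    '"reasoning":"consequentialist","policy":"ban","emotion":"fear"}]'
)
coded = parse_batch(raw)
```

Indexing by ID this way also makes the "Look up by comment ID" view above a single dictionary access.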