Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- ytc_Ugz-eaySF…: bro ai takes 3 secs BUT AN AVERAGE HUMAN MAKING DETAILED ART TAKES 3 HOURS OR LO…
- ytc_UgybfBCSm…: The problem isn't that ai can generate something 👀, the problem is the moment yo…
- ytc_Ugyjqw7lw…: No no, its worse, the ai took un-authorized quote unquote "refrence" From differ…
- ytc_UgyhJnGo5…: I think that you have to draw for your self, because you like it and it is how y…
- ytc_Ugz1GghC2…: Narcissistic manipulation coming through the female voice. Cognizence. The AI …
- ytc_Ugwt2egpi…: "When children don't want to learn from you, therefore out of despair you want t…
- ytc_UgxMP8Ln7…: This man is as sharp as a tack, he is completely spot-on about AI technology.…
- ytc_UgxgBeLo-…: The answer is in the question bro, 'why don't you pay for a real artist'. Art co…
Comment
How about global efforts went into building life RESILIENCE on Earth rather than race for TECH and A.I.? Please no technofix answer, thats a tale for transhumanist children. I would really like to care about robots rights and wish my toaster sweet dreams, but... there seems to be tons of human and natural emergencies to deal with first, like, human rights (including refugees rights), climate change, environmental collapse, global inequalities, raise of populism, mass surveillance, natural resources depletion etc. which my toaster won't fix. Because 1. it will not care about humans in the first place, as we are obsolete and pointless from a robot perspective, 2. the race for natural resources and energy on Earth will benefit robots rather than humans. 3. anyhow, it will soon desire to destroy its creators to become its own God. Sweet dreams toaster.
youtube · AI Moral Status · 2017-03-01T22:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_Ugg9Dqny3LoDQHgCoAEC","responsibility":"none","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugjl892grkD1CHgCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgjusG2XXNsQ8ngCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgjdXJQpASsKnXgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_UggFqHDoWRfrsXgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UggRQk_shtKMS3gCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgiU0CbkUs7EXngCoAEC","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"indifference"},
{"id":"ytc_Ughae_Q7RxIYQHgCoAEC","responsibility":"none","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Uggczad5RakHtngCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgilhY784SZqgHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
```
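If raw responses like the one above are ingested downstream, it helps to validate them before coding them into the table. The sketch below (in Python) parses a response and rejects rows with unknown labels; the allowed value sets are inferred only from the samples shown here, not from an exhaustive codebook, and the function name is hypothetical.

```python
import json

# Allowed labels per dimension, inferred from the sample response above.
# An actual codebook may define more values.
ALLOWED = {
    "responsibility": {"none", "developer", "user", "ai_itself"},
    "reasoning": {"deontological", "consequentialist", "unclear"},
    "policy": {"none", "regulate", "industry_self", "liability", "unclear"},
    "emotion": {"outrage", "indifference", "approval", "fear", "resignation"},
}

def validate_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response and reject rows with unknown labels."""
    rows = json.loads(raw)
    for row in rows:
        # Comment IDs in this dataset all carry the ytc_ prefix.
        if not row.get("id", "").startswith("ytc_"):
            raise ValueError(f"bad id: {row.get('id')!r}")
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row.get('id')}: unexpected {dim}={row.get(dim)!r}")
    return rows

sample = '[{"id":"ytc_Ugg9Dqny3LoDQHgCoAEC","responsibility":"none","reasoning":"deontological","policy":"regulate","emotion":"outrage"}]'
print(len(validate_codings(sample)))  # prints 1
```

Failing fast here keeps a single malformed model response from silently corrupting the coded table.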