Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
Once Elon Musk said ai is far dangerous bro thinks he is smarter than Elon musk😂…
ytc_UgxYwaGxu…
16:05 Who you think you are is tied up in this job?
Would you still do the job,…
ytc_UgyK5Q0Yg…
Agreed. There are too many facile analogies out there comparing mental processes…
rdc_djzl4ik
Yep this is how i see it too. I use AI art alot for personal use. Sonetimes when…
ytc_Ugx8r6zh5…
@KP_SDRname 😄 Thanks for your comment! We tried our best to keep the blur around…
ytr_UgyAZNjPE…
Lawyer being replaced by AI is the most mind boggling idea to me 😂, AI can't eve…
ytc_Ugy2uSFx3…
Why would any actual artist want to use AI ? It's like giving your pen and paper…
ytc_Ugy5Vv6SJ…
It looks scary yes but this particular robot does not have conscience it is not …
ytc_UgyZRsRw7…
Comment
Hmmmmm. AI designed to maximize fairness, mercy and justice over the planet. The prioritization of this will come up against the degree to which justice can be dispensed to human beings. For instance. would the long term goal of Justice and mercy might require a short term eradication of all drug dealers, psychopaths, murderers, drug pushers... and anyone who might be considered a type of vermin. So, AI might require that, at least in the short term.
youtube
AI Moral Status
2022-10-22T00:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[{"id":"ytc_UgxUpIRFpQWfaLwgIrl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},{"id":"ytc_UgysQtvN9TK-D3S0bZp4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},{"id":"ytc_UgwVELC0tQd0IqzrwEZ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},{"id":"ytc_Ugwqkf2NhDsLXvd7cD54AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"outrage"},{"id":"ytc_UgwCDN6fhyeF9VQvN_V4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"fear"},{"id":"ytc_UgyhMAxvq_Ig9cidHZZ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},{"id":"ytc_Ugya7RG51_Y4jLfv2214AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},{"id":"ytc_Ugzou00e8KBk7RUf6Xl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},{"id":"ytc_UgxZhUPH903tYq_0SJt4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},{"id":"ytc_Ugy5E3kDzrKCy9lLkuN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"}]