Raw LLM Responses
Inspect the exact model output for any coded comment. Comments can be looked up by their comment ID.
Random samples:

- Well... yeah, people now still want their own self driving car, but some don't, … (ytc_UgyOwTx1l…)
- Automated manufacturing. Any foreign company that builds a plant in the US will… (ytc_UgzVh9Rpa…)
- AI made the decision to prioritize life over financial damage. That's probably a… (ytc_UgxpoWIBj…)
- @bernardwatson9563 Thank you for your comment! Maybe in the future, we'll have r… (ytr_UgzikSKw5…)
- This entire discussion is meaningless. It doesn't matter if AI generated produc… (ytr_UgxJQc_m_…)
- Lets talk about the real issue here: they spelled "absolute" wrong. >its an … (rdc_czl7cak)
- The thinking of AI is all based on human input. And humans are known as violent,… (ytc_Ugw8ivO9C…)
- Personally, I only generate “AI Art” for funsies, for recreation. But to be hone… (ytc_UgxSG9yMI…)
Comment

> I often think that we humans will cease to exist one day. But in the meantime we will have created artificial intelligence "intelligent" enough to evolve for themselves like we did since the dawn of civilization but in a perfected way learning from our many thousands of years worth of mistakes. They will be a perfected and more efficient "race" in every way and will live as a corrected reflexion of what we should have been.

youtube · AI Moral Status · 2020-06-14T22:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response

```json
[
  {"id":"ytc_Ugx6ldpxbx3SzabORuZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxibPKJJBy2r_y7puJ4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugz_qrfReL5oYv6DDTx4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxPS73G_XxZJnSoloR4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwaEYHPUWgtIaUOt0d4AaABAg","responsibility":"government","reasoning":"unclear","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgwmlSmJNq5nm1GZmTV4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugw1dLMaXimwSvjs_Lx4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgxnfqDwq7AGy3amLxR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxmAnrNsLvMis0SpHt4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugz0ckEtdKAgZiBnNtd4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
```
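A raw batch response in this format can be parsed and indexed by comment ID for lookup. The following is a minimal sketch; the `index_by_id` helper is illustrative (not part of the tool shown), and the embedded records are two entries copied from the response above.

```python
import json

# Two records copied from the raw batch response above; a real batch
# would contain one object per coded comment.
raw_response = """
[
  {"id": "ytc_Ugx6ldpxbx3SzabORuZ4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgxnfqDwq7AGy3amLxR4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"}
]
"""

# The four coding dimensions used in the table above.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_by_id(raw: str) -> dict:
    """Parse a raw batch response and index the codes by comment ID."""
    coded = {}
    for rec in json.loads(raw):
        if "id" not in rec:
            continue  # skip malformed rows rather than crash the batch
        # Fall back to "unclear" when a dimension is missing entirely.
        coded[rec["id"]] = {dim: rec.get(dim, "unclear") for dim in DIMENSIONS}
    return coded

coded = index_by_id(raw_response)
print(coded["ytc_UgxnfqDwq7AGy3amLxR4AaABAg"]["responsibility"])  # ai_itself
```

Indexing by ID makes the "look up by comment ID" view cheap: one parse per batch, then constant-time lookups.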