Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- Procreate With AI Pro / The ai part lets u ai generate ur art then edit it… (ytr_UgwsQ6GxV…)
- AI operates on correlation NOT causation, it lacks intent and understanding. It … (ytc_UgzJhs4QN…)
- 43:49 lmao, i was hoping to learn something about AI, turns out it's still west… (ytc_UgwWEfQyZ…)
- @shahmirzahid9551 well, "relying" is a bit misleading of a term. it was a low pr… (ytr_Ugwr4KXbF…)
- it was in 2016...and AI did not replace radiologists today but some models today… (ytr_UgzHaVAua…)
- Ok, granted 70% plus of 'Ai music' is Slop. However, in the (legal and otherwise… (ytc_Ugz1-YjXp…)
- People will stop sharing art online. / People will stop writing online. / Because th… (ytc_Ugw-CmsJd…)
- No, it never come close to that UNLESS it learns to write its own code. But unti… (ytc_UgzktCcP2…)
Comment
People are functionally fixated about AGI.
You do not need AGI to destroy the planet or at least to have a super mega powerful weapon.
Current tech, neural networks, LLMs should already considered to be nuke grade weapons. They allow organizations to track people, falsify video, voice, come up with high quality text, process a colossal amount of data to query it quickly, and so on.
But this will clearly improve. We are seeing significant improvements pretty much daily.
Between AGI and the current tech there is plenty of space for armageddon level threats.
youtube · AI Moral Status · 2025-10-31T12:4… · ♥ 22
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgxQk-SzCdVoYegV7Ft4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzxRZU5VKTmKwzQ8JJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyiL1LnwkJeQCawLfN4AaABAg","responsibility":"company","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzZ0DE2E9b7FwO7D4x4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgyxJ-b1wkCyloM02ZB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
{"id":"ytc_Ugz-5L_1I4eGfQxoeKR4AaABAg","responsibility":"company","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwW9NyN-397Xj9cflt4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgzCa9KSNNYDuZ5PMpd4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzbWCusk3vHuCnM6Cd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_Ugy97Jba3UW7VX3M9yJ4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"}
]
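A minimal sketch of how a raw batch response like the one above could be parsed and looked up by comment ID. The allowed values per dimension are inferred from the codes that appear in this dump, not from an official codebook, and `parse_batch` is a hypothetical helper, not part of the tool shown here:

```python
import json

# Allowed values per coding dimension (assumed schema, inferred from
# the values visible in this dump -- not an authoritative codebook).
SCHEMA = {
    "responsibility": {"none", "company", "distributed", "ai_itself"},
    "reasoning": {"unclear", "consequentialist", "deontological"},
    "policy": {"unclear", "none", "ban", "regulate", "liability"},
    "emotion": {"indifference", "mixed", "outrage", "resignation",
                "fear", "approval"},
}

def parse_batch(raw: str) -> dict:
    """Parse one raw LLM batch response into {comment_id: codes}."""
    rows = json.loads(raw)
    coded = {}
    for row in rows:
        cid = row["id"]
        codes = {dim: row[dim] for dim in SCHEMA}
        # Reject values outside the schema so a malformed LLM
        # response fails loudly instead of polluting the dataset.
        for dim, value in codes.items():
            if value not in SCHEMA[dim]:
                raise ValueError(f"{cid}: unexpected {dim}={value!r}")
        coded[cid] = codes
    return coded

# One entry from the batch above, used as a worked example.
raw = ('[{"id":"ytc_UgwW9NyN-397Xj9cflt4AaABAg",'
       '"responsibility":"distributed","reasoning":"consequentialist",'
       '"policy":"liability","emotion":"fear"}]')
coded = parse_batch(raw)
print(coded["ytc_UgwW9NyN-397Xj9cflt4AaABAg"]["policy"])  # liability
```

Validating against the schema at parse time is what makes a "look up by comment ID" view trustworthy: every code shown in the result table is guaranteed to be one of the values the prompt asked for.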