Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "This is the first post I have seen telling me how to talk to my LLM. And I disag…" (ytc_Ugzazu8N-…)
- "I feel like they made the images super recognizable with the gross yellow filter…" (rdc_my5ue1n)
- "I haven’t watched the video yet, but as an early AI programmer dating back to th…" (ytc_UgzZulSU8…)
- "Pls ignore the bots wanting you to quit “bc AI is future maan!” Hope you’re doi…" (ytr_UgyC2cu70…)
- "Soooo why do people care so much? It’s not like I do AI art im just Confused abo…" (ytc_UgxYn_9N7…)
- "From what I know about child development and how the human brain works and grows…" (ytc_UgxackYgs…)
- "Cornell researchers estimate that by 2030, AI‑driven data center growth will add…" (ytc_UgyLQuCaj…)
- "AI art ain’t good anyway man, even children’s drawings are 1000x better than a s…" (ytc_UgyGXc6I_…)
Comment
This could lead to many deaths, many wars, many misunderstandings. This is not a good idea. There is no such thing as artificial intelligence. They do what you tell them to do, and you can't just tell them to be smart. Code, algorithms, wires, circuit boards, all these things simulate what cannot be created. I do not want to befriend robots. They don't know right from wrong.
youtube · AI Moral Status · 2016-05-24T02:2… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | ban |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_UgiW21Sv3E9omHgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgiIuG8UVRP1w3gCoAEC","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgjcElIqQv6olXgCoAEC","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"},
{"id":"ytc_UggcAyu0Ca34FHgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UghkfZvsZ1nFnngCoAEC","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugi9hLt4C9HmZngCoAEC","responsibility":"ai_itself","reasoning":"unclear","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugjew12eU2xL3HgCoAEC","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UghQoAvhXHfpQXgCoAEC","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgguE1yUkw5XfXgCoAEC","responsibility":"developer","reasoning":"unclear","policy":"ban","emotion":"outrage"},
{"id":"ytc_UggfVQgxyNbqoHgCoAEC","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"outrage"}
]
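The raw response above is a JSON array of coded rows, one per comment, with four categorical dimensions. A minimal sketch of how such a batch could be parsed and validated before the codes are stored — the controlled vocabulary below is inferred only from the values visible in this sample, and the full codebook may define more categories:

```python
import json

# Allowed values per dimension, inferred from this sample (assumption:
# the project's codebook may include additional categories).
SCHEMA = {
    "responsibility": {"developer", "company", "ai_itself", "distributed", "unclear"},
    "reasoning": {"consequentialist", "contractualist", "virtue", "unclear"},
    "policy": {"ban", "regulate", "unclear"},
    "emotion": {"fear", "outrage", "indifference", "approval", "unclear"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and reject rows with missing or
    out-of-vocabulary codes, so bad model output fails loudly."""
    rows = json.loads(raw)
    for row in rows:
        for dim, allowed in SCHEMA.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row.get('id')}: invalid {dim}={row.get(dim)!r}")
    return rows

# Example with a single well-formed row (hypothetical ID):
raw = ('[{"id":"ytc_EXAMPLE","responsibility":"developer",'
       '"reasoning":"consequentialist","policy":"ban","emotion":"fear"}]')
rows = validate_batch(raw)
print(rows[0]["policy"])  # ban
```

A check like this is what lets the viewer trust the "Coding Result" table: every displayed dimension value is guaranteed to come from the known vocabulary rather than from a malformed model reply.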