Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
Hinton's nonchalance about super intelligent AI possibly replacing humanity some…
ytc_UgwfmPHN6…
The 2019 movie Terminator: Dark Fate storyline is based on the notion that we de…
rdc_m281r99
A person will get inspired by something and find a way to make it their own.
A…
ytc_Ugy-mukUn…
Why did they give it boobs? It could have just been a head and shoulders. Since …
ytc_UgycBw1w8…
the point about rising prices is a good one. look at what happened to college tu…
ytr_UgyOXlci5…
Invest in personal EMP’s. Cities should repurpose phone booths as mid-range EMP’…
ytc_Ugz1R97oz…
Everyone is going AI now so they don't have to pay those who worked hard to make…
ytc_UgzLqXCcE…
>Ray Bradbury warned us about censoring books and robot police dogs, and we'v…
rdc_jg0qogi
Comment
This comes down to a matter of opinion. If you believe consciousness is a matter of data processing, then AGI could become intelligent, the only thing we're reasonably confident on is that LLMs are not going to achieve it.
youtube
AI Moral Status
2025-05-02T12:5…
♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugzxb-o7OZ28YoWQ2b54AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgwQjTIuc2Uo6_mduiZ4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxweMcUxdi2fzIHA3l4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxDqEj2icriRUdijd54AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgyuoPuSz5m_ppLFytt4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugwpn6xdce-CHvHAjf14AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxuhepcE7gTd4R1sOF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzHIF4VhKO1AZZAY2d4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxKFNK1RH0Kj9B0ERd4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxDL8hCuj1kU0TiaNV4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"resignation"}
]
```
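A raw response like the one above can be parsed and indexed by comment ID before the per-comment coding is displayed. The sketch below is a minimal, hypothetical illustration: the dimension names come from the Coding Result table, but the allowed values are inferred only from the sample rows shown here (they are not an exhaustive codebook), and `index_codings` is an assumed helper name, not part of any real tool.

```python
import json

# One row of the raw LLM response shown above, used as sample input.
raw = """[
  {"id":"ytc_UgzHIF4VhKO1AZZAY2d4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"}
]"""

# Values observed in the sample output; assumed, not a published schema.
ALLOWED = {
    "responsibility": {"none", "company", "user", "ai_itself"},
    "reasoning": {"mixed", "virtue", "deontological", "consequentialist"},
    "policy": {"none", "regulate"},
    "emotion": {"resignation", "outrage", "approval", "mixed"},
}


def index_codings(payload: str) -> dict:
    """Parse a raw response and index codings by comment ID,
    rejecting any dimension value outside the observed set."""
    codings = {}
    for row in json.loads(payload):
        for dim, allowed in ALLOWED.items():
            if row[dim] not in allowed:
                raise ValueError(f"{row['id']}: unexpected {dim}={row[dim]!r}")
        codings[row["id"]] = {dim: row[dim] for dim in ALLOWED}
    return codings


coded = index_codings(raw)
print(coded["ytc_UgzHIF4VhKO1AZZAY2d4AaABAg"]["emotion"])  # → resignation
```

Indexing by ID is what makes the "Look up by comment ID" view above possible: the lookup key is the `id` field from the model's JSON, matched against the comment's own identifier.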