Raw LLM Responses
Inspect the exact model output for any coded comment: look up a comment by ID, or pick one of the random samples below.
Random samples

- "But the things is large companies gonna use AI and not tell people eventually we…" (ytc_UgwfFkGxg…)
- "Who ever asked for a driverless car? Anyone? Not me. Technology: just because we…" (ytc_Ugyet7sRk…)
- "Except using self driving cars won't mean all cars become self driving. Not ever…" (ytc_UgyaFpuKz…)
- "I find these takes weird. All these companies have too many developers and they …" (rdc_m80lt3i)
- "Is it just me or does anyone else think the CEOs like this guy that want governm…" (ytc_UgxwarS-r…)
- "FOREST_MAGIC: Portable EMP generators! There is no such thing as \"Artificial I…" (ytc_UgxXq5eRC…)
- "Isn’t this the plot to Avengers: Age of Ultron? There’s a lot of assumptions an…" (ytc_UgwM7pDKl…)
- "I hate AI so much, we're getting closer ro \"I HAVE NO MOUTH AND I MUST SCREAM.\" …" (ytc_UgwBToJXO…)
Comment

> Can someone please explain the following to me........If AI depends on a million Feynman GPUs packed so tightly that they need massive water cooling in a vast data centre..........how can AI exist inside a litre of android head-space...........and please don't just say fast datalinks.

youtube · AI Moral Status · 2025-12-14T19:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwSapLRfxZc2aDJ8tR4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgxTq1Boru0PXMHo5lN4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxvZf9DclKP4fvKh4h4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgxsNZ_WOh_3oe8FWcl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgyjS5yBjG0dVaoRKn14AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgxQL2QUDBiOKBBBond4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgxbeMw2FfPcs99wRxF4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgwyrIoUTw1PY88mdDl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgzUdqCMPucOO57NL8B4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgyoB5lwt8raCj54NY14AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"}
]
```