Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "I don't think we need to worry about AI ultimately killing humans as only humans…" (ytc_UgzGSMBwP…)
- "A painter's brain is trained by everything he's seen, even works of art for whic…" (ytc_Ugw5AsvJw…)
- "This will just raise the value of real hand made physical art since all art onli…" (ytc_UgyRipfOP…)
- "@ZBREAD. Lmao dumbass keep exaggerating stupid shit and live in the ignorant ha…" (ytr_Ugw63AnEx…)
- "For me, I like using AI, but I don't think that anyone should claim that generat…" (ytc_Ugz0OlgIq…)
- "The funny thing is that the people who created the ai claim that the ai uses art…" (ytc_UgwkddZjs…)
- "As an artist who cannot use my imagination for mental images, references are alw…" (ytc_UgzW0pYLi…)
- "🙏 Sir Roger Penrose is without a doubt one of the deepest thinkers of our time. …" (ytc_Ugysi2EwE…)
Comment
Computers and AI are not alive.
Thing can look like life, can look alive, and be very very inert.
For example, when you combine certain protein chains through DNA structures, the chains actually wrap up or de-tangle, depending on the type of protein it's made up of. It's very mechanical, like a mouse trap.
But DNA isn't alive. Protein isn't alive.
Platform: youtube
Posted: 2024-12-15T07:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | deontological |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgyT-FEVYJsT24j4oFR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxtuX9KNE_r00u3J-l4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugxw9dKSnPOMYcv8imN4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwM2pFJk_QVokmIxZh4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugy-FGQlNFXMywnYsN94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugx6dfvvn4mhlfyspZR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxAfE9w4cXv08WjU0l4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxM5jxnZpvXA8wXSBd4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxKY0q4mDad0kYTeLp4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwQJeT2QsiWOn4l_VB4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
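A raw response like the one above is a JSON array of coding records, one per comment ID. The sketch below shows one way to parse such a response into a lookup table keyed by comment ID, validating each record against the dimension vocabularies visible on this page. The `VOCAB` sets are inferred from the responses shown here and are an assumption; the real codebook may allow additional values.

```python
import json

# Two records reproduced from the raw LLM response above.
raw = '''[
{"id":"ytc_UgyT-FEVYJsT24j4oFR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxtuX9KNE_r00u3J-l4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]'''

# Dimension vocabularies inferred from the values seen on this page
# (assumption: the actual codebook may be larger).
VOCAB = {
    "responsibility": {"none", "company", "ai_itself", "government", "user"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "unclear"},
    "policy": {"none", "liability", "unclear"},
    "emotion": {"approval", "outrage", "fear", "indifference", "resignation"},
}

def index_codings(raw_json: str) -> dict:
    """Parse a raw response and return {comment_id: coding_dict},
    rejecting any record with an out-of-vocabulary value."""
    by_id = {}
    for rec in json.loads(raw_json):
        for dim, allowed in VOCAB.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim!r} value {rec.get(dim)!r}")
        by_id[rec["id"]] = {k: v for k, v in rec.items() if k != "id"}
    return by_id

codings = index_codings(raw)
print(codings["ytc_UgxtuX9KNE_r00u3J-l4AaABAg"]["emotion"])  # outrage
```

This id-keyed dictionary is also what a "look up by comment ID" feature needs: a single `codings[comment_id]` access retrieves the full coding for any comment in the response.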