Raw LLM Responses
Inspect the exact model output for any coded comment: look up a comment by its ID, or pick one of the random samples below.
- `ytr_UgxCdMeR3…`: Stop caring about the ai, just do your own thing losing hope will just let them…
- `ytc_UgxTYIn5F…`: Clown Logic: "Using AI to select ppl to check is bad but using yr eyeballs is ok…
- `ytr_UgzkuSQdH…`: @patiopooper never said it wasn't. All I did was say. An ai can definitely make…
- `ytc_Ugwf1L7-b…`: This is scary. We are definitely living in the end times ..that's scary that a…
- `ytc_UgxhpL5Qu…`: I cannot explain how dangerous this is, as someone who has studied under this ma…
- `ytc_UgxdVqUno…`: The AI utopia is just a false dream, humanity needs to do more to stay connected…
- `ytc_UgyxLpPv-…`: Hey I also knows autonomous cars but it needs to be controlled and if it's contr…
- `ytc_UgwMlMNGT…`: So Star Trek: TNG was a BS view of the future even if they have AGI and a comput…
Comment

> Also in my opinion, not that I know enough to deserve one on this subject but I'm absolutely positive this guy needs to be regulated. He seems to be developing AI as fast as he can with minimal care as to the consequences. I believe he understands the consequences but cares more about completing AI than the overall consequences of creating a general intelligence.

youtube · AI Moral Status · 2022-03-15T20:0… · ♥ 21
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[{"id":"ytc_Ugx4hmbACYzr-8dVB-d4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_Ugw8Xs8lAmaFnJkZoht4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugy1DXySmx8H-ijgpLF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyxZpYArQyuJTKgeXJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugz8cwXp6Hh8e5L5qq54AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgwcNw8YVrWzS7z2Fbp4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwtOpRXaqzVarQgErR4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwGwjbC-rD_picIOeB4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxZZEPfBTWypDauyqh4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgwSH3hS8W2O2DNMKux4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"}]
```
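The raw response above is a JSON array with one coding object per comment, keyed by comment ID. A minimal sketch of how the lookup-by-ID view could be backed is shown below; the field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) come from the response above, while the function name and the two-row sample payload are illustrative assumptions, not the tool's actual implementation.

```python
import json

# A shortened batch response, copied from the raw output above (two rows
# only, for illustration).
raw_response = '''[
{"id":"ytc_UgwcNw8YVrWzS7z2Fbp4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugz8cwXp6Hh8e5L5qq54AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"}
]'''

def index_codings(raw: str) -> dict[str, dict]:
    """Parse a batch response and index the coding objects by comment ID."""
    return {row["id"]: row for row in json.loads(raw)}

codings = index_codings(raw_response)
coding = codings["ytc_UgwcNw8YVrWzS7z2Fbp4AaABAg"]
print(coding["responsibility"], coding["policy"])  # developer regulate
```

The first sample ID here matches the "Coding Result" table above (developer / consequentialist / regulate / outrage), so the lookup reproduces that row.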