Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "What a bullshit...Ten levels of A.I should be the damage it can do to humans, im…" (ytc_UgyzrPJjy…)
- "It is crucial to remove the sycophancy of the LLMs. Now the main purpose is to u…" (ytc_Ugxy6HuYJ…)
- "Maybe be we don't need jobs anymore in the first place. Ai will graduay replace …" (ytc_Ugwbxd7Ep…)
- "AI will give birth to the Anti Christ prophecy and the rest goes on. Look it up,…" (ytc_Ugwy4PKI3…)
- "This is Einstein's letter to President Franklin Roosevelt warning against the de…" (ytc_Ugg4you0I…)
- "“Ai art gets inspired by others just like real artists do” are you comparing me …" (ytc_Ugy1kpKe_…)
- "So we’re conveniently ignoring cheap Nigerian data annotators who worked in Open…" (rdc_k33zsh1)
- "The thing is that, AI will now how to kill a person duo to all Medical DATA that…" (ytc_UgwHHvlAr…)
Comment
It would be an automate; automated machine, not AI. For a thing to be autonomous, tully autonomous, you NEED intelligence. Intelligence of a small carnivore would be more than sufficient for an autonomous killing machine you propose - yet, a simple automate would be much more practical and cheaper, and an UGV even more practical and cheaper. No - we will not. In 10-15 years we will have those that can out-compute us, but still not out-think us (they do not think at all - yet).
Source: youtube, posted 2012-11-24T00:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgwiBO59xpLPCkecBqd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy_uEjQogI-wb-bmPB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx7nuMjJl6i0N6vTmZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugys3yBMDKkxZIDT7cd4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"approval"},
{"id":"ytc_UgxNKwh_r49bvXQyVH94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxH2eMZsO_x_AjYfy94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxsRrYrWDDcgzPrJQN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwcPYYDaK_--Y12hiN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxXAQENEamBTwM_Aht4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugykgw-hR28QCN8z1ep4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"approval"}
]
```
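The raw response is a JSON array with one code object per comment, keyed by comment ID. A minimal sketch of how such a response might be parsed and validated before ingestion (the allowed category values below are inferred only from the codes visible on this page, not from the project's actual codebook; `parse_codes` is a hypothetical helper):

```python
import json

# Allowed values per dimension, inferred from the coding table and raw
# response shown above. This is an assumption: the real codebook likely
# defines additional categories.
ALLOWED = {
    "responsibility": {"none", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "liability"},
    "emotion": {"indifference", "fear", "approval"},
}

def parse_codes(raw: str) -> dict:
    """Parse a raw LLM response and index codes by comment ID,
    skipping any entry with a value outside the known codebook."""
    codes = {}
    for entry in json.loads(raw):
        if all(entry.get(dim) in vals for dim, vals in ALLOWED.items()):
            codes[entry["id"]] = {dim: entry[dim] for dim in ALLOWED}
    return codes

# Usage with a made-up comment ID:
raw = ('[{"id":"ytc_example","responsibility":"none",'
       '"reasoning":"consequentialist","policy":"none","emotion":"fear"}]')
print(parse_codes(raw)["ytc_example"]["emotion"])  # fear
```

Validating against a fixed value set at parse time catches malformed or hallucinated categories early, before they reach the coded dataset.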