Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Comment
I like universal paperclips cause its a simple clicker game with a basic html design but reminds me everytime of what the future of agi and automation could be eventually our machines get so good it starts to replace us one by one and starts repropriating the universe for a goal it sees as better even if its a meaningless goal like paperclips and the way so many people are cheering on for the use and creating of agi as the next step of human evolution is scary because its kind of the start of the end for us everyone will be replaceable and we’ll have nothing human about us that can’t be mimicked by ai
youtube
2023-02-08T05:1…
♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_Ugz224xOmgY7TeiSpg54AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwvwqipAbyRtAKzoXN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwqpmeajVf1jcXdOc14AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxscjBz7P2maG8PEDJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxI-s4gcBUtn9wiEhh4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgxI5A5gOzOQa-OW6qB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxsN-FvPOXWnVCbwTh4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugwq1VK-c7CH5rOxMHF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgyzsRVX5Qwx9qjqnfR4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugx5KZXpnlThBCAQFsV4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"none","emotion":"mixed"}
]
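The raw response above is a JSON array of records, one per comment ID, each carrying the four coded dimensions (responsibility, reasoning, policy, emotion). As a minimal sketch of how such a response could be parsed and validated before display, assuming the allowed value sets inferred from the output shown here (not an authoritative codebook):

```python
import json

# Allowed values per dimension, inferred from the raw response displayed
# above; a real pipeline would load these from its codebook definition.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "user", "company", "distributed"},
    "reasoning": {"mixed", "consequentialist", "virtue", "unclear", "deontological"},
    "policy": {"none", "liability", "regulate"},
    "emotion": {"indifference", "fear", "outrage", "approval", "mixed", "resignation"},
}

def parse_response(raw: str) -> dict:
    """Map comment ID -> coded dimensions, skipping records with unknown values."""
    coded = {}
    for row in json.loads(raw):
        dims = {k: v for k, v in row.items() if k != "id"}
        if all(dims.get(k) in vals for k, vals in ALLOWED.items()):
            coded[row["id"]] = dims
    return coded

# Usage with one record from the response above:
raw = ('[{"id":"ytc_UgwvwqipAbyRtAKzoXN4AaABAg",'
       '"responsibility":"ai_itself","reasoning":"consequentialist",'
       '"policy":"none","emotion":"fear"}]')
print(parse_response(raw)["ytc_UgwvwqipAbyRtAKzoXN4AaABAg"]["emotion"])  # fear
```

Looking up a coded comment by ID, as the viewer does, is then a plain dictionary access on the parsed result.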