Raw LLM Responses
Inspect the exact model output for any coded comment, either by looking it up directly by comment ID or by browsing the random samples below.
Random samples:

- `ytr_Ugzf7Jvuq…`: „all this "AI" thing does is allows you to skip the work part” 1. The work part…
- `ytc_Ugz_2yChH…`: All of these advancements to replace workers seem to have one bottle neck: resou…
- `ytc_UgzKi7bZM…`: We were suppose to have self driving cars in like 2004 and flying cars in 2015?…
- `ytc_UgwhHgIQe…`: The problem is that the rest of the world will build AI with or without the US..…
- `ytc_UgytOHs_0…`: You clowns clap at a banana taped to a wall and charge $200 for a small art piec…
- `rdc_kiw8cwf`: In a more ideal world, AI tools would be productivity *multipliers*, not poor *s…
- `ytc_UgxB_rep2…`: I dunno. Frankly, that... seems like about the best way it could have answered? …
- `ytc_UgxeVF3QO…`: Humans cant even decide what our own interests are, how can we expect an AI to d…
Comment

> It is risky to grant to much power to artificial intelligence. There is a great risk with self learning that robots will not act in humans best interests and things will escalate from there. We have enough trouble in the world understanding other humans or reasoning with them. Robots have their place, but with all things people create it ultimately is abused or corrupted. We have started down another path we cannot go back from and AI along with genetic engineering is some of the most dangerous that could well cause human extinction unless we start making decisions based on whats best for all humanity in the long term. Robots have no soul, never will and if we seek companionship outside of humans, I suggest getting a dog. (Mans best friend).

Platform: youtube · Source: AI Moral Status · Posted: 2022-10-16T05:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
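
Each dimension is categorical, so a coded row can be sanity-checked against the label vocabulary before it is stored. The sketch below is a minimal, hypothetical validity check; the label sets are inferred only from the responses visible on this page, and the real codebook may contain more values.

```python
# Hypothetical validation sketch. The allowed label sets below are inferred
# from the sample responses on this page and may not cover the full codebook.
ALLOWED = {
    "responsibility": {"none", "developer", "ai_itself", "distributed"},
    "reasoning": {"unclear", "virtue", "consequentialist"},
    "policy": {"unclear", "ban", "regulate", "industry_self"},
    "emotion": {"indifference", "outrage", "fear", "mixed"},
}

def validate(row: dict) -> list[str]:
    """Return a list of problems with one coded row (empty if the row is clean)."""
    problems = []
    for dim, allowed in ALLOWED.items():
        value = row.get(dim)
        if value not in allowed:
            problems.append(f"{dim}={value!r} not in {sorted(allowed)}")
    return problems

# The row shown in the table above passes:
print(validate({"responsibility": "distributed", "reasoning": "consequentialist",
                "policy": "regulate", "emotion": "fear"}))  # -> []
```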
Raw LLM Response

The model codes comments in batches and returns a single JSON array; the entry for this comment appears alongside the rest of its batch.
```json
[
{"id":"ytc_UgwE65gqjBa_z9DCowd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugx-HnVc3T_kNEeGZzR4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgweXBMtYzQUT44zg0J4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugw2hoVumKdRt3B9TjZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugzq2YXZ7UhVM3VmrVp4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxHKA1656Tt_vw9ZOV4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxiqsqqWauneOlktkl4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugy9fdwK8npuMq7vSjp4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzkFtoX9Flc_wudJbJ4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"industry_self","emotion":"mixed"},
{"id":"ytc_Ugx41w4FLkhMPG5pQLN4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"ban","emotion":"outrage"}
]
```
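
Because the batch comes back as one flat JSON array, the by-ID lookup described at the top of the page needs nothing more than parsing the response and indexing it. A minimal sketch, assuming the raw response text has been saved to a local file (the name `raw_response.json` is a placeholder, not the pipeline's actual storage):

```python
import json

# Hypothetical file name; in practice the raw response would come from
# wherever the pipeline stores model output.
with open("raw_response.json", encoding="utf-8") as f:
    rows = json.load(f)

# Index the batch by comment ID so any coded comment can be inspected directly.
by_id = {row["id"]: row for row in rows}

coding = by_id["ytc_Ugzq2YXZ7UhVM3VmrVp4AaABAg"]
print(coding["responsibility"], coding["emotion"])  # distributed fear
```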