Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
AI will augment not replace. You are fine. There will always be human oversight …
ytr_Ugy5jGriD…
I actually never have thought about these things on dirt roads before...I live o…
rdc_d1kp6eq
Why not both? Most resume screener are simple. Mostly what they do is count ke…
rdc_kjo0l03
Humans are still making the decisions where to deploy these weapons, so humans …
rdc_cq6gz59
So don't teach A.I how to lie and we will all be fine. Hmm? You know figuring ou…
ytr_UgyuM1vmi…
They should tell China to ban AI weapons. While the west is trying to ban AI wea…
ytc_UgyDKBsit…
1. BRICS is not an alliance anymore than the G7 is.
2. Russia is like the fourth…
rdc_luc3ddi
An argument in favor of AI for artist.
The biggest concern that I've seen most …
ytc_UgyKCfDep…
Comment
however, we do not know how conciousness is related to intelligence. Maybe they have to come together, as in, you can only get so smart without being concious. Also, our pain and suffering are mainly so we can survive. Thats the great human goal, survive and reproduce. In allignment research, the goal is basically to give AI another goal, as in, "please do not kill humanity". It might require conciousness to instill a goal into AI.
Platform: youtube · Title: AI Moral Status · Posted: 2023-07-31T09:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[{"id":"ytc_Ugw1NbVtmNu__QC1-nZ4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugxid3MY8OsyYADfa0p4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugy7jnx45Wz83t80hmB4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyHH1QdtB2bPCPlj314AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgxZgwpIAu8Ltb1HErJ4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyEAy4yi_NdSQTNWUR4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzwNXsjGgk2j5ASFLV4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzWV5x9rXojJoYo-7x4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwoosOpV_vSyMYtHoJ4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgySwZlJVHcoJePyZgt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}]
```
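A raw response like the one above can be turned into the per-comment lookup the inspector provides. The sketch below is a minimal, assumed implementation (the function name `index_by_id` and the sample IDs reused from the response are illustrative, not part of the tool): it parses the JSON array and indexes each coded record by its comment ID.

```python
import json

# A truncated excerpt of a raw LLM response: a JSON array of coded comments.
raw_response = """[
{"id": "ytc_Ugy7jnx45Wz83t80hmB4AaABAg",
 "responsibility": "developer", "reasoning": "consequentialist",
 "policy": "unclear", "emotion": "fear"},
{"id": "ytc_Ugw1NbVtmNu__QC1-nZ4AaABAg",
 "responsibility": "ai_itself", "reasoning": "unclear",
 "policy": "unclear", "emotion": "fear"}
]"""

def index_by_id(response_text: str) -> dict:
    """Parse a raw LLM response (a JSON array of coded comments)
    and return a dict keyed by comment ID for O(1) lookup."""
    records = json.loads(response_text)
    return {rec["id"]: rec for rec in records}

codes = index_by_id(raw_response)
# Look up one comment's coded dimensions by its ID.
print(codes["ytc_Ugy7jnx45Wz83t80hmB4AaABAg"]["emotion"])  # fear
```

In practice the parse step would also need to handle malformed model output (e.g. trailing prose around the JSON), which is omitted here for brevity.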