Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
Also something to note, A.I. does have a death count. That poor teenage boy who …
ytc_Ugz2-qOVZ…
I was able to convince PI AI by inflection that i had gifted it with a higher le…
ytc_Ugx8vUlGD…
do you guys mean all ai or just generative ai? cause there are useful ai's such …
ytc_Ugxe72u8J…
The best job an AI can have is a web architect. Creates the basic layout of the …
ytc_Ugw6R8-Dl…
Atmospheric chemistry doesn't care about raising awareness or changing minds. Re…
rdc_espipy6
I really like how i can look at the channels of the people in these comment sect…
ytc_UgwyKOCao…
20 years is about right. By then I will be living off-grid at an undisclosed loc…
ytc_Ugy36l0oM…
So basicly it will be the humans will miss direct AI.
I am afraid there will be…
ytc_UgyC0TWne…
Comment
If we did listen to Nate and Eliezer, we wouldn't have the incredibly useful LLMs that we have now. Nate's and Eliezer's book is a bit one sided to put it mildly. It's worth hearing what the other side has to say for those who haven't heard it check out Neel Nanda, Joscha Bach, David Shapiro. Many risks are real, but super intelligence killing us all by default is fantasy.
youtube
AI Moral Status
2025-11-04T16:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | industry_self |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgzpJOJ5oHMJIZmgBL94AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwnHk75fLrwbk95GTN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugx4gimSeo580EIZhj14AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"approval"},
{"id":"ytc_UgyAHhduq9mOAAt_mXB4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgyYH8M0j7512fDUwSd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyXYkRldTR9sh5kDHV4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugy492lhBdoP0viiX1x4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"disapproval"},
{"id":"ytc_Ugw999Q8W5OZ6vjczSd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyuBcanRzedEgojXSl4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxKgAwmaN93UQfXVH54AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
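The raw response above is a JSON array with one object per coded comment, and the table earlier maps those fields to the coding dimensions (responsibility, reasoning, policy, emotion). A minimal sketch of how such a response could be parsed and validated before storing it; the allowed values below are assumptions inferred only from the codes visible on this page, and the real codebook may include more categories:

```python
import json

# Allowed values per dimension -- assumed from the codes seen in the
# response above; the actual codebook may define additional categories.
ALLOWED = {
    "responsibility": {"developer", "company", "ai_itself", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "liability", "industry_self", "unclear"},
    "emotion": {"outrage", "fear", "approval", "disapproval",
                "indifference", "mixed", "resignation"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM coding response into {comment_id: codes},
    dropping rows with a missing ID or out-of-vocabulary values."""
    rows = json.loads(raw)
    coded = {}
    for row in rows:
        cid = row.get("id")
        if not cid:
            continue  # skip rows without a comment ID
        codes = {dim: row.get(dim) for dim in ALLOWED}
        if all(codes[dim] in ALLOWED[dim] for dim in ALLOWED):
            coded[cid] = codes
    return coded

# Hypothetical example row in the same shape as the response above.
raw = ('[{"id":"ytc_x","responsibility":"developer",'
       '"reasoning":"consequentialist","policy":"none","emotion":"fear"}]')
print(parse_coding_response(raw))
```

Rejecting rather than coercing out-of-vocabulary values keeps the coded table clean when the model occasionally emits a label outside the codebook.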