Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
For me, that's the biggest thing people don't seem to realize. For most people, …
ytr_Ugydd4pFL…
Is this permanent? Will my brain never be the same as before using AI? This is s…
ytc_UgwjGbNeS…
That still supports the ai image generator by giving it use. If you using it for…
ytr_Ugx3otuqD…
What an amazing guest interview!! Wow, this woman is brilliant and has such an a…
ytc_Ugz0Ta_rN…
Stopped at 34:50 when he said Musk has no moral compass but Sam Altman might has…
ytc_UgyCp1kHr…
For someone that has spent 15 years studying this he misses some fundamental poi…
ytc_UgwpYv2-x…
I just started using chatGPt about a month ago, I think it’s great, as it’s help…
ytc_UgxOoHlA9…
yes, its a good idea on paper, however... as per usual the execution was complet…
ytr_Ugy7nh2gI…
Comment
hey i did this exact same thing a few weeks ago.... i really began by asking it through many nuanced questions, what ai would look like if it became conscious and how it's consciousness would emerge, would it become conscious before it realized it was conscious, etc... etc.... it was a really interesting conversation... you did an excellent job trying to trip it up, and it did an excellent job clarifying its position as well.... i love this stuff.... i haven't really looked into having a direct voice conversation and would love to do that... i just tried unsuccessfully to do that. how do you go about setting that up? yeah i know, i need to find out myself lol...
youtube · AI Moral Status · 2024-09-13T06:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[{"id":"ytc_UgyktAk2bquHzPEFjyB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},{"id":"ytc_UgwoE58KKIk0q4YhNgx4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},{"id":"ytc_UgwMjUjWMPjSY8CWL4R4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},{"id":"ytc_Ugytb7sjTbSizWJxblx4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"approval"},{"id":"ytc_UgxI2h6A-pFGKdQEDVp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},{"id":"ytc_Ugw3mqkZh-M-THJzJ5R4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},{"id":"ytc_UgznZzVEyhkBsQ_-8IZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},{"id":"ytc_UgyIWheF3EbTB7dB7Ut4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},{"id":"ytc_UgyEh-hqd3MjXoA4Ub14AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},{"id":"ytc_Ugyby8p6-48qCIJLSTV4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"fear"}]