Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- “Why do i find this 👉🏻 voice hawt 😂❤ chatgpt be having hella masculine voice ❤…” (ytc_UgxHtMnzc…)
- “One of, if not the best episode....ever. Its all so terrifying 😳 If that's the l…” (ytc_UgweT4oG5…)
- “AI IS THE SYSTEM OF THE ANTICHRIST... Turn to jesus, accept jesus as your lord…” (ytc_Ugwu-R5f4…)
- “These chatbots tick all the boxes for the BITE model. Asimov's three laws of rob…” (ytc_UgxRyB2Hy…)
- “Hi, you said you wanted to talk, so here I am. For context, I’m someone who was…” (ytc_UgyNzgho_…)
- “Battery on an officer. She's exercising her rights and you hauled her out of her…” (ytc_UgyOse9v-…)
- “Artificial intelligence can be much more dangerous to us than we could have ever…” (ytc_UgwNdKu0B…)
- “This question looks at the way ur curser moves not the actual answer to the ques…” (ytc_UgzOdXwsh…)
Comment
Very crazy topic. We will have much more time to think about these things with advances in AI and robots doing work freeing up time for humans to invent and philosophize just like how farming freed up time for man to build civilization.
1:48 "We know what unconsciousness feels like" What does unconsciousness feel like?
Seeing how the brain saves experiences and experiences make a person. Seems very plausible that we will one day be able to create something similar.
Especially if we change the view that many other animals are conscious beings obviously too differing levels. If AI is ran elsewhere like remotely on the cloud then I would think the self preservation of whatever it is running would not be the same.
Source: youtube · AI Moral Status · 2019-03-23T13:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:59.937377 |
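A coded record like the one above can be sanity-checked before it is stored. The allowed values below are a minimal sketch inferred only from the sample outputs shown on this page; the project's actual codebook may differ.

```python
# Validation sketch for one coded record.
# NOTE: these value sets are assumptions reconstructed from the sample
# responses on this page, not an authoritative codebook.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "developer", "distributed"},
    "reasoning": {"none", "mixed", "deontological", "consequentialist", "contractualist"},
    "policy": {"none", "liability", "regulate"},
    "emotion": {"indifference", "fear", "resignation", "approval", "outrage"},
}

def invalid_dimensions(record: dict) -> list:
    """Return the names of dimensions whose value is missing or outside the codebook."""
    return [dim for dim, allowed in ALLOWED.items()
            if record.get(dim) not in allowed]

# The record from the "Coding Result" table above.
sample = {"responsibility": "none", "reasoning": "mixed",
          "policy": "none", "emotion": "resignation"}
print(invalid_dimensions(sample))  # []

# A record with an out-of-codebook emotion is flagged.
bad = dict(sample, emotion="joy")
print(invalid_dimensions(bad))  # ['emotion']
```

Flagged records can then be re-queued for recoding instead of silently polluting the dataset.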
Raw LLM Response
```json
[
  {"id":"ytc_UgwY3N-4WtXWXKe0kot4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugyuow9cFQvRp_8V8N14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgzZOLFdiGrukOiVk1B4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugw6tuFvGs9zXuY9OD14AaABAg","responsibility":"none","reasoning":"none","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugx4-sSdTVTDLR25DjZ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugxa6TQT8DlWXG16GJZ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgzO_sh5Lua2X1HyIVZ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugw2bDfZIEPM_btX7g54AaABAg","responsibility":"none","reasoning":"contractualist","policy":"liability","emotion":"approval"},
  {"id":"ytc_Ugyx_DacNBYxzTzvC8Z4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxuDXc_869qvS3abJR4AaABAg","responsibility":"none","reasoning":"deontological","policy":"regulate","emotion":"approval"}
]
```
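The raw response is a JSON array, so the "look up by comment ID" feature above reduces to parsing it and indexing the records by their `id` field. A minimal sketch (the two records here are copied verbatim from the response above):

```python
import json

# Two records copied from the raw LLM response shown above.
raw_response = """
[
  {"id":"ytc_UgwY3N-4WtXWXKe0kot4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzZOLFdiGrukOiVk1B4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"}
]
"""

def index_by_id(raw: str) -> dict:
    """Parse a raw coding response and key each record by its comment ID."""
    records = json.loads(raw)
    return {rec["id"]: rec for rec in records}

codes = index_by_id(raw_response)
rec = codes["ytc_UgzZOLFdiGrukOiVk1B4AaABAg"]
print(rec["reasoning"], rec["emotion"])  # mixed resignation
```

Indexing once and looking up by key keeps each inspection O(1), which matters when a batch response covers thousands of coded comments.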