Raw LLM Responses
Inspect the exact model output for any coded comment, or look one up by its comment ID.
Random samples — click to inspect
Automation is coming wether we like it or not the question is what are we going …
ytc_UgzTzj7g6…
I agree with brian cox, in terms of some terminator level ai that will kill huma…
ytc_UgwTqeo_y…
UBI will never happen & these 2 assholes know that. AI is just another tool to c…
ytc_UgwLXheIV…
Is it worth learning to code in 2025? No. Is it worth it to create videos promot…
ytc_Ugx0rNiw5…
It makes me sad knowing the world being built by the worst of us is to suffocate…
ytc_UgyViSLoh…
Why are we throwing such massive resources at something that we can tell is goin…
ytc_UgzKWhsGI…
It's so disappointing that this former Google engineer thinks a chatbot is alive…
ytc_UgzHO7t9Z…
"Fix the halluciantions" is a much more complicated problem than it sounds?
LLM…
ytc_UgzrzhONL…
Comment
I paused the video after 8:37…
Why is we disheveled about AI having cracks? Humans have been killing each other and testing the moral slide everyday. Those who HAVE a conscious and practice morality everyday should be fine. And those who seek to continue to break the rules, commit crimes, and make the world an unsafe place should ALWAYS face consequences. Stop using ChatGPT as a slave. You want someone else to do all the work for you… talk about history repeating itself ☕️
youtube
AI Moral Status
2024-06-12T15:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | virtue |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgwLU6LhVSUcJ521zTN4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwS20qDsAbJeNEFaVZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugy1jL3AXaRerelSnMV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgwA9pLVAGGlHdyXuy54AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwiuCGY56MlzYKARcd4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwGq7y4rvmhNkL5bxp4AaABAg","responsibility":"user","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwfJXZ9ho7MxrihS-x4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugx4AM8BESDXd3eCLuB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw2jCRAM69PH7wWzjd4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"mixed"},
{"id":"ytc_UgwNjI58FNJ6QJ4spgN4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"}
]
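The raw LLM response above is a JSON array of coded comments, one object per `ytc_*` ID. To support the "look up by comment ID" workflow, the array can be parsed and indexed by ID. A minimal sketch, assuming the payload shape shown above (the `index_by_id` function name and the truncated two-row payload are illustrative, not part of the tool):

```python
import json

# Illustrative excerpt of the batch payload shown above (first two rows).
raw_response = '''
[
  {"id": "ytc_UgwLU6LhVSUcJ521zTN4AaABAg", "responsibility": "distributed",
   "reasoning": "virtue", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgwS20qDsAbJeNEFaVZ4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]
'''

def index_by_id(payload: str) -> dict:
    """Parse a batch of coded comments and key each row by its comment ID."""
    return {row["id"]: row for row in json.loads(payload)}

coded = index_by_id(raw_response)
print(coded["ytc_UgwLU6LhVSUcJ521zTN4AaABAg"]["emotion"])  # outrage
```

Keying on the ID makes lookups O(1), so inspecting any single coded comment does not require rescanning the whole batch.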