Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
Regarding being terrified of AI:
I think Neil is missing the point. Automatic braking is just a sensor being tripped. Some human made that decision when setting the threshold. There was no AI which worked out that not running over something in front of the car would be a good goal.
When people are afraid of the AI future, it is because there are alignment problems with AI. There are goal problems with AI.
Think of any goal you want. What are the intermediate goals that will help you achieve the main goal? Power, money, resources. Why do you think an AI wouldn't figure that out? Do you think an AI would not be able to beat us at that? We are in direct competition over these things. So guess what a logical conclusion is?
Source: youtube
Video: AI Moral Status
Posted: 2025-07-26T10:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
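A coding result like the one above can be validated programmatically before it is stored. The sketch below is a minimal check, assuming the four dimension names from the table and only the allowed values that appear in this document's raw response; the real codebook may define more values.

```python
# Hedged sketch: the allowed values below are inferred from the rows visible
# in this document's raw LLM response; the actual codebook may be larger.
ALLOWED = {
    "responsibility": {"none", "developer", "company", "distributed", "ai_itself"},
    "reasoning": {"unclear", "consequentialist", "deontological"},
    "policy": {"none", "unclear", "regulate"},
    "emotion": {"indifference", "fear", "mixed", "outrage", "approval"},
}

def validate(row: dict) -> list[str]:
    """Return the dimension names whose value is missing or not allowed."""
    return [dim for dim, ok in ALLOWED.items() if row.get(dim) not in ok]

# The coding shown in the table above passes cleanly:
row = {"id": "ytc_Ugy6ofBJcs-fFZQkrHt4AaABAg", "responsibility": "developer",
       "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"}
print(validate(row))  # []
```

A row with a misspelled or unknown value would come back with the offending dimension names, which makes it easy to flag bad codings for re-inspection.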
Raw LLM Response
```json
[
  {"id":"ytc_UgxEvljC5TVpp_O7XMR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugy6ofBJcs-fFZQkrHt4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugwhw8T0oYc_FzwzJZV4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgyB6yfZvS0J0jHwBzd4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgyYIeFvOfP2F-ZJ1px4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzP4umH5-8qEOtVzOZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugy7sC_4ZDU5l-GSZaZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzMrZsdYfudWmJX4x54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgzuQkbdlQTqgnDB76l4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwJNaFnxKRyFaCtLOB4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"fear"}
]
```