Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "Drivers could be replaced but it'll take investments by every customer of the tr…" (`ytr_Ugg11j6LA…`)
- "Why is there so much hate on ai art? He found something that works for him and h…" (`ytc_UgwGR9Vpb…`)
- "I don't think they'll win the case. AI is a tool, like a knife. If their son kil…" (`ytc_Ugwb4moDZ…`)
- "those that buy into the self driving cars are part of the problem, i dont mind s…" (`ytc_UgzbBfodq…`)
- "The funny thing is, as an AI enthusiast, I actually am happy to see this. AI pro…" (`ytc_UgxbDs6_G…`)
- "Youtube is also AI upscaling regular videos. See video LeG1JTpl6pc in 1080p. It'…" (`ytc_Ugwn6_FRV…`)
- "Apparently the AI is always right and is a good thing until it says something th…" (`ytc_UgxkJ2GU2…`)
- "I'm with ChatGPT on this one. ChatGPT tries to be practical, while Alex is talki…" (`ytc_UgxrvtjLB…`)
Comment
Eliezer Yudkowsky is directly responsible for a majority of the people running these companies being involved with AI at all, the discourse around AI being focused on trying to make "AGI," and for the acceleration-wing of AI companies (see Elon Musk) believing that they should try to make AGI as fast as possible before "the bad guys" do.
I don't think we should be giving this book and these people any more attention than they've already leeched.
youtube · AI Moral Status · 2025-10-31T00:5… · ♥ 23
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgwrLmycNPBt6-Jy9_t4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugy_B8qiu32XqdMW0bd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy2qgHGv3OEnQWYM6x4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxJq66Dr8wK6u1VAyt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugzsm1c33eOVut0mgC14AaABAg","responsibility":"government","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugx5USNcAfrt868_Awx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzcMawkbb6I8kgrcZN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzeO5pukZ_tjKUncdJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"},
{"id":"ytc_Ugw_AZ-7DzqLMgm6wWB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgwLlGY6aAQe6o9LOSl4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"regulate","emotion":"mixed"}
]
```
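The lookup-by-comment-ID view above can be reproduced offline: the raw response is a JSON array of coding objects, each carrying an `id` field, so indexing the batch into a dict gives direct access to any comment's coding. A minimal sketch, using two entries copied from the response above (variable names are illustrative):

```python
import json

# Two rows taken verbatim from the raw LLM response above.
raw_response = """[
{"id":"ytc_Ugy2qgHGv3OEnQWYM6x4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugw_AZ-7DzqLMgm6wWB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"approval"}
]"""

# Index the batch by comment ID for constant-time lookup.
codings = {row["id"]: row for row in json.loads(raw_response)}

coding = codings["ytc_Ugy2qgHGv3OEnQWYM6x4AaABAg"]
print(coding["responsibility"], coding["emotion"])  # developer outrage
```

The same dict can back a "look up by comment ID" search box: a missing ID raises `KeyError`, which maps naturally to a "not found" message.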