Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- We’re already controlled by robots you are literally holding a small brick robot… (ytc_Ugy-aXAmb…)
- AI "art" at least tends to be technically competent. Too many artists seem conte… (ytc_UgzGmO5lV…)
- Idk would we need to be right everytime with alignment? If we have an army of wh… (ytc_UgzM8b_np…)
- @Jackson_Zheng that's pretty cool. I guess the best part is the lack of bugs. Sa… (ytr_UgzjxAymY…)
- Consciousness and mental illness is not the same thing. Besides research into co… (ytc_UgzWvRulA…)
- The tyranny of the algorithm trained on biased datsets / And worse is when it's al… (ytc_Ugyo8yhqO…)
- I wonder if she has a headache every night when her robot husband wants to get s… (ytc_UgyJjOfCv…)
- No, it's unlikely that any AI, including a robot AI, would "destroy the world" w… (ytc_Ugw4q-JNl…)
Comment
Companies already can know and have access to most of your intimate stuff. This video is SO oversimplified, as AI's training data isn't as simple as "citing your own information" as that's but a fraction of its knowledge. Even then, you can simply disable the information used for training in the options in two clicks, or simply create a temporary chat.
Trying to demoralize good AI usage is one of the stupidest things ever, especially for a so called "tech" channel.
| Field | Value |
|---|---|
| Source | youtube |
| Video | AI Moral Status |
| Posted | 2025-07-02T18:0… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | industry_self |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwrSpaYcqvAKnuiDp94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyLaHIZ6MDLPBxrc1p4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxCXkkgpeLHC4x0PKZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwmA1eEpWMxC1FLBAR4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugx9tT1vXGWYg2BvkiV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugwg1pkm4mH-KjYTLbt4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgwH9JFnemOC1aYUh0B4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzhPP1cW1hvSFs5xo14AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
  {"id":"ytc_UgwfBcu3cuECQVE44s94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgyJwZpVVEMKqsJqQZ54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"outrage"}
]
```
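The raw response is a JSON array with one record per comment, keyed by comment ID, with one value per coding dimension. A minimal sketch of how such a response might be parsed, validated, and indexed for lookup by comment ID (the allowed values below are inferred only from the codes visible in this output, not from a full codebook, so treat them as assumptions):

```python
import json

# Assumed value sets, inferred from the codes visible in this response;
# the actual codebook may define additional categories.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "company", "unclear"},
    "reasoning": {"unclear", "consequentialist", "deontological", "mixed"},
    "policy": {"none", "unclear", "regulate", "industry_self"},
    "emotion": {"indifference", "fear", "mixed", "approval",
                "outrage", "unclear", "resignation"},
}

def index_codes(raw: str) -> dict[str, dict]:
    """Parse a raw LLM response and index valid records by comment ID.

    Records with a missing ID or a value outside the assumed
    category sets are silently dropped.
    """
    by_id = {}
    for rec in json.loads(raw):
        if "id" not in rec:
            continue
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            by_id[rec["id"]] = rec
    return by_id

# Hypothetical example records, not taken from the response above.
raw = (
    '[{"id":"ytc_example1","responsibility":"company",'
    '"reasoning":"consequentialist","policy":"industry_self",'
    '"emotion":"outrage"},'
    '{"id":"ytc_example2","responsibility":"bogus",'
    '"reasoning":"mixed","policy":"none","emotion":"fear"}]'
)
codes = index_codes(raw)
print(codes["ytc_example1"]["emotion"])  # → outrage
```

The second record is rejected because `"bogus"` is not an assumed responsibility code, which mirrors the kind of lookup-and-validate step a dashboard like this would need before rendering a "Coding Result" table.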