Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "The problem is people build crappy one shot apps and sell it to get fast money. …" — ytc_UgxHWErfl…
- "I'm calling BS. There's other videos of her with the little round blue ring on t…" — ytc_Ugy0RfXBZ…
- "Im proud of you talking about this problem this calm. For me this current AI us…" — ytc_UgwWrG7t2…
- "remember when AI was just a funny thing to make low quality short videos? ahhhh …" — ytc_UgxfioCEo…
- "most people behave as machines anyway.. just one more robot player in the robot …" — ytc_Ugyv6-6Jy…
- "seems like sm just letting it happen they must have people in sm that perv for I…" — ytr_UgwT2FH3v…
- "i know where you are comming from, but when a diffusion model gets not enough tr…" — ytc_UgyK3IpWz…
- "@ronaldov09 True, it's already replaced some jobs no doubt. But there are a lot …" — ytr_UgxvIxQke…
Comment
Whether it's feeding training data, and that data can potentially be strangled out of the model or not, ChatGPT is by far the best therapist I've ever seen, and I've been around a lot of them. 100% the chance that any of my mental health information being in chatGPT's training data comes back to haunt me, it will have been WORTH IT. This video kinda like telling someone who was born deaf not to get a cochlear implant because it can cause infections or something. Like yeah maybe it can, but the potential benefit is way more worth it than the potential consequences. And if general data privacy and having your data fuel the big bad microsoft is the concern, let it be known that I have switched to Linux to get as far away from microsoft as I can, and all the same I still say that it's way beyond worth it. Where the benefits outweigh the means.
youtube
AI Moral Status
2025-06-19T08:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_Ugxs8bUedSenCG0a1E14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzlQjCTQCM4H2Q_6OR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugzec38gpHqt25G5qNp4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgwuM47aWyVuyR3iqO54AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugy8S6M649kXDGI-_4l4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz516FfGwJ4Chj3MBR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyRnCzzAqrrCpBUunJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugz8gFvR0n3449ZLw_94AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzjX9J736jjK1lNOwp4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzM4J-6CMVOZ7VvnYB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"}
]
```
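A raw response in this shape can be parsed and sanity-checked before the codes are stored. The sketch below is a minimal, hypothetical validator: the allowed values per dimension are inferred only from the examples shown on this page (the full codebook may contain more categories), and the `ytc_`/`ytr_` ID prefixes are assumed to mark comments and replies respectively.

```python
import json

# Allowed values per coding dimension, inferred from the responses shown
# above. Assumption: the real codebook may define additional categories.
CODEBOOK = {
    "responsibility": {"none", "company", "user", "government", "ai_itself"},
    "reasoning": {"unclear", "consequentialist", "deontological", "virtue"},
    "policy": {"none", "regulate", "industry_self"},
    "emotion": {"approval", "indifference", "fear", "outrage", "mixed"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response, keeping only well-formed rows."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        # Assumption: comment IDs start with "ytc_", reply IDs with "ytr_".
        if not row.get("id", "").startswith(("ytc_", "ytr_")):
            continue
        # Drop rows where any dimension holds a value outside the codebook.
        if all(row.get(dim) in allowed for dim, allowed in CODEBOOK.items()):
            valid.append(row)
    return valid
```

For example, feeding it the first row of the response above returns that row unchanged, while a row with an unknown value such as `"emotion": "joyful"` would be silently dropped; a stricter variant could raise instead, so malformed model output is surfaced rather than discarded.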