Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "You are so right! I am so glad finally an impactful politician said something ab…" (ytc_UgxRjmx2k…)
- "I don’t know 🤷 I think real people like they made robot stuff on their head beca…" (ytc_UgwpeJ2jM…)
- "I'd imagine that most of the poorest fifth have a negative net worth, so I'd be …" (rdc_d7kv37r)
- "When AI doing all our jobs, who will pay us? Sure, you'll have nothing but free …" (ytc_UgxKdQU6f…)
- "I use AI art and intend to expand my use and generation of it. I can't afford…" (ytc_UgwsQOCIa…)
- "If the goal is to eliminate jobs, who will pay for the AI in the end ?…" (ytc_UgyHCoJPv…)
- "People who don’t talk to AI models politely are the same people who abandon trol…" (ytc_UgyELGINJ…)
- "Watermarks on ai generated content. Remember watching a show that would show som…" (rdc_lq71zbo)
Comment
41:41 These AI chat bots are going to be staying. I wonder if there could be a way to break immersion with every single message. Some kind of small text under every message they send to remind its consumers that they are in fact using a product, not ingaging with a conscience being. The AI companies are already trying to put in place their own regulations so that we their consumers, 1. Can't make our own AIs and 2. Can't tell them what they can and can't do with theirs.
The bad absolutely out weighs the good but until its the other way around WE need to be thinking of the guard rails. Especially because even when they put their own up they just take wm right back down when their user base gets pissy, even if it costs lives.
Source: youtube · Video: AI Moral Status · Posted: 2025-12-21T22:3… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgxThS4ajTzdmbPhgd54AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzwLfNfzsKT_cI5DrN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwaItbAkUtzbepUd554AaABAg","responsibility":"none","reasoning":"unclear","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugwq5g8rcvOi4hrbOXJ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgzCNCt-ksMFts7oPRR4AaABAg","responsibility":"user","reasoning":"deontological","policy":"industry_self","emotion":"mixed"},
{"id":"ytc_Ugwjc46jO8ndMCXFn9d4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwHz2BD-bcTNtTpKr14AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugyv21qBbsdzbbhpW6d4AaABAg","responsibility":"company","reasoning":"mixed","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgwNQQSE9i7l7JrZeKp4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzAw-O5aJKft83lsAV4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"mixed"}
]
```
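The flow above (batch JSON response from the model, then a per-comment coding result keyed by ID) could be sketched as follows. This is a minimal sketch, not the tool's actual implementation: the dimension names come from the table above, but the sets of allowed category values are inferred from the sample output and the real codebook may define more.

```python
import json

# Allowed values per dimension, inferred from the sample response above
# (assumption: the actual codebook may include additional categories).
SCHEMA = {
    "responsibility": {"company", "developer", "user", "ai_itself", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"regulate", "ban", "liability", "industry_self", "unclear"},
    "emotion": {"fear", "outrage", "indifference", "mixed", "unclear"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse a raw batch response into {comment_id: {dimension: value}},
    rejecting rows with missing IDs or out-of-schema values."""
    coded = {}
    for row in json.loads(raw):
        cid = row.get("id")
        if not cid:
            continue  # skip rows the model emitted without an ID
        dims = {}
        for dim, allowed in SCHEMA.items():
            value = row.get(dim)
            if value not in allowed:
                raise ValueError(f"{cid}: unexpected {dim} value {value!r}")
            dims[dim] = value
        coded[cid] = dims
    return coded

raw = ('[{"id":"ytc_UgzwLfNfzsKT_cI5DrN4AaABAg",'
       '"responsibility":"company","reasoning":"consequentialist",'
       '"policy":"regulate","emotion":"fear"}]')
result = parse_coding_response(raw)
print(result["ytc_UgzwLfNfzsKT_cI5DrN4AaABAg"]["policy"])  # regulate
```

Validating every row against the schema at parse time means a malformed or hallucinated category surfaces immediately, rather than silently entering the coded dataset behind the look-up-by-ID view.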