Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- “There are already driverless buses and trains. Driverless vehicles and mass tran…” (ytr_Ugx8ACDk6…)
- “this guys is an asshole, i want AI to take everyones job, AI will help everyone …” (ytc_UgwrwFFnT…)
- “And let us not also ignore the fact of just how many workers turn to driving tru…” (ytc_UgwQOYXnp…)
- “This is one of the most well thought out and articulated videos I’ve seen about …” (ytc_UgxWFj6PD…)
- “Please stop using character ai, for your own good. I got morbidly addicted to th…” (ytc_UgyQA5WOq…)
- “Demand will remain the same, some art directors will simply choose to cut out ar…” (ytr_UgwXj1cUJ…)
- “Stop using China as the boogie man they would never allow AI to take away jobs..…” (ytc_UgzWBxjeI…)
- “I hate when AI companies use the word "safety" but what they really mean is "cen…” (ytc_UgxvTPF4a…)
Comment
“The most plausible explanation is that critical organizations are raising alarms about the frightening nature of AI and its potential growth without increased human oversight, in order to start regulating and censoring AI to align with their objectives.”
Source: youtube · AI Moral Status · 2025-12-15T16:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_Ugwg8Jrx6FSyQrCvIGd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxTT5et4N_s5sfN1kF4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyJU2zFlSEo8RWYZqR4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugx6q1_FYhuYEcrwCft4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwjEEvCbLkKXQfHCNV4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgwA2vygTdxuwzz1jpV4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
{"id":"ytc_UgwfPvtNUujCHAXxVM94AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxXjDmHUxFJAdFNF1x4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgykAdOrAKv2MRTnkXl4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgxGPrUBcUQ_OLhN5Ih4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"}
]
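A downstream consumer of these raw responses has to parse the JSON batch and discard any record the model coded outside the schema. A minimal sketch in Python; the field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) come from the output above, but the allowed value sets are inferred only from the records shown here and may be an incomplete subset of the actual codebook:

```python
import json

# Allowed values per coding dimension, inferred from the sample batch above.
# Assumption: the real codebook may define additional values not seen here.
ALLOWED = {
    "responsibility": {"company", "developer", "distributed", "ai_itself", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed"},
    "policy": {"regulate", "liability", "industry_self", "ban", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "resignation", "indifference"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse one raw LLM response and keep only fully in-schema records."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        # Skip anything that is not a dict or lacks a comment ID.
        if not isinstance(rec, dict) or "id" not in rec:
            continue
        # Keep the record only if every dimension holds an allowed value.
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(rec)
    return valid

raw = '[{"id":"ytc_x","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]'
print(len(validate_batch(raw)))  # 1 valid record
```

Dropping off-schema records (rather than coercing them to `unclear`) keeps the coded dataset conservative; the rejected IDs can be re-queued for another coding pass.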