Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (one per line, with its comment ID):

- "I use AI to write parts of a book I'm writing...Still, if anyone told me they'd …" (ytc_UgwnQ8jGo…)
- "I used to think this but now I think they either know the planet is F&()# alread…" (ytc_UgzJgBVqN…)
- "I'm a mechanical maintenance fitter, so probably the least likely profession to …" (ytc_Ugz2yRfQH…)
- "Saying Musk has no moral compass and also letting us know the news channels he u…" (ytc_Ugw64MI23…)
- "If it wasn't for that human factor the world's super powers in somewhat recent h…" (rdc_dwwd83m)
- "We t f,#£@kd m8 3rd wave rvlt!! Ai wins &😢 8:18 will dominate mankind o viously…" (ytc_UgylCfHIf…)
- "AI sucks, it's always sucked, it is always going to suck. Art is a form of vis…" (ytc_UgykIryUz…)
- "i mean, fully automated vehicle exists, it is just a question of are they good e…" (ytr_Ugy5KlxOE…)
Comment
This is what GREEDY human billionaires, investors and bad PEOPLE do to create data centers like this for AI. It’s not AI itself and there are many different ways to create ECO friendly data centers that are better for the environment and can also help power the city it resides in. Donald Trump is allowing billionaires and investors that only care about money to create data centers like this for AI and bypass clean energy laws because he doesn’t think global warming exists. These billionaires have the money to create eco friendly data center options but they are just not choosing to do so. That’s a HUMAN flaw of power and greed. Not an AI flaw.
youtube · AI Moral Status · 2025-12-04T03:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | virtue |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgyQvKyP_d4y5wWvrIp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwSQxu_bPDR8_6LoWR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgzzFP_6sgdX6E5dA-Z4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_Ugxrj_GJr0vh9TAvwlJ4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
{"id":"ytc_Ugw8ck-6Jz_cZQA1r914AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgymPTTA-vMKiwnbriJ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwJg_e5veYOXQdKpK94AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwohcxUwbZQjgXqnTR4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_Ugx7DYEzoYlFwFBQFOp4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzIqPmvFJNb5F7ZMbl4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"regulate","emotion":"outrage"}
]
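The raw response above is a JSON array of per-comment code objects, one per comment ID. A minimal sketch of how such an output could be parsed and indexed for the comment-ID lookup this page offers (the variable names and the two sample entries are illustrative, not the dashboard's actual code — the entries reuse the shape and IDs shown in the array above):

```python
import json

# Illustrative raw model output: a JSON array of per-comment codes,
# in the same shape as the response shown above (two entries only).
raw_response = '''
[
  {"id": "ytc_UgzIqPmvFJNb5F7ZMbl4AaABAg", "responsibility": "distributed",
   "reasoning": "virtue", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgyQvKyP_d4y5wWvrIp4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"}
]
'''

# Index the codes by comment ID so a single comment's coding result
# can be looked up directly, as in the "Look up by comment ID" view.
codes_by_id = {row["id"]: row for row in json.loads(raw_response)}

result = codes_by_id["ytc_UgzIqPmvFJNb5F7ZMbl4AaABAg"]
print(result["responsibility"], result["emotion"])  # distributed outrage
```

A dict keyed by ID makes each lookup O(1), which is all the inspection view needs; any malformed or truncated model output would surface immediately as a `json.JSONDecodeError` at parse time.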