Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
I recall a quote from a USA Republican senator called Lindsay Graham from about ten years ago: "If we nominate Donald Trump for President we will be destroyed. And we will have deserved it".
It's taking a while but I'd say they're on track.
Similarly, people close to the A.I. industry see the danger of developing it. It may well destroy them.
But the wealth and power that they will gain, at least temporarily, overrides their doubts.
Platform: youtube
Video: AI Moral Status
Timestamp: 2025-12-14T11:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
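The coding result above assigns one value per dimension. A minimal sketch of how such a record could be checked against its codebook — the allowed-value sets below are assumptions inferred from the values visible in this page's raw responses, not an authoritative codebook:

```python
# Hypothetical validation sketch. ALLOWED is assembled from values observed
# in the sample output on this page; the real codebook may differ.
ALLOWED = {
    "responsibility": {"company", "government", "developer", "user",
                       "ai_itself", "distributed", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue",
                  "mixed", "unclear"},
    "policy": {"regulate", "ban", "liability", "industry_self",
               "none", "unclear"},
    "emotion": {"fear", "outrage", "indifference", "resignation",
                "mixed", "unclear"},
}

def validate(record: dict) -> list[str]:
    """Return the dimension names whose values fall outside ALLOWED."""
    return [dim for dim, allowed in ALLOWED.items()
            if record.get(dim) not in allowed]

# The coded record shown in the table above:
coded = {"responsibility": "company", "reasoning": "consequentialist",
         "policy": "regulate", "emotion": "fear"}
print(validate(coded))  # → []
```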
Raw LLM Response
```json
[
{"id":"ytc_UgwrpdrDOfHaZBp8O6p4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugx4I35W9U7RlmY8YBN4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_UgxU0W6Da9Y0tgbHW954AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxL-nkc-afSp1B1xz14AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyuiQUgr1wmTJyO60Z4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwBinuRs4jPiEzII3N4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"industry_self","emotion":"outrage"},
{"id":"ytc_UgzoAp_puThclzl04S54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxPFFzi3NyoJnA5OVt4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgywbI-FUG1Bu3CjruF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwuYx5ksMvaHBA1niF4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"outrage"}
]
```
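Because the model returns one JSON array covering a whole batch, looking up the coding for a given comment means parsing that array and indexing it by the `id` field. A minimal sketch, using two records copied from the response above:

```python
import json

# Sketch of the lookup step: parse the raw model output (a JSON array of
# coded records) and build an id -> record index. The two records are taken
# verbatim from the raw response shown above.
raw = '''[
  {"id":"ytc_UgwrpdrDOfHaZBp8O6p4AaABAg","responsibility":"government",
   "reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugx4I35W9U7RlmY8YBN4AaABAg","responsibility":"ai_itself",
   "reasoning":"deontological","policy":"ban","emotion":"fear"}
]'''

by_id = {rec["id"]: rec for rec in json.loads(raw)}
print(by_id["ytc_UgwrpdrDOfHaZBp8O6p4AaABAg"]["policy"])  # → liability
```

In a real pipeline the `raw` string would come from the stored model response rather than a literal, but the parse-then-index step is the same.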