Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
I suspect all these predictions are wrong. I agree AI will grow more powerful an…
ytc_UgzyHU4KX…
EXcactly- There is only 28 facial types and now insted of just looking guilty y…
ytc_UgxDaljaV…
Ugh. As a human being who understands how chatbots actually work this is such an…
ytc_UgxNEt8XF…
12:28 plague inc. Also supports this. If a plague was contagious enough and got…
ytc_Ugw6vVK9g…
Good thing we use reel to reel magnetic tape inside nuke silos so AI can’t Launc…
ytc_UgzH2BJ74…
Ingpore ate starting to layoffs some workers in a companies because it is ran by…
ytc_UgzvDND9-…
I'm just glad I don't live somewhere that has something known as a "plague seaso…
rdc_dpc37sg
Not to be harsh, but this kid had some real problems before this AI character "f…
ytc_UgyPqeUM6…
Comment
I do think that there is a real possibility that AGI and super-intelligence can be created, and if that scenario does come about it will most definitely harm us ignorantly in some way. However, I think that there is a much greater possibility that AI development will accelerate climate change so badly that we will have to drop it as a project before it gets to that point. These things suck up so much power and water it is not even funny. There is no way we can continue to make them smarter with the amount of resources we have left.
youtube
AI Moral Status
2025-11-30T01:3…
♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgwWOIiuRAn2sFnACu54AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgwEJQgWqnJtBI5LLrp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgzaJfH6TyV6NmWFXLl4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwcwCiIPqeKIQv97Ix4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwuKRFAy0cKH_Ms3OF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgwFM2I10K8wAmCsj5d4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwlQ3CAxP5M__IS2jZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
{"id":"ytc_Ugw7dgzNeFZzx7aQbKN4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_UgxXIeVuNerDGaz9HCt4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"industry_self","emotion":"approval"},
{"id":"ytc_Ugzz2tdw_SDD1OOq_vh4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
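The raw response above is a JSON array of per-comment codes, one object per comment ID. A minimal sketch of validating such a response and indexing it by comment ID, assuming the allowed values for each dimension are exactly the labels that appear in this log (the real codebook may include more):

```python
import json

# Allowed values per coding dimension, inferred from the labels
# visible in this log; this is an assumption, not the full codebook.
DIMENSIONS = {
    "responsibility": {"none", "distributed", "company", "developer"},
    "reasoning": {"unclear", "consequentialist", "deontological"},
    "policy": {"unclear", "regulate", "liability", "industry_self"},
    "emotion": {"resignation", "outrage", "fear", "indifference", "approval"},
}

def index_codes(raw: str) -> dict:
    """Parse a raw LLM response and return {comment_id: codes},
    raising on malformed or out-of-vocabulary entries."""
    records = json.loads(raw)
    coded = {}
    for rec in records:
        cid = rec["id"]
        for dim, allowed in DIMENSIONS.items():
            if rec[dim] not in allowed:
                raise ValueError(f"{cid}: bad {dim} value {rec[dim]!r}")
        coded[cid] = {dim: rec[dim] for dim in DIMENSIONS}
    return coded
```

Looking up `index_codes(raw)["ytc_UgzaJfH6TyV6NmWFXLl4AaABAg"]` would then return the same codes shown in the result table above (distributed / consequentialist / regulate / fear).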