Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- “Universal income might be a very good idea. This is because AI is going to cause…” (`ytc_UgzUtXh20…`)
- “I’m using ChatGPT as a substitute for therapy/ bothering my family with my endle…” (`rdc_my67nk9`)
- “Maybe Christ will return as a super intelligent AI. Then the people who get chip…” (`ytc_UgzAH7-TB…`)
- “The only people taking the side of AI are huge greedy corporate overlords. These…” (`rdc_mpktg9z`)
- “AI can the 6 blocks that can be connected to the Ai chip both can work with one …” (`ytc_UgxsceFZO…`)
- “Im a person who have argued with 3 ai “image prompters” and they are more braind…” (`ytr_Ugymmd917…`)
- “This video is great, i really hope ai "art" is not something that will be taken …” (`ytc_UgyTnpwRH…`)
- “They’re only talking about what it takes away instead of what it improves. Are w…” (`ytc_UgzbS8u0r…`)
Comment
@katehamilton7240 Why did you stop.
And don't ask AI about the risk of AI, look up the works of human AI safety experts like Stuart Russell, Rob Miles and Geoffrey Hinton. They are all clear that AI the way we are developing it right now is an existential threat to our entire species. The researchers at AI 2027 such as Daniel Kokotajlo a former OpenAI developer says we are almost at the point of no return. Hence the AI 2027 name.
Geoffrey Hinton the "godfather of AI" and Turing and Nobel prize winner for his decades of contributions to AI R&D believes some of these AI models are already conscious. And has said they are rapidly becoming smarter than us and will learn how to kill us.
The AI revolution as now being developed will be for the benefit of AI not us.
youtube · AI Moral Status · 2025-08-17T01:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytr_UgymKUxTQvRsRPBLKOx4AaABAg.AItS8wjfmhOAJYVaapToHc","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytr_UgzUXHPpLN4hGUiQ-uZ4AaABAg.AIq67DJoPDVALYDZNhBgsq","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytr_UgzUXHPpLN4hGUiQ-uZ4AaABAg.AIq67DJoPDVALYFoM5mOw_","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytr_UgzUXHPpLN4hGUiQ-uZ4AaABAg.AIq67DJoPDVALtLFtxjDtM","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytr_UgzUXHPpLN4hGUiQ-uZ4AaABAg.AIq67DJoPDVALuUKifpNF1","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytr_UgxdXusJM5JguuH_hJB4AaABAg.AIoF6zbCq-9AKADjplvz4R","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytr_UgxdXusJM5JguuH_hJB4AaABAg.AIoF6zbCq-9AN-4ir-fbAr","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytr_UgxDGLuInGOzKP0w6lR4AaABAg.AImVtyMULi0AItNUGzJCkP","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytr_UgxDGLuInGOzKP0w6lR4AaABAg.AImVtyMULi0AIuATGXccAE","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytr_UgxDGLuInGOzKP0w6lR4AaABAg.AImVtyMULi0AIuuueyLQJg","responsibility":"company","reasoning":"mixed","policy":"none","emotion":"mixed"}
]
```
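The raw response above is a JSON array with one record per comment ID, coded along four dimensions (responsibility, reasoning, policy, emotion). A minimal sketch of how such a response could be parsed and validated is below; the allowed value sets are inferred from the values visible in this response and may not match the full codebook:

```python
import json

# Allowed values per coding dimension. These are only the values observed
# in the sample response above -- the real codebook may define more.
ALLOWED = {
    "responsibility": {"none", "company"},
    "reasoning": {"unclear", "consequentialist", "deontological", "mixed"},
    "policy": {"none", "regulate"},
    "emotion": {"indifference", "outrage", "fear", "mixed"},
}

def parse_coded_response(raw: str) -> dict:
    """Parse a raw LLM response into {comment_id: codes}, dropping rows
    that are missing an ID or use a value outside the allowed sets."""
    coded = {}
    for row in json.loads(raw):
        cid = row.get("id")
        codes = {dim: row.get(dim) for dim in ALLOWED}
        if cid and all(codes[d] in ALLOWED[d] for d in ALLOWED):
            coded[cid] = codes
    return coded

# Hypothetical single-record response for illustration.
raw = ('[{"id":"ytr_example","responsibility":"company",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]')
result = parse_coded_response(raw)
print(result["ytr_example"]["emotion"])  # fear
```

Validating against a fixed value set catches the common failure mode of the model inventing a label outside the codebook; invalid rows are dropped rather than silently stored.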