Raw LLM Responses

Inspect the exact model output behind any coded comment. Look up a comment by its ID, or start from one of the random samples below.

Random samples
| ID | Comment (truncated) |
|---|---|
| rdc_kjks68q | > the unskilled can get reasonable facsimiles with little effort. This is th… |
| ytc_Ugyemc_gx… | Im so sick of people who work on AI warning us of the dangers... it's your fault… |
| ytc_Ugw_6dFPZ… | I only ask this as a normal question, what if it was the other way around? would… |
| ytc_Ugx1EnMGy… | Europe sorry to say is crazy, not realistic, they regulate what they won't have … |
| ytc_UgxEr_S91… | We should not invent robot because in future it really destroys the world please… |
| ytc_Ugx8VnJfU… | Terminator, Eagle Eye, Avengers: Age of Ultron. All hypothetical examples of the… |
| ytr_Ugwmf4nCu… | Ok I'm going to quote an actual AI artist on this as to why he makes ai meme mus… |
| ytc_UgwJOUn7p… | So AI has been in development for decades by the smartest people on the planet, … |
Comment

> @UkrainskiyPatriot And when it doesn't destroy humanity in 2027, it'll destroy humanity in 2028. Generally, as a rule of thumb, AI will always destroy humanity 2 years into the future.

Source: youtube · Topic: AI Governance · Posted: 2025-08-02T06:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
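
The coding result assigns the comment one value per dimension. As a minimal sketch of how such a record might be represented and validated, here is a Python dataclass whose category sets are inferred only from the values visible on this page; the actual codebook may define additional values.

```python
from dataclasses import dataclass
from datetime import datetime

# Category sets inferred from the values visible on this page;
# the full codebook may define more.
RESPONSIBILITY = {"ai_itself", "developer", "none"}
REASONING = {"consequentialist", "unclear"}
POLICY = {"regulate", "none"}
EMOTION = {"fear", "approval", "indifference", "resignation", "mixed"}

@dataclass
class CodingResult:
    comment_id: str
    responsibility: str
    reasoning: str
    policy: str
    emotion: str
    coded_at: datetime

    def __post_init__(self):
        # Reject any value outside the known category sets.
        for value, allowed in [
            (self.responsibility, RESPONSIBILITY),
            (self.reasoning, REASONING),
            (self.policy, POLICY),
            (self.emotion, EMOTION),
        ]:
            if value not in allowed:
                raise ValueError(f"unknown category value: {value!r}")
```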
Raw LLM Response

```json
[
  {"id":"ytr_UgwVzDm8ap5vWffKrVx4AaABAg.ALJD6cTfH-bALJT9f7-4qD","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytr_Ugx2SQ-wm0GB_upZIgt4AaABAg.ALJCevWQQdEALJIKrny7mw","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytr_Ugx2SQ-wm0GB_upZIgt4AaABAg.ALJCevWQQdEALJLDWEk_st","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytr_Ugzlrri0Ub65ZsPmwj54AaABAg.ALJC2lK04NaALKhNwPvewU","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytr_UgzA3n5NASrflWNTtXB4AaABAg.ALJ7cLpOsYOALJeKT3MPbC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytr_UgzA3n5NASrflWNTtXB4AaABAg.ALJ7cLpOsYOAT9K-oV6QO8","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytr_Ugwh67naOc2ygDRVjwl4AaABAg.ALJ5p-v8fXaALJNnvE_5Mi","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytr_Ugwh67naOc2ygDRVjwl4AaABAg.ALJ5p-v8fXaALJecxEVlSA","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
  {"id":"ytr_Ugwh67naOc2ygDRVjwl4AaABAg.ALJ5p-v8fXaALKM0N___Je","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytr_Ugyu7xRe6JIIrB60WEJ4AaABAg.ALJ4gsXX1P8ALJR12yr5wC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"}
]
```
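
The model codes comments in batches and returns one JSON array per batch, so matching an output back to a specific comment is a parse-and-index step. A minimal sketch, assuming the raw response text is valid JSON in the shape shown above (the `index_batch` helper and the `raw_response` variable are hypothetical, not part of the tool):

```python
import json

def index_batch(raw_response: str) -> dict[str, dict]:
    """Parse one raw LLM batch response and index the codings by comment ID."""
    records = json.loads(raw_response)
    return {record["id"]: record for record in records}

# Hypothetical usage, with raw_response holding the JSON array shown above:
# codings = index_batch(raw_response)
# codings["ytr_UgwVzDm8ap5vWffKrVx4AaABAg.ALJD6cTfH-bALJT9f7-4qD"]["emotion"]
# -> "fear"
```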