Raw LLM Responses
Inspect the exact model output for any coded comment, either by looking it up by comment ID or by browsing the random samples below.

Random samples
- "@jablot5054 I asked AI this question and this is what it said. The concept of U…" — ytr_UgxLMJXoH…
- "So it'll end up being consciousness bearing bodies and unconscious super-intelli…" — ytc_UgxhJtX6j…
- "Even one political candidates campaign has used AI to create fake images of Trum…" — ytc_UgwOQjh1u…
- "Thanks goodness these two are not in charge, or we would be living in a Stone Ag…" — ytc_UgwycBKEh…
- "I can't help but wonder if Asimov's 3 rules would prevent any of the coming disa…" — ytc_UgzyUu5A2…
- "@Reverend_Veritas yeah, real nice work discouraging people, old man. You don't …" — ytr_Ugz61I77F…
- "Yup, took several college writing and communications courses but now when I writ…" — ytc_Ugxrbfaq_…
- "This isn't a problem we should fix about the goddamn ais. It's something we need…" — ytc_UgxxmYpXb…
Comment
> The default trajectory is highly dangerous. We the general public can do something against the unchecked AI race, like pressure our governements to enforce the big AI companies to regulate themselves, and put much more of their compute and R&D resources into AI safety.

| Field | Value |
|---|---|
| Source | youtube |
| Video | AI Governance |
| Posted | 2025-07-02T12:1… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytr_UgyCLBEJAZHeRl4oaYV4AaABAg.AK-wckndrYCAK3sdOPNRza","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytr_UgzMwA-2As4laPuRoVp4AaABAg.AJz3mGo7qjuAK598EIrC3o","responsibility":"distributed","reasoning":"contractualist","policy":"liability","emotion":"fear"},
{"id":"ytr_UgyENOH579RxGauK-Fd4AaABAg.AJxQi3GpjsJAJyNJyA82pX","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytr_UgxGcR7_R7Qvr8MeJN54AaABAg.AJxDAzLcnKoAK53ZbbSD8N","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytr_UgzQ4G3PwmHBfIFbOoV4AaABAg.AJwsdfgS75tAJwtX_TBG5C","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytr_UgwRzp1RMgsGaXXIHYR4AaABAg.AJwRrAGjn4UAK4EOuziUyG","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytr_UgxZT6LuDvkTW0mRrNt4AaABAg.AJwL-E2cvILAK4FBe6qO1x","responsibility":"distributed","reasoning":"consequentialist","policy":"industry_self","emotion":"fear"},
{"id":"ytr_Ugz8kXJ54R54LGC4pDl4AaABAg.AJw8sFdpsn3AK4L8pyijZP","responsibility":"distributed","reasoning":"consequentialist","policy":"industry_self","emotion":"fear"},
{"id":"ytr_UgxuKqS5SR4qTNHORi94AaABAg.AJuhcfLiaW9AJv0uOzmZHv","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytr_UgyCLi-a9qrVIhXsxQB4AaABAg.AJu8hia4-F-AJv5cYcGsPA","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"mixed"}
]
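The raw response above is a JSON array with one coding record per comment, so the ID lookup described at the top of this page reduces to parsing the array and indexing it by the `id` field. A minimal sketch (the field names come from the response above; `raw_response` here is a two-record excerpt, and the function name is illustrative):

```python
import json

# Raw model output: a JSON array of coding records, one per comment
# (a two-record excerpt from the response shown above).
raw_response = '''
[
 {"id":"ytr_UgwRzp1RMgsGaXXIHYR4AaABAg.AJwRrAGjn4UAK4EOuziUyG","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
 {"id":"ytr_UgyCLBEJAZHeRl4oaYV4AaABAg.AK-wckndrYCAK3sdOPNRza","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"}
]
'''

def index_by_id(raw: str) -> dict:
    """Parse the model output and key each coding record by its comment ID."""
    records = json.loads(raw)
    return {rec["id"]: rec for rec in records}

codings = index_by_id(raw_response)

# Look up the record for the comment shown in the Coding Result table above.
rec = codings["ytr_UgwRzp1RMgsGaXXIHYR4AaABAg.AJwRrAGjn4UAK4EOuziUyG"]
print(rec["responsibility"], rec["policy"], rec["emotion"])  # company regulate fear
```

The lookup record matches the Coding Result table for the selected comment (responsibility: company, policy: regulate, emotion: fear).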