Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- "What if robots actually _want_ to have a machine do hard labor rather than havin…" (ytc_UgyGrQOGf…)
- "That's not the goal of this study. We're talking specifically about members of C…" (rdc_fvw6ymh)
- "I'm an artist that uses ai and have found it was really helpful actually getting…" (ytc_UgyFc-Z9D…)
- "And of course so many people are underemployed and barely scraping by already. T…" (ytc_UgxORgxyI…)
- "I'm a Data Scientist and I find this talk very well explained. Laymans will unde…" (ytc_UgxhQdcBD…)
- "At least OP got nearby 1000 upvotes... if someone wrote a few years ago that thi…" (rdc_n3lfeeg)
- "So essentially he's talking about sentence or at least mimicked sentence because…" (ytc_UgyND_4_X…)
- "This is absurd and one of bizarre news. Why use self driving technology while it…" (ytc_UgxbVYViI…)
Comment

> Legislatively, "...human judgment..." is the referenced authority. The Bible says: "There's nothing more desperately wicked than the human heart.". The human ❤ does not logically is not equipped to govern AI. Most human beings have depraved intentions; that added to mental illness exponentially magnifies the dangers to humanity, with specificity.💕💕💕💕💕💕💕💕💕

youtube · AI Governance · 2025-03-23T13:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response

```json
[
  {"id":"ytc_UgwNEVpazCRB9LjCGYZ4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugx_BMhSBu-Fnm2GIRJ4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgyreosASiU9oNdjOAN4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgxPDBd5FDi72bO9sMR4AaABAg","responsibility":"user","reasoning":"mixed","policy":"industry_self","emotion":"mixed"},
  {"id":"ytc_UgxkZXYtj9o5NkNdind4AaABAg","responsibility":"unclear","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgzZq-akoFf_ao41t794AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwfP69lhl210BRsMzB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxOBY_rjFi7pzo4FsF4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgwXxHNfd5C8jSAex5x4AaABAg","responsibility":"user","reasoning":"virtue","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugw8qqY5NFKDRuOoYnp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"}
]
```
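A response like the one above is only usable downstream if every row parses and every dimension carries a known label. The sketch below is a minimal validator, assuming the allowed values per dimension are exactly those observed on this page; the project's actual codebook may define more, and the `SCHEMA` mapping and `validate_batch` helper are hypothetical names, not part of the original tooling.

```python
import json

# Allowed labels per coding dimension, inferred from the values visible in
# this page's samples (an assumption -- the full codebook may list more).
SCHEMA = {
    "responsibility": {"government", "developer", "user", "ai_itself", "none", "unclear"},
    "reasoning": {"deontological", "consequentialist", "virtue", "contractualist", "mixed", "unclear"},
    "policy": {"regulate", "industry_self", "unclear"},
    "emotion": {"fear", "outrage", "approval", "resignation", "mixed"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse one raw LLM response and check every coded row against SCHEMA."""
    rows = json.loads(raw)
    for row in rows:
        if not row.get("id"):
            raise ValueError("row is missing a comment id")
        for dim, allowed in SCHEMA.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row['id']}: unexpected {dim} value {row.get(dim)!r}")
    return rows
```

Failing loudly on an unknown label (rather than silently keeping the row) makes it easy to spot when the model drifts from the codebook, e.g. by inventing a new emotion category.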