Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

| Comment preview | ID |
|---|---|
| hahaha chat gbt is all over the place, lets not focus on gates and the likes tha… | ytc_Ugy6Lzwly… |
| If a museum displays artwork for the public to view it, does that mean they give… | ytr_Ugw-FgPni… |
| Most Managers are not smarter than their smartest employee, but they manage and … | ytc_UgzSJjUnD… |
| All bigger technologies had always big impacts. Language was the first, writing … | ytc_UgyguD1N0… |
| Like traeng said, we need governments that we can trust and the US definitely do… | ytr_UgySalUbX… |
| I like to say that a key difference between generative AI models and human learn… | ytr_UgzlNbA13… |
| Like to generate AI images. Like to prompting. And the same time, totally agree … | ytc_UgwN-Dx_t… |
| The elite are duplicitous and are fine with AI because it won't affect them, or … | ytc_UgxHQa2VZ… |
Comment

> So this is it 😱?? We're going to die from a particularly virulrent AI generated disaster. I'm not convinced that designing our own demise wether it's AI, Nuclear Weapons, Destroyed Habitat or some other man (or woman) made tinvention is really that smart. But balls to that,, Carry on Regardless 👍

Source: youtube · AI Governance · 2025-12-31T16:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response

```json
[
{"id":"ytc_Ugym9ohMefdr3NBkIq94AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugx3xTJgDfXCx269mmN4AaABAg","responsibility":"company","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxslI1nvO3Q7ZU4evV4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugz9GZi5gcKUUY7kvuZ4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_Ugxmo4ZqXsxDZL-vn414AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugykimhl874RMcy6aOp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxZBia6ojYYKc9_Tmt4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgxztAjw1G5UxnteRlV4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxNLh7ASyRH7ywQtXR4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugxxai-ozeJcpe4dLMh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
```
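Raw responses like the one above can be parsed and sanity-checked before the codes are stored. The sketch below is a minimal Python validator; the allowed value sets in `CODE_FRAME` are an assumption inferred only from the labels visible on this page (the real code frame may include additional categories), and the `ytc_`/`ytr_` ID prefixes are taken from the samples shown.

```python
import json

# Assumed code frame, inferred from the values visible in this page's
# samples. The actual frame may contain more labels per dimension.
CODE_FRAME = {
    "responsibility": {"distributed", "company", "developer",
                       "government", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological",
                  "contractualist", "virtue", "mixed"},
    "policy": {"none", "unclear", "regulate", "liability"},
    "emotion": {"fear", "indifference", "outrage", "approval",
                "mixed", "resignation"},
}

def validate_response(raw: str) -> list[dict]:
    """Parse a raw LLM response and check each record against the frame."""
    records = json.loads(raw)
    for rec in records:
        # IDs in the samples start with ytc_ (comment) or ytr_ (reply).
        if not rec.get("id", "").startswith(("ytc_", "ytr_")):
            raise ValueError(f"unexpected id prefix: {rec.get('id')!r}")
        for dim, allowed in CODE_FRAME.items():
            if rec.get(dim) not in allowed:
                raise ValueError(
                    f"{rec['id']}: bad {dim} value {rec.get(dim)!r}")
    return records

# Example: validate the first record from the response above.
raw = ('[{"id":"ytc_Ugym9ohMefdr3NBkIq94AaABAg",'
       '"responsibility":"distributed","reasoning":"consequentialist",'
       '"policy":"none","emotion":"fear"}]')
records = validate_response(raw)
print(len(records))  # 1
```

A record whose value falls outside the frame (e.g. a hallucinated label) raises immediately with the offending comment ID, which is useful when re-prompting the model for just the failed items.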