Raw LLM Responses

Inspect the exact model output for any coded comment, either by looking it up by comment ID or by browsing the random samples below.

Random samples
- "I would like to ask this question because the topic of AI interests me greatly. …" (ytc_UgypaiNF7…)
- ""I'm using it as a reference!" - Can you not use Google?? Why reference somethin…" (ytc_Ugxz-QRoJ…)
- "Ok did anyone pay attention to what the male robot said about taking over the wo…" (ytc_UgzCSciit…)
- "You don't "copy" an art style, you develop your own by taking inspiration. You a…" (ytr_UgzbO2ktq…)
- "@LC-mq8iq what happened to reading comprehension.. we are talking about pursuing…" (ytr_UgyGCnu8H…)
- "The top decision makers seem to disregard the extremely crucial detail of who's …" (ytc_Ugyqf1JFD…)
- "There is probably also a huge conflict of incentive in how AI is fundamentally d…" (ytc_UgzzVgANR…)
- "In the universe of stupid ideas, EV's are super Novas. Add self driving and its …" (ytc_UgzfaqVQa…)
Comment (youtube · AI Governance · 2025-06-25T12:2…)

you know what future of humanity will be with “AI taking over”? HEAL, all of our traumas, relationships, family issues, societal issues, war. We have work to focus on and it is a never ending work, and AI will help. That’s a good reality to create - a good thing to imagine instead of imagining fear scenarios.
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_Ugy1P_s64nxNuoQlO6N4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzvAt9XKA8-kcQCe1d4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgxZVd91N5xtdPErOz14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugz-U-9wKe-l4qHZQud4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgywDRPC6DBfiIdzho54AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwfBFqQe2sV-q1kva94AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugw8Ps-fTu_wUQm45Tl4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwtVXv97glMJNcRvWt4AaABAg","responsibility":"government","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzRU_E1nTltAUCqBz94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"resignation"},
{"id":"ytc_UgxBEu_-7h0G9GXjwY94AaABAg","responsibility":"developer","reasoning":"unclear","policy":"none","emotion":"mixed"}
]
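A downstream consumer would typically parse this raw response and sanity-check each record before storing the coded dimensions. A minimal sketch in Python, assuming the allowed category values are those visible in the sample output above (the full codebook may define more):

```python
import json

# Allowed values per coding dimension, inferred from the sample response above.
# Assumption: the real codebook may contain additional categories.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "developer", "company",
                       "government", "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none", "liability", "regulate", "industry_self", "unclear"},
    "emotion": {"fear", "approval", "indifference", "outrage",
                "resignation", "mixed", "unclear"},
}

def validate(raw: str) -> list[dict]:
    """Parse a raw LLM response and reject malformed or out-of-schema records."""
    records = json.loads(raw)  # raises ValueError on invalid JSON
    for rec in records:
        # Comment IDs in the samples start with ytc_ (comment) or ytr_ (reply).
        if not rec.get("id", "").startswith(("ytc_", "ytr_")):
            raise ValueError(f"unexpected comment id: {rec.get('id')!r}")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: {dim}={rec.get(dim)!r} not in codebook")
    return records
```

Validating at ingest time keeps hallucinated categories or truncated JSON from silently entering the coded dataset.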