Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "Nice "first attempt" but I would emplore you to learn how other developers are u…" (ytc_UgwpBRuUL…)
- "AI can’t have a motive… …means & opportunity alone are insufficient: no motive …" (ytc_UgwGX0hD_…)
- "Can’t slow it down. It’s here for good and we need to embrace and enjoy it.…" (ytc_Ugxd0Eo-3…)
- "We need laws that will regulate companies to only give technology Ai and robotic…" (ytc_UgxDods4o…)
- "These ‘experts’ have missed a key issue with AI: the value of the human interact…" (ytc_UgwCP7qyK…)
- "Human Artists are putting sweat heart and emotions into there arts but AI doesn’…" (ytc_UgxPwyHmP…)
- "The difference is you still have to buy the microwave and the food for the micro…" (ytc_UgybgRSRd…)
- "In Sweden, there is a shortage of several thousand truck and bus drivers, absolu…" (ytc_UgzfkFbjG…)
Comment

> I think while we are trying to make sure AI is aligned with humans, we need to make sure corporations are too. They (corporations) tend to operate as an entity or agent with a goal to make profit, cutting jobs that benefit people when it is possible to improve profit, but they don’t need to make those cuts. If we can’t make other humans care about human good, we really are doomed to fail with AI. (And I’m an AI optimist/enthusiast).

youtube · AI Governance · 2025-06-18T20:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | contractualist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyNDaMihHTfrWQYcyR4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_Ugx5qJa7IurRDJ8ujvl4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgxRmc_vVBa6mF1VkGJ4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgwRc0KScCrPt3IRz0B4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgyVdCzvTeDWY06KFYd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxfUN3U-h9wu80xxxx4AaABAg","responsibility":"user","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgzwiykLyKNOfc7YrF14AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgxWCVNFS5BOWbV8GQB4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgyfrAxtt3xx8IqX_IN4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugx8e8lgvSR7XZ32XEx4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"}
]
```
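The raw response above is a JSON array in which each element carries a comment ID plus one value per coding dimension, so a single comment's coding can be recovered by indexing the array by ID. A minimal Python sketch of that lookup (the field names match the response shown; the parsing code itself is illustrative, not part of the tool):

```python
import json

# Batch response from the coding model: a JSON array, one object per
# comment, each with an "id" plus the four coding dimensions.
# Shortened here to a single entry taken from the response above.
raw_response = """
[
  {"id": "ytc_UgyfrAxtt3xx8IqX_IN4AaABAg", "responsibility": "company",
   "reasoning": "contractualist", "policy": "regulate", "emotion": "fear"}
]
"""

# Index the codings by comment ID so any coded comment can be looked up.
codings = {row["id"]: row for row in json.loads(raw_response)}

coding = codings["ytc_UgyfrAxtt3xx8IqX_IN4AaABAg"]
print(coding["responsibility"])  # -> company
print(coding["emotion"])         # -> fear
```

Building the ID-keyed dict once makes every subsequent lookup O(1), which matters when cross-referencing many sampled comments against a large batch response.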