Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "Thanks everyone for picking robot instead of human taxi/uber. I know its hard to…" (ytc_UgwydxO6E…)
- "Hey @diam2434name, thanks for imagining me taking on a metal robot! I'd definite…" (ytr_UgxVNuaY6…)
- "The same people who tell you today "Tax us more, we will leave" are desperately …" (ytc_UgwJO2C9c…)
- "Imagine having a long heartfelt conversation with the customer service guy becau…" (ytc_Ugw6KKTx6…)
- "Superintelligent A.I. is probably about as far away as Nuclear Fusion power gene…" (ytc_UgwsUyhSK…)
- "valid point, though i think that using ai for therapy is probably worse for your…" (ytr_Ugy2NHzyb…)
- "It would’ve been easy to say computers would’ve made white collar jobs less valu…" (ytr_Ugzut7rAu…)
- "i'm sure i'm not the only thinking this but the companies of trucks and technolo…" (ytc_UgzWzS7ar…)
Comment

> We can imagine us to take the safest choice possible, but that's naive in thinking everyone would be as careful. Strategic rivals will pursue AI, it's too powerful a tool, and when they steamroll over us, their tools may very well turn on them, and the rest of humanity. If AI is going to be inevitable, not only should we still try to develop it, but do so so it coexists with humanity, and protect us from other rival AI programs that are hostile to us.

youtube | AI Governance | 2026-03-17T13:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgyTUSSdUhdzAF8q2x14AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwWcShUik_tYiKezYx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugx_o4MSEkjGuk0ibdt4AaABAg","responsibility":"user","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxhOg0_bQuZQWMFU4V4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy7yUUpVXlQA2tDAiB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugxp57S1RJzim5t6qn14AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzaGNO1CuLH8rk6Xk94AaABAg","responsibility":"none","reasoning":"mixed","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugzzbz_Cwg-u5erHtUN4AaABAg","responsibility":"company","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyoUdZVdwWktehrEG94AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"fear"},
{"id":"ytc_UgwNrVTZN16E_utN4l94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
```
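The raw response above is a JSON array of per-comment codes, so looking up any coded comment by ID reduces to parsing the array and indexing it. The sketch below shows this with two rows copied from the response; it is a minimal illustration, not the tool's actual lookup code, and assumes the model returned valid JSON (in practice you would want error handling for malformed output).

```python
import json

# Two rows copied verbatim from the raw LLM response above
# (a minimal sketch; the real response contains ten rows).
raw_response = '''[
{"id":"ytc_UgyTUSSdUhdzAF8q2x14AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx_o4MSEkjGuk0ibdt4AaABAg","responsibility":"user","reasoning":"deontological","policy":"regulate","emotion":"fear"}
]'''

# Parse the array, then index the rows by comment ID for direct lookup.
codes = json.loads(raw_response)
by_id = {row["id"]: row for row in codes}

row = by_id["ytc_Ugx_o4MSEkjGuk0ibdt4AaABAg"]
print(row["policy"], row["emotion"])  # prints: regulate fear
```

Indexing by ID mirrors the "Look up by comment ID" behavior of the page: one pass over the parsed array gives constant-time access to any coded comment afterwards.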