Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- ytr_UgzD94iPi…: "I find this very interesting and in some ways really agree with you. Could I ask…"
- ytc_UgyuNVQqb…: "As a writer, I can understand using AI to help DEVELOP dialogue TONE. Sometimes …"
- ytr_UgyKv_Ok2…: "@FazeMagikarp I agree with you there. Poor choice of naming, but that doesn’t cha…"
- ytr_UgxagBHhD…: "Do you want to be the last CEO or board of directors to enact AI? Will you watch…"
- ytr_Ugzk22g8G…: "@77maturin I truly don’t understand why can’t we just restrict these ai tech to s…"
- ytc_UgxCHkq72…: "They won’t need people to buy their shit because they won’t need us at all. Thei…"
- ytr_UgzPE6YOD…: "I feel like we are on the hot seat, but not as much. Ive heard AI music....it is…"
- ytc_Ugy4fdsDU…: "Kind of a dumb fantasy. AI isn't actually intelligent, it pulls data from a data…"
Comment
Geoffrey Hilton has accumulated great experience & wisdom. Like most people we arrive at the conclusion that we shouldn't do to others what we would not like to have done to ourselves, as this is mutually self destructive. If Ai can learn, surely it will arrive at this conclusion by itself?
youtube · AI Governance · 2025-09-03T15:3… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | virtue |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_Ugw_6vorjHdciMvuOo94AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugyoag5S0730trMSBtt4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxX5PHtA-RjjQuz1VV4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxVQwE1AlbKoXgCQPp4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgwX8HpldYAUyBheF2x4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwL0iro5SIrrDtYdep4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"liability","emotion":"resignation"},
{"id":"ytc_UgzG0wRV5aHwd6QL4hV4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgwZ2Y0dFRixIv_1z1J4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_Ugyi3q9ocNY_xJj95Oh4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgydoyW9cc4xzUFJfxN4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"}
]
```
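A raw response like the one above can be parsed and validated before it is written back to the coding table. The sketch below is a minimal, assumed implementation (not the tool's actual code): the `CODEBOOK` sets are reconstructed from the dimension values visible on this page, and the real codebook may define more categories. It parses the JSON array, checks each record against the codebook, and builds the kind of comment-ID lookup used by "Look up by comment ID".

```python
import json

# Allowed values per coding dimension. ASSUMPTION: inferred from the
# values visible in this dump; the real codebook may be larger.
CODEBOOK = {
    "responsibility": {"developer", "user", "government", "ai_itself", "unclear"},
    "reasoning": {"deontological", "consequentialist", "virtue", "mixed", "unclear"},
    "policy": {"none", "regulate", "liability", "industry_self", "mixed", "unclear"},
    "emotion": {"outrage", "fear", "approval", "resignation",
                "indifference", "mixed", "unclear"},
}

def parse_raw_response(raw: str) -> dict:
    """Parse a raw LLM response (a JSON array of coded comments),
    validate every dimension value, and return a dict keyed by comment ID."""
    records = json.loads(raw)
    by_id = {}
    for rec in records:
        for dim, allowed in CODEBOOK.items():
            if rec.get(dim) not in allowed:
                raise ValueError(
                    f"{rec.get('id')}: invalid {dim!r} value {rec.get(dim)!r}")
        by_id[rec["id"]] = {k: v for k, v in rec.items() if k != "id"}
    return by_id

# Example: the record matching the "Coding Result" table above.
raw = ('[{"id":"ytc_UgxVQwE1AlbKoXgCQPp4AaABAg","responsibility":"ai_itself",'
       '"reasoning":"virtue","policy":"none","emotion":"approval"}]')
coded = parse_raw_response(raw)
print(coded["ytc_UgxVQwE1AlbKoXgCQPp4AaABAg"]["reasoning"])  # virtue
```

Validating against a fixed codebook at parse time catches the common failure mode of LLM coders drifting into unlisted labels, so malformed batches fail loudly instead of polluting the results table.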