Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "No To AI survalinace is for prison not citizens No to Hell on earth the Devil …" (ytc_UgwEoWEv0…)
- "AI data centers are bringing hundreds of billions of dollars more in capital inv…" (ytc_UgxnU1vlJ…)
- "If the output only derives from the input, the output can no longer be comparabl…" (ytr_UgxMKaU-H…)
- "Boring people: *uses AI to generate their art or images and amalgamated animatio…" (ytc_UgzHT7Zkj…)
- "To some degree, we know who is developing AI, even if there is not anything clo…" (rdc_je4mn99)
- "AI is going to undoubtedly cure some of the most incurable medical conditions hu…" (ytc_UgzNsrbGr…)
- ">they would have had over 2 billion in profit if they ~~were not paying fines…" (rdc_czlkukv)
- "Your videos are always great. Sometimes scary af, but always educational and in…" (ytc_UgyK0_bZ8…)
Comment
i agree and been thinking on global level on aboout alot of things for years now even like before RTX by Nvidia i thought why dont we train robo's on GPU environments in 2006 - when i was 6 - with most the things this man have said but not with not creating super - intelligent but i get his point Ai is not dangerous itself but if its in bad hands and carelessly developed can make an extinction level event possible. ( we always wanted God to be here , When we about to build God haha now we don't want it ) its the brain's Survival instinct in a way too i think.
youtube
AI Governance
2025-09-04T15:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwqyyAg7hojpOE7T8d4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugzg0RDKcchO_qMxV2R4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwBpAB98ziVm1uyIlN4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugw00YseQVfeqWsUTGx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxPxmE0ZnoUELSWH9d4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyYPeNjTZu-47jbSKh4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ugz1AklAzMlhkPyHyyl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzWiOehWJ6SRpM1zyF4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxMgYq_dMxtoA2h4Sp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwRkNu0wWZh0sRsS_R4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"mixed"}
]
```
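The raw response is a JSON array of per-comment coding records, one object per comment, keyed by the same IDs used for lookup above. A minimal sketch of how such a response could be parsed and indexed by comment ID (the helper name is illustrative; the field names and sample records are taken from the output shown above):

```python
import json

# Example model output: a JSON array of coding records.
# These two records are copied from the raw response above.
raw_response = """
[
  {"id": "ytc_UgxPxmE0ZnoUELSWH9d4AaABAg", "responsibility": "distributed",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgwRkNu0wWZh0sRsS_R4AaABAg", "responsibility": "user",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "mixed"}
]
"""

def index_by_id(response_text: str) -> dict[str, dict]:
    """Parse the model output and index the coding records by comment ID."""
    records = json.loads(response_text)
    return {record["id"]: record for record in records}

codes = index_by_id(raw_response)
record = codes["ytc_UgxPxmE0ZnoUELSWH9d4AaABAg"]
print(record["policy"])   # regulate
print(record["emotion"])  # fear
```

This is the lookup the "Look up by comment ID" control performs: one parse of the stored response, then a dictionary fetch per ID.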