Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "She became disheartened in Silicon Valley upon seeing first hand that self inter…" (ytc_UgxXneDEH…)
- "I would not jump to human error so easily. And also doesn't seem to be an AI iss…" (ytc_UgzykjV9g…)
- "Ik this makes for good content but this requires gargantuan amounts of energy to…" (ytc_UgyX0vbIU…)
- "Automated refuelling for road vehicles had been possible for quite a long time. …" (ytr_UgzXHe3ew…)
- "Facial recognition doesn't look at someone and say yes/no. It'll look at your…" (rdc_eueatq9)
- "Doesn’t sound like the ai is racist or sexist actually, it just sound like peopl…" (ytc_UgzEqkr1s…)
- "The biggest threat to humanity from AI isn't a Terminator scenario, its the econ…" (ytc_UgyaS9qeb…)
- "there are companies that want to use AI, but in a smart way. They will not "tort…" (ytc_UgxcsJVMP…)
Comment
It mimics cognitive actions from human behavior, but lacks all "physiological" processes... our brain is not only "thoughts and emotions" it is an autonomous self aware (efficient) walking lab, intrinsically connected physically/historically and socially to the world - One brain is many times more complex than the whole web with all your AI together that, in addition, has gone through a millenary process of adaptation - what about all human brains together? humanity - the most complex system in the universe. We should be working about how in the heck we should come to popular agreement and start solving real problems in the world rather than wasting time, effort, energy and lives in smoke & mirrors.
youtube · AI Governance · 2025-07-09T19:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
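The dimension values above come from a fixed codebook. Below is a minimal validation sketch; the function name is hypothetical, and the allowed value sets are only those visible in this sample, not necessarily the full codebook:

```python
# Hypothetical validator for one coded record. The allowed values are only
# the ones observed in this sample; the real codebook may contain more.
CODEBOOK = {
    "responsibility": {"company", "developer", "government", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "contractualist", "mixed"},
    "policy": {"regulate", "ban", "liability", "industry_self", "unclear"},
    "emotion": {"fear", "outrage", "approval", "indifference", "mixed"},
}

def validate(record: dict) -> list[str]:
    """Return a list of problems; an empty list means the record passes."""
    problems = []
    for dim, allowed in CODEBOOK.items():
        value = record.get(dim)
        if value not in allowed:
            problems.append(f"{dim}: unexpected value {value!r}")
    return problems

# The coded record shown in the table above passes:
print(validate({"responsibility": "none", "reasoning": "mixed",
                "policy": "unclear", "emotion": "indifference"}))  # []
```

A check like this catches the common failure mode where the model invents an off-codebook label.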
Raw LLM Response
```json
[
{"id":"ytc_UgxU2bWA457z-QMpgvZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugx4zzloMA9x2hVbbfF4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugxq2hJdasqFtSf20254AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxYPDFeGPjEVi72wwJ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgxiHsIXSroI9pIBIoh4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgzNVCDhZlG2JmGJaiF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugx80ZqjtFaPu1rArIl4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"industry_self","emotion":"mixed"},
{"id":"ytc_Ugzt5AGoDN7iibkaHRp4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgzFiOCm4Ap4bo9dKSB4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugz_OlHWYtfvWH-48fN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"}
]
```
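The raw response is a JSON array of records keyed by comment ID, which is what makes the ID look-up above possible. A minimal parsing sketch, assuming the response text is available as a string (`raw` is a stand-in name, truncated here to two of the entries shown above):

```python
import json

# raw stands in for the model output above, shortened to two entries.
raw = '''[
 {"id":"ytc_UgxU2bWA457z-QMpgvZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
 {"id":"ytc_Ugxq2hJdasqFtSf20254AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"indifference"}
]'''

records = json.loads(raw)
# Index the records so a coding can be fetched by comment ID.
by_id = {r["id"]: r for r in records}

print(by_id["ytc_Ugxq2hJdasqFtSf20254AaABAg"]["emotion"])  # indifference
```

Building the `by_id` index once keeps every subsequent look-up O(1), rather than rescanning the array per query.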