Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples:

- "Ai Interviews Flava Flav the moment he decides he needs a clock around his neck…" (ytr_Ugyu666C-…)
- "Might as well just create AI investors so we can finally kick-off this dystopian…" (ytc_Ugw0URCqi…)
- "Artificial Intelligence is a Trip!!! One day, Soon, AI will be running Everythin…" (ytc_Ugy6M51r6…)
- "What's wrong Sophia told the truth believe me 2050 you will Robot's cutting huma…" (ytc_Ugzkypmcd…)
- "Ok but imagine if we put a chip in monkey or a human to use their sentiency. Tha…" (ytc_UgymMTHkQ…)
- "Pay close attention to I-Robot. How many jobs did you see in that movie? I cou…" (ytc_UgydcLEeu…)
- "The problem is not the safety of AI itself, but that it has become a rabble of l…" (rdc_l5mqjlu)
- "@nayokaldou6251 I didn't mean it like that, Using anyone's likeness is wrong but…" (ytr_UgxQu9rMJ…)
Comment
I would be interested in Geoffrey Hinton's definition of understanding. He says that current AI LLM's can understand, but it seems to me this is a different type of understanding that applies to humans. This difference is critical. Is there anywhere he defines AI's version of understanding?
youtube · AI Governance · 2025-11-15T02:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgyaqfRMCukdrjrNU3J4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxDhWWOKeshbFskqXF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugx8RObbWalFC-aybed4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyN-adUiCBCJ_-YwAd4AaABAg","responsibility":"user","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugx0vsXOHtrmgRxsqV14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgztSPoJcPrDKLpLpGt4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzxpDrrjgrPnPH8AWB4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"fear"},
{"id":"ytc_UgxGcZYfPq7CZxW5se54AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgzVMIk9fs6d0C6ZCXJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgzoMyyuSPwnD_ZMTXd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
```
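The lookup-by-ID view above can be sketched in a few lines: parse the raw batch response as JSON, drop records whose labels fall outside the expected set, and index the rest by comment ID. The allowed label sets below are inferred only from the sample output shown here; the actual codebook is an assumption and may define additional values.

```python
import json

# Allowed labels per dimension, inferred from the sample response above
# (assumption: the real codebook may include more values).
ALLOWED = {
    "responsibility": {"none", "developer", "government", "user", "distributed"},
    "reasoning": {"unclear", "consequentialist", "deontological", "virtue", "mixed"},
    "policy": {"none", "regulate"},
    "emotion": {"indifference", "fear", "outrage", "approval", "mixed"},
}

def index_response(raw: str) -> dict:
    """Parse a raw batch response and index records by comment ID,
    skipping any record that carries an unknown label."""
    records = {}
    for rec in json.loads(raw):
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            records[rec["id"]] = rec
    return records

# Minimal usage example with one record from the response above.
raw = ('[{"id":"ytc_UgyaqfRMCukdrjrNU3J4AaABAg",'
       '"responsibility":"distributed","reasoning":"consequentialist",'
       '"policy":"regulate","emotion":"fear"}]')
coded = index_response(raw)
print(coded["ytc_UgyaqfRMCukdrjrNU3J4AaABAg"]["emotion"])  # → fear
```

Validating against a fixed label set before indexing is one way to surface malformed LLM output early rather than letting a misspelled label propagate into the coded dataset.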