Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "3:29 no one talks enough about the damage this is currently doing, for example: …" (ytc_UgwDXVZpr…)
- "Maybe a robot can be an artist, but its manufacturer will still not be an artist…" (ytc_Ugy2O0M_o…)
- "Its precisely because I understand how AI works that I am not afraid of AI takin…" (ytc_UgyoScBtz…)
- "Mmm… we'll have to agree to disagree. The problem will always be a human problem…" (rdc_kvy3yg5)
- "If humans get to stand around twiddling their thumbs (before evolution weeds out…" (ytc_UgzaCYgjs…)
- "What is particularly disturbing about this interview with Sam Altman -- who coul…" (ytr_UgzoNOaDr…)
- "Being activly using AI within enterprise software integration, I can say there i…" (ytc_UgwIaFVa7…)
- "So many of them say, " I consume a product and I don't care who made it," but wh…" (ytc_Ugxt1geTA…)
Comment
Geoffrey hinton : '... Google have not been irresponsible actually the opposite... '
Geoffrey hinton : '.... 99% money going into AI development and 1% into whether it might be dangerous. It should be more like 50/50 i believe ...'
Yours is a vital voice sir but what you just said there is the definition of Google's gross irresponsibility. How can they maximally ignore the world's first super intelligences potential to cause harm and confusion?
It don't get more irresponsible than that sir.
You actually agree with musk assessment of Google boss without realizing it .
But am very glad that you are sounding the alarm. Bravo.
youtube · AI Governance · 2023-05-12T04:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[{"id":"ytc_UgyQFjvcVySTrGpiZmd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"},
{"id":"ytc_UgybpcFZyDvT1AkGt7d4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgweoR673754y1Nq8XV4AaABAg","responsibility":"user","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugxm9l5ThKXak3yY4E94AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugw4vLdlhiZ071_M_7x4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwWln2q-5Espth8lGV4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwxKd4-Z67OpBLBQIh4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"industry_self","emotion":"outrage"},
{"id":"ytc_UgxwioSmyy8w5l3QQJh4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwuAAw-ELBW2iwu9694AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgwlH4vtNnvxpM7dllB4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"mixed"}]