Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples (click to inspect):

- `ytc_Ugznhjc2X…`: I predict that as soon as we will see the negative side-effects of LLMs on a wor…
- `ytc_UgyjdwpAb…`: AI will destroy this world information system will not be able to solve this pro…
- `ytc_Ugx-Bzjqy…`: Theyre doing it on purpose to blur the lines; theyre creating a "middle ground" …
- `ytc_UgwywKzEA…`: Does seem like a tool for Mara. For instance, Mara lives in the heaven of comfo…
- `ytr_UgzlMl9Fg…`: @prakashrollno5499 inevitably there will be some items the robots can't pick on …
- `ytc_UgyEqMTPw…`: Lmao, im just thinking that if the ai on the movie didn't really work, how would…
- `ytr_UgwKFBJvj…`: @BrolaireThebright That's not really AI now is it? You don't consider roombas po…
- `ytc_UgxyiRl8R…`: I'm going to be honest, I am highly against AI. My sister says that only for ent…
Comment
He’s a brilliant and significant figure. But this conversation would be better if we had a person of the same caliber taking a contrary point of view. (And yes, I’m subscribed). Hinton said: “I don’t know Sam, so I can’t comment on that.” But apparently, he knows Musk well enough to say, “He has no moral compass.” Maybe that’s true or not. But I’ve seen lots of videos with Hinton, and this is the first one I’ve seen where he does seem to let unfiltered bias slip in.
youtube · AI Governance · 2025-06-16T11:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_UgyhPETlAUy35Alrn2J4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugw1nEUgXfIt7LLuejF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxgPW6rCy7paYRJuz94AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgyhKGopKNaLRyK29UJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugx-GdmuiRRHN2vccCd4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzQxpA_JXRUjwkjySl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugw0p18807wntT9j7314AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyinjspiVuTNhnzu7p4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyeXPx7zO5ARJ4QPrl4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwFed3csJsg1KzBfGJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}
]
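The raw response is a JSON array with one object per comment, each carrying the four coding dimensions. A minimal sketch of how such a response could be parsed and sanity-checked before ingestion; the allowed value sets below are inferred from the samples shown in this document and may be incomplete relative to the full codebook:

```python
import json

# Allowed values per dimension, inferred from the samples above.
# Assumption: the actual codebook may define additional categories.
ALLOWED = {
    "responsibility": {"company", "developer", "user", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "mixed", "none"},
    "policy": {"regulate", "liability", "none"},
    "emotion": {"outrage", "fear", "indifference", "mixed", "none"},
}

def validate_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed rows.

    A row is kept when it is an object with an "id" field and every
    dimension holds a recognized value.
    """
    rows = json.loads(raw)
    valid = []
    for row in rows:
        if not isinstance(row, dict) or "id" not in row:
            continue
        if all(row.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(row)
    return valid
```

Since `ALLOWED` was derived from the ten rows above, all of them pass this check; a row with a misspelled or hallucinated category would be dropped rather than stored.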