Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:
- ytc_Ugw0K-vlB… — "His point is that we should get on the neural link train while ai is in its infa…"
- ytr_UgwZYGXMM… — "@Lira-j4gthat's why they're working so hard to create Palantir's spy network an…"
- ytc_UgwaG3CyP… — "A.I. is going to ELIMINATE many jobs in many fields, not just trucking. Any job …"
- ytr_UgzlgCckB… — "I bet this quote is an AI hallucination that this dude took at face value.…"
- ytc_UgwDlhdjF… — "Isn't it goes too far!? I mean.. living with ai makes life easy but isn't it rea…"
- ytc_UgyXUGwXy… — "Blue blood my ass my friend drew stick figures with those goofy eyes but now she…"
- ytc_UgyRU7kZZ… — "why tf his lips lag / have lower fps, head movement feels like ai geenrated…"
- ytc_UgyKCYLn7… — "If my sibling acted like Shad acted about art around Jazza i`d just up and hit t…"
Comment
So 35 min in Hinton says Elon Musk doesnt have a moral compass, but when Steven asks what about Sam Altman does Hinton pauses and answers he does not know 😂 what ECHO Chamber has this professor been living in .. Elon is the single greatest proponent on the dangers of AI , and Elon vehemently wanted to keep open AI a NON PROFIT .. Altman saw billions and went for it .. this wanker Hinton talks several times about the dangers of AI in For Profit corporations (as legally bound to make profits" he states) , exactly why Elon did not want open AI as a for profit and he understands the risks to humanity, as does Hinton - yet Hinton thinks Elon has no moral compass, wow what a total hypocrite!
youtube · AI Governance · 2025-09-29T19:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | virtue |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_UgxDphRRIpVFYVsv3q94AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzzOOs43BFWmbgMFd54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
{"id":"ytc_UgzRklTpdRcr-j-RQQx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgweTT3x1hocHuHGTCJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugyrmxjwg5AyLN4HFyN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugxl2vBOaw8c5AAqN2J4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugwf_OoEELlzpyTFVjZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzhAmX8ul0ZwE3Df6p4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugx4i7muG0X6Pth7vNJ4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzMELLnzI7OEvN_Yy94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"}
]
```
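A batch response like the one above is only usable if every record carries the expected dimensions with in-vocabulary values. Below is a minimal validation sketch in Python; the allowed value sets are inferred solely from the records visible on this page (the real codebook may define more categories), and `validate_codings` is a hypothetical helper name, not part of any actual pipeline.

```python
import json

# Allowed values per coding dimension, inferred from the sample
# output above (assumption: the full codebook may include more).
ALLOWED = {
    "responsibility": {"developer", "company", "ai_itself", "none"},
    "reasoning": {"consequentialist", "virtue", "unclear"},
    "policy": {"none", "regulate", "liability"},
    "emotion": {"indifference", "resignation", "mixed",
                "approval", "outrage", "fear"},
}

def validate_codings(raw: str) -> list[dict]:
    """Parse a raw LLM batch response and keep only records whose
    ID looks like a YouTube comment/reply ID and whose values all
    fall inside the allowed vocabulary."""
    records = json.loads(raw)
    clean = []
    for rec in records:
        # IDs in this dataset use ytc_ (comment) / ytr_ (reply) prefixes.
        if not rec.get("id", "").startswith(("ytc_", "ytr_")):
            continue
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            clean.append(rec)
    return clean

raw = ('[{"id":"ytc_UgzhAmX8ul0ZwE3Df6p4AaABAg",'
       '"responsibility":"developer","reasoning":"virtue",'
       '"policy":"none","emotion":"outrage"}]')
print(len(validate_codings(raw)))  # 1
```

Records that fail validation are dropped rather than repaired here; a production coder would more likely re-prompt the model for the failing IDs.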