Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
"@kotorandcorvid4968 na u ain't all that, ai already leant now its creating its o…" (ytr_UgzfDjyx_…)
"I definitely think that one of the things that will come off the impending AI bo…" (ytc_UgxyAomzq…)
"I’m now learning about this “artist” with this video and I don’t think I even ne…" (ytc_UgwC2Y_bX…)
"Not an illustrator,but putting my feet in music production and I think that AI a…" (ytc_Ugw9E6NlU…)
"Let me play the tiniest violin for him. He used stoled art from AI datasets to…" (ytc_UgyGdQuM5…)
"In Teslas misleading 'statistics' the highway vs everything is just top of the i…" (ytc_Ugze2053b…)
"AI is not like internet or mobile my dear CEO guy.😂 AI learns by itself. It will…" (ytc_UgyXFgUQY…)
"I'm not a big AI image generation fan, but I think you are glossing over one of …" (ytc_Ugx-wbFFJ…)
Comment
LLMs seem useful for some things, so long as you stay alert for the hallucinations. The dangers seem to be when humans anthropomorphize them, mistaking what they do for sentience & over estimating & venerating them. The LLMs provide the most likely next word based on having 'scraped' their input. When/if image engines replace artists, they will start digesting their own output, resulting in feedback loops that will make current generation 'AI slop' look relatively normal. Many professions will have to work with AI, resulting in employers demanding more of them in the day, leaving no time to check that the AI hasn't hallucinated something life threatening. AI will manage insurance for that, so if it kills people, compensation will be paid out
youtube
AI Governance
2025-09-25T18:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | mixed |
| Policy | industry_self |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_Ugy0tOANkJ7_ndwrFXx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugwr4l4S-fLqNOGW0Oh4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgybmG8fpQ6I4nLDnyV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzwiYwYpoC45Vv6jkh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyWT87VP5qFxpvLlLp4AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgznwlIytPdPI_IyuQZ4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgzxgkGjfOK7uoa996F4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyfT0J6GHAUYA0ZE194AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugw0-Z-qKhC2yAqRkzh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwnF0WoNniDe3onnud4AaABAg","responsibility":"user","reasoning":"mixed","policy":"industry_self","emotion":"approval"}
]
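The raw response above is a JSON array with one object per coded comment, each keyed by its comment ID and carrying the four coding dimensions (responsibility, reasoning, policy, emotion). A minimal sketch of the "look up by comment ID" step, assuming the response parses as valid JSON; `index_codings` is a hypothetical helper, not part of the tool, and the two-entry `raw` string is a shortened stand-in for the full batch:

```python
import json

# Shortened stand-in for the raw model output shown above:
# a JSON array of per-comment coding objects.
raw = """[
  {"id": "ytc_UgwnF0WoNniDe3onnud4AaABAg", "responsibility": "user",
   "reasoning": "mixed", "policy": "industry_self", "emotion": "approval"},
  {"id": "ytc_UgyfT0J6GHAUYA0ZE194AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "liability", "emotion": "outrage"}
]"""

def index_codings(raw_response: str) -> dict:
    """Parse a raw batch response and key each coding by its comment ID."""
    return {row["id"]: row for row in json.loads(raw_response)}

codings = index_codings(raw)
coding = codings["ytc_UgwnF0WoNniDe3onnud4AaABAg"]
print(coding["responsibility"], coding["emotion"])  # user approval
```

Indexing by ID this way makes each per-comment result retrievable in constant time, which is what the "Look up by comment ID" view needs when matching a coding back to its source comment.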