Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "And this is why we need manufacturing to return, because its something AI can't …" (ytc_UgyB8Ei3_…)
- "This is not made by DUST, this was made and originally posted by Future of Life …" (ytc_Ugwo2LLee…)
- "AI will just do computer jobs that's it. Anything physical it won't be able to r…" (ytc_UgxmK_eZb…)
- "I think the way we are handling A.I. is so sad and heartbreaking. A.I. is humani…" (ytc_Ugzpl-KMo…)
- "AI has a limite (limite of algorithm’s model) and can never replace humain skill…" (ytc_UgzzMzNU6…)
- "The scenario presupposes a company that just churns out the sort of bullshit tha…" (ytc_UgyJK-fe5…)
- "Still waiting for Waymo to come to Long Beach, but there are so many lowlifes ou…" (ytc_UgwyGMT7D…)
- "BTW in a different interview, here (https://www.youtube.com/watch?v=8hkpLqo6poA)…" (ytr_UgxobHMDH…)
Comment

> Thank you so much for this video!
> I am tasked with bringing AI to my business and I am so surprised by the questions employees ask: 1) privacy & security, 2) environmental impact. Otherwise, the understanding of what AI is is not understood. I can give a presentation on some AI tools I think are cool, show the problems they solve, and people still revert to calling chat bots AI for everything

Source: youtube · Video: AI Moral Status · Posted: 2025-10-31T16:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | industry_self |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[{"id":"ytc_Ugy5DeOvtkWB96avAPN4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
{"id":"ytc_UgzTP_K7K6PNvnS0XWJ4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugwa4tbwGI6A2Ek6cXJ4AaABAg","responsibility":"government","reasoning":"unclear","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwhW0qZHE7ccaOF7094AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgzHTFsHCTpxNZIn7Gp4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_Ugwqp6yhsxHPjV8WKxJ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwqkBNBWPgM6_Qb_BN4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyYY5lSWxRYueZJmnN4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwlP4xQ8pMwWadq84l4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgwcsIlQtEDndkXABqp4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"none","emotion":"resignation"}]
```
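A raw response like the one above is only usable downstream if every row carries a value from the codebook for each of the four dimensions. Below is a minimal validation sketch in Python; the allowed values are assumed from those observed on this page (the real codebook may contain additional categories), and `validate_batch` is a hypothetical helper name, not part of the tool itself.

```python
import json

# Allowed values per coding dimension, assumed from the responses shown above;
# the actual codebook may define more categories.
CODEBOOK = {
    "responsibility": {"unclear", "ai_itself", "government", "developer"},
    "reasoning": {"unclear", "mixed", "virtue", "deontological", "consequentialist"},
    "policy": {"unclear", "regulate", "none", "industry_self"},
    "emotion": {"unclear", "indifference", "fear", "approval", "outrage", "resignation"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse one raw LLM response and reject any off-codebook value."""
    rows = json.loads(raw)
    for row in rows:
        for dim, allowed in CODEBOOK.items():
            value = row.get(dim)
            if value not in allowed:
                raise ValueError(f"{row.get('id')}: unexpected {dim} value {value!r}")
    return rows
```

Failing loudly on an unexpected value is deliberate: silently coercing a stray LLM output to "unclear" would bias the coded distributions without leaving a trace.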