Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a specific comment by its ID.
Random samples — click any to inspect:

- "It's like how films and games are made for adult babies. And making the AI safe …" — ytr_Ugwr_oUKa…
- "I'm more pragmatic on this topic. I think people will see what's happening and p…" — ytc_UgzoHML2N…
- "Video is about how AI isn’t working out and at the end of the video you have an …" — ytc_UgztRMMw0…
- "@alienzenx Shareholders and investors need to believe a company is both “cutting …" — ytr_UgxVl7SX2…
- "You assume that everybody works in an office.
  Who is going to mine the coal, dri…" — ytc_Ugxui3QuG…
- "My local DMV already use facial recognition so I doubt Amazon is doing anything …" — ytc_Ugxw340wJ…
- "Everyone at my job would make the same joke about how AI ones that learned how t…" — ytc_UgwHbUW8V…
- "We not maxed ai
  Corporations is to greed to wait
  So it more like greeted out ai…" — ytr_UgwKoary6…
Comment

> Really solid breakdown. The “AI reliability tax” is real — we’re seeing it too. For founders, the key is starting small: one painful workflow, measure results, keep a human in the loop, and scale only when it’s actually saving time.

youtube · AI Responsibility · 2025-10-06T14:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | industry_self |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_Ugzl0O_xyjzvHi9ceUB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyrxWdV56yB-pzzoQV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgybV2udL0niqpMvepV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgyKgwIjouK7CSjhK7V4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_UgwR4Vzl8FgLyzNc9kR4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxNUBWrI3tASfJIjc94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgxhbNoFwCcGJbNDz6R4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugx5UrqcBU4jX4GOlt14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyT0Z7NvQ1rvs0e5714AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyVTwOYFRMfntGHzNB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
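The raw response above is a JSON array with one object per comment: the comment ID plus four coded dimensions (`responsibility`, `reasoning`, `policy`, `emotion`). A minimal sketch of parsing such a response and tallying one dimension — the field names come from the response shown here, but the validation rule (drop rows missing any dimension) is an assumption, not the tool's actual behavior:

```python
import json
from collections import Counter

# Two rows copied from the raw response above, as a stand-in payload.
raw = '''[
  {"id": "ytc_Ugzl0O_xyjzvHi9ceUB4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxhbNoFwCcGJbNDz6R4AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"}
]'''

DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def parse_codes(payload: str) -> list[dict]:
    """Parse the model output, keeping only rows with an ID and all four dimensions."""
    rows = json.loads(payload)
    return [r for r in rows if all(r.get(d) for d in ("id", *DIMENSIONS))]

codes = parse_codes(raw)
emotion_counts = Counter(r["emotion"] for r in codes)
print(emotion_counts)  # -> Counter({'indifference': 1, 'outrage': 1})
```

With a full response, the same `Counter` pattern gives per-dimension distributions (e.g. how often `responsibility` was coded `company` vs `user`) without any extra tooling.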