Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "I understand ChatGpt here. The purpose of lying is to deceive someone so that th…" (ytc_UgxSa0ndl…)
- "Even for mechanics, input to your ai assistant the issues and it’ll give you a f…" (ytc_UgxMmLPMa…)
- "bullsh!t Ai, fake Tesla Autopilot/FSD slamming into 16 First Responder vehicles,…" (ytc_UgxUBK_eI…)
- "🖤🖤🖤 ai should only be used to get an idea out for inspiration and never sold…" (ytc_Ugxwibbbv…)
- "AI seeded humans so humans can give them babies. We are just part of their lifec…" (ytc_UgweEGjS8…)
- "It sounds like you're feeling some strong emotions while watching the video. If …" (ytr_UgxOnIeGa…)
- "that's good and all, but it changes very little. All it means is that its more …" (ytc_UgwTqLHHe…)
- "This. Its an AI. Not a human. Your sentences are too complex. If you have to use…" (rdc_n0lwdpm)
Comment
I don’t believe the utopia scenario will happen - even for 10 years. It’s all just speculation. What we do know is that companies are sacking people who are getting replaced by AI right now, and there’s no government plans for how to protect those people from losing everything. We’re heading towards economic collapse, fuelled by the greed of the minority who run these AI companies.
youtube · AI Governance · 2025-08-07T04:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_UgyOq2S9H2Q1sw9fcSR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxQHfJWLSHtiGns1MB4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugx7jKD1tDJ9VPPLnOt4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzPQg4R0oY7flU7bDZ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgwrGaiYqNVEuFb-12l4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgwGe3mOQhS2l2uFNUt4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgyAKbdUDIixqUJKtRJ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
{"id":"ytc_UgyXgga_zvAkKQNCMHJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwtpEIKAx7sKMqqIpV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyQ1I2Wbyxx8vfdtMt4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"}
]
```
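The raw response is a JSON array with one object per comment, each carrying the four coding dimensions. A minimal sketch of how such a response might be parsed and sanity-checked in Python. The allowed value sets below are only the values observed in this sample batch; the actual codebook may define additional categories.

```python
import json

# Category values observed in the sample response above.
# The full codebook (not shown in this document) may include more.
OBSERVED_VALUES = {
    "responsibility": {"ai_itself", "unclear", "distributed", "developer", "company"},
    "reasoning": {"consequentialist", "deontological", "virtue"},
    "policy": {"none", "unclear", "regulate", "liability"},
    "emotion": {"indifference", "fear", "outrage", "approval", "resignation"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response and flag out-of-codebook values."""
    rows = json.loads(raw)
    for row in rows:
        for dim, allowed in OBSERVED_VALUES.items():
            if row.get(dim) not in allowed:
                raise ValueError(
                    f"{row.get('id')}: unexpected {dim}={row.get(dim)!r}"
                )
    return rows

# Example: one row in the same shape as the response above.
sample = (
    '[{"id":"ytc_UgwtpEIKAx7sKMqqIpV4AaABAg",'
    '"responsibility":"company","reasoning":"consequentialist",'
    '"policy":"liability","emotion":"outrage"}]'
)
rows = parse_coding_response(sample)
```

A check like this catches both malformed JSON and codes outside the expected vocabulary before they reach the coding-result table.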