Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:

- `ytc_Ugz9UKQeX…` — "I honestly do not think AI/Robots will ever become concious . It just dosen't se…"
- `ytc_UgxDjIoA_…` — "Ia was first meant to reproduce human's work now we think all is made by ai 😢…"
- `ytc_UgwMpDX-R…` — "What a crock of shit. Even chatGPT thinks it's true because it read it on the in…"
- `ytc_Ugy26w05W…` — "So how did Ai know that having an affair is considered immoral by most people? W…"
- `ytc_Ugw1_w-cc…` — "then consciousness needs to be a priori... which is absurd since AI would requir…"
- `ytr_UgzPWCu1w…` — "@oliverhardy9464 There is no "ripping of". If you share news you have read onlin…"
- `ytc_UgzsCn8En…` — "What an incredibly well thought out conversation....your degree in MechE shines …"
- `ytc_UgzwlY3EA…` — "We are AI ouselves, I believe! Just look at how realistic looking games are look…"
Comment
While people who like UBI, like those in this video, see it as a solution to AI-driven job loss, they REALLY ignore the dystopian reality of a 'Basic' lifestyle. You can see this in The Expanse, when Bobbie visits Earth. (It's a good peak at what could happen.) UBI doesn't provide freedom; it provides a poverty floor that acts as a ceiling. It traps individuals in a cycle of subsistence where they have enough to eat but no path to contribute, no way to move, and no hope for advancement. We shouldn't be looking for ways to pay people to stay out of the way; we should be finding ways to ensure human agency remains central to the economy.
youtube · AI Jobs · 2025-12-25T22:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgyWzyl2MWz7hmNIpst4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxIBSfSRuVkiauHB-94AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_UgxoU8fbwLxRSnEkHYd4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwzYrxLRgoZT9YrFKR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugx42F72TxE4sbd8djN4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgyOJVwbZ6V7c9QebzB4AaABAg","responsibility":"company","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugxl7a2x0SRASsyA0ZF4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgzrbqzZBYy4LZ5nT-N4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzkjilaLdAytx5uBwd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"liability","emotion":"approval"},
{"id":"ytc_UgyKU6EgfaEKDuvm8b54AaABAg","responsibility":"government","reasoning":"deontological","policy":"liability","emotion":"fear"}
]
```
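A batch response like the one above has to be parsed and checked before the codes are stored. The sketch below shows one way to do that in Python; the allowed value sets are an assumption inferred only from the values visible in this section (the real codebook may define more categories), and `parse_coding_response` is a hypothetical helper name, not part of the actual pipeline.

```python
import json

# Allowed values per coding dimension, inferred from the samples in this
# section -- an assumption; the real codebook may include more categories.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "government", "company", "unclear"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "unclear"},
    "policy": {"none", "ban", "regulate", "liability", "unclear"},
    "emotion": {"indifference", "fear", "outrage", "resignation", "approval", "unclear"},
}


def parse_coding_response(raw: str) -> list:
    """Parse a raw LLM coding response and validate every record.

    Raises ValueError on malformed input, a missing comment ID, or a
    dimension value outside the (assumed) codebook vocabulary.
    """
    records = json.loads(raw)
    if not isinstance(records, list):
        raise ValueError("expected a JSON array of coded records")
    for rec in records:
        if "id" not in rec:
            raise ValueError("record missing 'id': %r" % (rec,))
        for dim, allowed in ALLOWED.items():
            value = rec.get(dim)
            if value not in allowed:
                raise ValueError("%s: bad %r value %r" % (rec["id"], dim, value))
    return records


# Minimal usage example with a made-up comment ID:
raw = ('[{"id":"ytc_example","responsibility":"none",'
       '"reasoning":"consequentialist","policy":"none",'
       '"emotion":"resignation"}]')
coded = parse_coding_response(raw)
print(coded[0]["emotion"])  # -> resignation
```

Rejecting the whole batch on the first bad record keeps the stored codes clean; a production version might instead collect per-record errors and re-prompt the model only for the failures.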