Raw LLM Responses
Inspect the exact model output for any coded comment. Records can be looked up by comment ID.
Random samples (comment snippet and ID):

- "At least the ai guy doesn't draw furry scat or cp for 10$ a piece.…" — ytc_UgzoCjcUz…
- "Terminator....the movie series which warned us of the AI danger decades ago....b…" — ytc_UgwRkqrAm…
- "She's wrong on the cart pusher machine denting carts, and that's why they won't …" — ytr_UgwUvAR_Z…
- "I believe, at first, it will all depend on who programs/trains the AI. If differ…" — ytc_UgyusyRSI…
- "> It doesn't seem necessary to chop the whole face off when you can just cut …" — rdc_deumgcg
- "My place of work is now just people copying and pasting AI produced work to each…" — ytc_UgzUa4IpN…
- "PS: also on the example of the doctor... LLms currently more likely have no unde…" — ytr_UgzM4bqng…
- "The discussion about regulation is interesting. AICarma helps me keep an eye on …" — ytc_UgwNe_yYD…
Comment
The truth is, our lives won't change much. Because everything we will be doing will be a simulation.
So in other words, the farming will be automated and the food supplies will be well calibrated and reserved correctly, however, the AI will generate news that says "Farmers need to produce more to avoid scarcity" it is simply running a simulation to keep us working, just like the Matrix. And that's how it's going to keep the economy running. Problems will be solved, however we will have jobs that AI makes it seem like they're still existent, make sense?
youtube · AI Moral Status · 2025-08-17T00:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugw4Og5tkqfLTT3uLpx4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyBpE-AR0AvH7r9z-N4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgyirftcM8bjUUkC0_F4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"liability","emotion":"approval"},
  {"id":"ytc_UgyCPf3G_BoN94SKXTZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgymtQs4KyzNRDv-64R4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugz9pmHEziNiTXJFH5x4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzRUe8lihVbZKefpIt4AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgyQZGO4rKEQC29Kl1V4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwuWI89FEeHg0VTpfB4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzccAU8-DTbOdCM10Z4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"}
]
```
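A raw response like the one above is a JSON array with one object per coded comment, so the "look up by comment ID" flow reduces to parsing it into a dictionary keyed by `id`. The sketch below assumes only that structure (the embedded example row is copied from the response above; the variable names are illustrative, not part of the tool):

```python
import json

# A raw LLM response: a JSON array of per-comment coding objects.
# One real row from the response above is used as a minimal example.
raw_response = '''
[
  {"id": "ytc_UgyCPf3G_BoN94SKXTZ4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"}
]
'''

# Build a lookup table keyed by comment ID, so any coded comment
# can be retrieved directly by its ID.
codes_by_id = {row["id"]: row for row in json.loads(raw_response)}

code = codes_by_id["ytc_UgyCPf3G_BoN94SKXTZ4AaABAg"]
print(code["responsibility"], code["emotion"])  # -> ai_itself resignation
```

The same dictionary also makes it easy to cross-check a displayed Coding Result against the raw model output for its comment ID.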