Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment

> It is not far that we should start discussing the universal basic income even if it is just 1 rupee or 1 $ for month ... as automation happens this numbers should be kept increasing. Looks like this is the only way to keep this cycle of economy will be able to keep running in the future.

| Field | Value |
|---|---|
| Platform | youtube |
| Topic | AI Jobs |
| Posted | 2025-10-23T11:3… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyjLagI7sg36BB2boJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxYQsebEIavuCHhwR54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxpkfURyBo6WNsArQh4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgxxdMXdmBxMUgl2ktV4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"approval"},
  {"id":"ytc_Ugyzl39GdayEY_B9pCd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxJFrSWrsNlG7x7O6J4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzPAMLa-3IZplxRnDN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwgWzucSpUTjtql4GZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzZQlhOdXFr87QWnGx4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgzuUGAioZkBrfFByHJ4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"mixed"}
]
```
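The raw response is a JSON array of per-comment codings keyed by `id`, which is how the Coding Result table above can be populated for a single comment. A minimal sketch of that lookup step, assuming only the field names visible in the response (the variable names and the inline sample are illustrative, not part of the actual pipeline):

```python
import json

# Hypothetical single-row sample shaped like the raw LLM response above.
raw_response = '''[
  {"id": "ytc_UgxxdMXdmBxMUgl2ktV4AaABAg",
   "responsibility": "distributed",
   "reasoning": "consequentialist",
   "policy": "liability",
   "emotion": "approval"}
]'''

# Index the array by comment ID so any coded comment can be looked up directly.
codings = {row["id"]: row for row in json.loads(raw_response)}

row = codings["ytc_UgxxdMXdmBxMUgl2ktV4AaABAg"]
print(row["policy"])  # prints "liability"
```

Building the dict once makes repeated lookups O(1) per comment, which matters if the same batch response is inspected for many IDs.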