Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:
- "what kind of person needs an ai to summarize a text message for them? a text mes…" (ytc_Ugzk0OsAB…)
- "Fucking, exactly my thoughts! It's like, cool that it's something that's possibl…" (ytc_UgxdrG7g5…)
- "1:02:28 again, Soares is wrong: yes, there is a cult-like silicon valley subgrou…" (ytr_Ugx9u1McF…)
- "I have started learning CS recently (self-study, but without relying on AI, sta…" (ytc_UgyBqpx83…)
- "Same, especially when capitalism kinda already threaten workers with such ideas …" (ytr_UgwCFCpuG…)
- "AI is at its dawn phase, it will better itself, maybe close to perfection in the…" (ytc_UgyiOlwZM…)
- "A message quite long yet so dumb, \"If we cannot steal what are we supposed to do…" (ytr_UgySzBaob…)
- "It is easier for non creative person to think that AI will replace creative jobs…" (ytc_Ugx7hyC63…)
Comment
And do what? Rot while the rest of the world advances and the US’s products become unbuyable in the international market? We need to be pragmatic. I do support the idea of a UBI while laying off workers, as this avoids a crisis while allowing the US to continue to compete. You say we should have the AI like you know, be the thing for the people reduce billionaires bla bla bla (im writing this here so you know I’m not a clanker), but well, what specifically? Do we ban AI? You suggest giving workers a 20% share in the company and a vote, well, if this aligns with your previous rhetoric, it can be deduced they would elect to ban AI. As I said before, this has drastic consequences. Hence, I find a UBI (universal basic income) as an optimal solution.
youtube · AI Jobs · 2025-10-08T11:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | industry_self |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id": "ytc_UgxOS7wKQWWhB4ChXFR4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "industry_self", "emotion": "resignation"},
  {"id": "ytc_UgzJXzjFpkPoc0DIp-x4AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_UgyiXRwcT-XKyomuB1Z4AaABAg", "responsibility": "company", "reasoning": "mixed", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugwia6jNaoM1GF-PoHJ4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgxQA5EVrV-tTr-jgSl4AaABAg", "responsibility": "company", "reasoning": "contractualist", "policy": "regulate", "emotion": "mixed"},
  {"id": "ytc_Ugz-Usp_w_Qyi9eJyOB4AaABAg", "responsibility": "government", "reasoning": "virtue", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgxfkH_VQ80LZTCUU_N4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_Ugw2MzAgHLZxobHvpi14AaABAg", "responsibility": "distributed", "reasoning": "mixed", "policy": "regulate", "emotion": "approval"},
  {"id": "ytc_Ugzf6R7PQdVicnZbHOd4AaABAg", "responsibility": "company", "reasoning": "contractualist", "policy": "liability", "emotion": "approval"},
  {"id": "ytc_UgwLOc0IMjtMxjTlzvR4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "liability", "emotion": "resignation"}
]
```
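The raw response is a JSON array of coded records keyed by comment ID, which is what makes the lookup-by-ID view possible. A minimal sketch of that lookup, assuming this batch format (the function and variable names here are illustrative, not the tool's actual code):

```python
import json

# A raw batch response like the one above: a JSON array with one coded
# record per comment. Two records are shown here for brevity.
raw_response = """[
  {"id": "ytc_UgxOS7wKQWWhB4ChXFR4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "industry_self",
   "emotion": "resignation"},
  {"id": "ytc_UgzJXzjFpkPoc0DIp-x4AaABAg", "responsibility": "ai_itself",
   "reasoning": "deontological", "policy": "ban", "emotion": "fear"}
]"""

# The four coding dimensions shown in the result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_by_comment_id(raw: str) -> dict:
    """Parse a batch response and build a comment-ID -> coded-dimensions map."""
    records = json.loads(raw)
    index = {}
    for record in records:
        # Keep only records that carry all four dimensions; skip malformed ones.
        if all(dim in record for dim in DIMENSIONS):
            index[record["id"]] = {dim: record[dim] for dim in DIMENSIONS}
    return index

codes = index_by_comment_id(raw_response)
print(codes["ytc_UgxOS7wKQWWhB4ChXFR4AaABAg"]["policy"])  # industry_self
```

Indexing by ID up front turns each "look up by comment ID" query into a constant-time dictionary access instead of a scan over the whole batch.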