Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "I hate ai as an artist, but I almost fell into its trap once when I was just sta…" (ytc_UgybMveKe…)
- "fun fact: ai art isnt stealing, its piracy. it doesnt take the pixels from ur ar…" (ytc_Ugzhx-I_W…)
- "So how will we survive if robots do our jobs. The people like the f wit in the m…" (ytc_UgxXCaOoi…)
- "Yes, there is a noticeable pattern: a significant number of key founders or tech…" (ytc_Ugx8iQpre…)
- "So if corporations such as amazon, taxi companies etc decide to use AI to replac…" (ytc_UgxibL3Gf…)
- "U did better -also an artist who almost gives up :(. It's ok tho! U r very …" (ytc_UgzT9kD30…)
- "in the future AI will make obsolete all those professions which should not exist…" (ytc_Ugz70w7O9…)
- "AI threatens millions of jobs, is unethical, incredibly energy-intensive…" (ytc_UgxQRntgt…)
Comment

> Wow to give $1000 per mo to someone that lost a good job to AI seems a slap in the face. Nobody can make any end meet on that. The companies that benefit from NOT paying salaries by firing people and using AI in their place must be taxed or penalized to the hilt. There has to be an AI use tax or something. Or a de incentive for creating world poverty via skyrocketing corporate profit.

Source: youtube · AI Jobs · 2025-12-12T22:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
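A coded record can be sanity-checked before display. The value sets below are only those observed in the sample responses on this page; the real codebook may define more categories, and the function name `is_valid` is illustrative:

```python
# Values observed in the sample responses on this page only;
# the full codebook may include additional categories.
OBSERVED = {
    "responsibility": {"company", "user", "ai_itself", "unclear"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"regulate", "liability", "unclear"},
    "emotion": {"fear", "outrage", "mixed", "approval", "indifference"},
}

def is_valid(record: dict) -> bool:
    """Check that every coding dimension is present and uses an observed value."""
    return all(record.get(dim) in vals for dim, vals in OBSERVED.items())

# The coding result shown in the table above:
coding = {
    "responsibility": "company",
    "reasoning": "consequentialist",
    "policy": "liability",
    "emotion": "outrage",
}
print(is_valid(coding))  # True
```

Records with a missing dimension or an unseen value fail the check, which helps catch malformed model output before it reaches the viewer.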
Raw LLM Response
```json
[
{"id":"ytc_UgyD_pT1SPbEORWBR854AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyyZyZxSzF_AkDt2Mp4AaABAg","responsibility":"user","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugz1sMxigyFppQGnF9N4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugw4Dcxb6MVCk6q-VKh4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgyvPM7SAdCAPp0eRIx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxNnWJIvd025-fy1w54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugy181EU7sTWU4BZQ654AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"mixed"},
{"id":"ytc_Ugw1mOmdvcaLBKLalA94AaABAg","responsibility":"user","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzmDSWGnOQJ3DFpLrx4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzKhQcKz6J1R3wX8K14AaABAg","responsibility":"unclear","reasoning":"deontological","policy":"regulate","emotion":"outrage"}
]
```
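The look-up-by-comment-ID behavior can be sketched as follows, assuming the raw model response is a valid JSON array in the shape shown above; the function name `index_codings` and the two-record sample are illustrative:

```python
import json

# A two-record excerpt of the raw model output shown above,
# with IDs and coding values taken verbatim from the response.
raw_response = """
[
  {"id": "ytc_UgxNnWJIvd025-fy1w54AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgyD_pT1SPbEORWBR854AaABAg", "responsibility": "unclear",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"}
]
"""

def index_codings(raw: str) -> dict:
    """Parse a raw LLM response (a JSON array of coded comments)
    and index the records by comment ID for O(1) lookup."""
    records = json.loads(raw)
    return {rec["id"]: rec for rec in records}

codings = index_codings(raw_response)
print(codings["ytc_UgxNnWJIvd025-fy1w54AaABAg"]["emotion"])  # outrage
```

Indexing once and looking up by ID is what makes the "Look up by comment ID" field cheap even across many batched responses.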