Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "How would I discern even today if your entire report wasn’t a lip sync conversat…" (ytc_UgxSk5Gg9…)
- "Yep I caught that the first time through - ask permission before you work on me …" (ytr_UgwqO9Q9s…)
- "I don't like Asmongold, there are streamers who are made with AI a lot more like…" (ytc_Ugwy6DPwu…)
- "So basically he wants to make a robot to makes humans lazy basically like the wo…" (ytc_UgxY2697f…)
- "Any one watch the Anime, Full Metal Alchemist the brotherhood? His analogy of A…" (ytc_UgwTiiPQw…)
- "my brother thinks it doesn't matter, he said: "i thought art was just a hobby. w…" (ytc_Ugxi6tFRi…)
- "Much of the money of these AI companies came from the American taxpayers. 51% of…" (ytc_UgxI4WLGP…)
- "Grok has done that … it generally got upset with me when it thought I compared i…" (ytc_UgyzvhhSG…)
Comment

> I dont know. If AI can handle repetitive and manual work faster, cheaper, and with fewer errors, we should use it. Trying to stop progress has never worked.
> But we do need clear political and social plans for the transition. Things like retraining, education, and financial support. The real mistake wouldn’t be AI/automation itself, but forgetting the people affected by it.

youtube · AI Jobs · 2025-10-09T10:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgzToEPZMw3OOgPZch54AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgxmDro5VzpTf2e7Vyt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugz5WxE6PnrtzIUz4gJ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_UgxDVtgylFFVOhBqOnR4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxiT2rnJ8xQLGDRGTt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw4KUqsmtqplBde7mx4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxlYkqDB2ZiUNjzCF54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugyz3Lh4wgR7KwA_OVl4AaABAg","responsibility":"government","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgyqWGkHP44cXaSy18B4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxiRFkX5pvXMsytfEN4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"approval"}
]
```
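The raw response is a JSON array of coding records, one per comment, each giving the four dimensions shown in the table above (responsibility, reasoning, policy, emotion). The look-up-by-comment-ID step can be sketched as follows; this is a minimal illustration that assumes the model output parses as valid JSON, and it uses an array abridged to two of the records shown above:

```python
import json

# Abridged raw LLM response: two coding records copied from the
# output above (each record codes one comment on four dimensions).
raw_response = """
[
 {"id":"ytc_UgzToEPZMw3OOgPZch54AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
 {"id":"ytc_Ugz5WxE6PnrtzIUz4gJ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"fear"}
]
"""

# Index the records by comment ID so any coded comment can be
# inspected directly, as the "Look up by comment ID" view does.
codes_by_id = {record["id"]: record for record in json.loads(raw_response)}

record = codes_by_id["ytc_UgzToEPZMw3OOgPZch54AaABAg"]
print(record["policy"], record["emotion"])  # regulate approval
```

In practice the full array would be parsed the same way; indexing by `id` makes each look-up a constant-time dictionary access rather than a scan of the array.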