Raw LLM Responses
Inspect the exact model output for any coded comment.
Comments can be looked up by ID; below is a random sample of coded comments.
- "Ok clearly this guy hasn't heard of speedrunners in gaming iam a speedrunner in …" (ytc_UgwCdUi95…)
- "On that missile question it got wrong, turns out ChatGPT 3.5 gets it right. Tha…" (ytc_UgxUveA6f…)
- "It's not easy to replace customer service because human problems are complex whi…" (ytc_UgzX_GAmO…)
- "Copying AI made takes time for Humans but AI copying Human takes a few seconds u…" (ytc_Ugz1QSGAH…)
- "Seriously, a red herring?? This is another in a long list of mindsets where many…" (ytr_UgwO_pZWb…)
- "Only idiots believe in self driving will actually take them to their destination…" (ytc_Ugw1aMN5e…)
- "Or just don't use generative AI since it steals things from people, doesn't give…" (ytc_UgwSHMcbB…)
- "Why does no one give simple examples of how ordinary people could be affected in…" (ytc_UgzUadrAK…)
Comment

> I highly recommend reading The Singularity Is Nearer! It talks about automation in the job market and how even though in the short term it creates uncertainty and job loss, technical advancements this far have created more jobs than they've replaced. What we need to push for from politicians is the taxation of AI use by corporations when it's being used to replace workers, this money can then go into funding retraining programs for workers for a more steady transition period.

youtube · AI Jobs · 2025-09-30T14:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_UgwNhyVdd7QuVecwNp94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyHCZN4D3whTClRK4F4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"approval"},
{"id":"ytc_Ugy_2ufgWElVSgvv3-J4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxMAIC9HjI_9PqZ7254AaABAg","responsibility":"user","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugymgm6664_uYqu_Y0F4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgywofBodMsFze5mH_94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzPWcmrboeFGPAArzh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgycV4j9zdUNb9L7ylV4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzOXINeHujgGrDGS2R4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgznPkK1tmD6_r7NVgB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
```
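A response like the one above can be turned into a lookup table keyed by comment ID, which is how "look up by comment ID" can work against the raw model output. The sketch below is a minimal, hypothetical implementation: the `index_by_comment_id` helper and the two-record sample response are illustrative, not part of the actual tool.

```python
import json

# Example raw LLM response: a JSON array with one coded object per comment,
# mirroring the structure of the response shown above (two records kept here).
raw_response = """[
  {"id": "ytc_UgwNhyVdd7QuVecwNp94AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgyHCZN4D3whTClRK4F4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "approval"}
]"""

def index_by_comment_id(response_text: str) -> dict:
    """Parse the model output and index the coded dimensions by comment ID."""
    records = json.loads(response_text)
    return {record["id"]: record for record in records}

codes = index_by_comment_id(raw_response)
# Fetch the coded dimensions for one comment by its ID.
print(codes["ytc_UgyHCZN4D3whTClRK4F4AaABAg"]["policy"])  # liability
```

If the model's output is not valid JSON (a common failure mode), `json.loads` raises `json.JSONDecodeError`, so a production pipeline would wrap the parse in error handling and flag the batch for re-coding.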