Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- "The difference between standard abstractions like game engines and AI is that ga…" (`ytc_UgyY7n-N-…`)
- "the question is how long till we reach AGI or ASI, if it takes a long time then …" (`ytr_UgwM2_DLw…`)
- "I don't think the AI malfunctioned when it comes to becoming MechaHitler(even th…" (`ytc_Ugyzh1sG7…`)
- "If I made a robot and it said that I would blow it to kingdom come before the ne…" (`ytc_Ugz8jC6JF…`)
- "> but isn't it the job of governments to set and enforce labor laws? At leas…" (`rdc_d3rr5qx`)
- "The rich are going to be the ones buying stuff we are entering the age of techno…" (`ytr_UgxT5qHmC…`)
- "It's a waste of money to go to school at this point. She's just trying to not be…" (`ytc_UgwZ3CxnG…`)
- "You can’t know when artificial intelligence will become conscious, just as you k…" (`ytc_UgyEWFh6u…`)
Comment
This is fine until it hits the governments in their pocket, forking out billions for unemployment benefit - and there will be no succession plans for senior staff, so there will be a huge knowledge gap as the young won't have had the early experience to build their competence. This AI will last a while, true, but once these factors start, there will be legislation - if only to ensure governments still receive taxes rather than paying benefits.
If no-one sees this, they must be wilfully ignorant in planning for the future.
Platform: youtube · Video: AI Jobs · Posted: 2025-07-30T12:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgxOZFVYo1Zu5OHoP8F4AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugxl3ubRBWb1pPBlL9t4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx6bGM5-1P95Cskmz94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugxbf57FrrwQAL-RO9x4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugyi6XUhIdaW9vbQAGN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgwSet3Z0WzcKSIfC-d4AaABAg","responsibility":"government","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugzz-ZhJTL3RcDH2AW94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugzi-yTq7Rfh0HNmBC14AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxaLmmuisYh1MWUXaB4AaABAg","responsibility":"company","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugwt5pY-L7GZXoXMF_p4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"}
]
```
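The raw response is a JSON array of coding records, one per comment, each keyed by comment ID. A minimal sketch of how such output might be parsed and indexed for the per-comment lookup shown in the Coding Result table; the function name is illustrative, and the two records in the excerpt are copied from the response above:

```python
import json

# Excerpt of a raw LLM response in the format shown above
# (a JSON array of per-comment coding records).
raw_response = """
[
 {"id":"ytc_Ugxbf57FrrwQAL-RO9x4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
 {"id":"ytc_UgwSet3Z0WzcKSIfC-d4AaABAg","responsibility":"government","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
"""

def index_by_comment_id(raw: str) -> dict:
    """Parse the raw LLM output and key each coding record by its comment ID."""
    records = json.loads(raw)
    return {rec["id"]: rec for rec in records}

codings = index_by_comment_id(raw_response)
# Look up the comment inspected above by its ID.
print(codings["ytc_Ugxbf57FrrwQAL-RO9x4AaABAg"]["policy"])  # regulate
```

Indexing by ID this way turns the flat array into the dimension/value view rendered in the Coding Result table.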