Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "While I do hate that AI companies and private users are basically just mass-stea…" — ytc_UgzC1Q1pD…
- "I’m guessing you aren’t an artist or you’d know the answer to this, but let me b…" — ytr_UgybP_iIH…
- "I'm glad to have found for one and that they have invited the guy for two, to he…" — ytc_Ugz_M5xHp…
- "Education without human interaction doesn't work and will damage social skills o…" — ytc_UgxOYuvGV…
- "Odd. Having been steeped in sci-fi since the single digits, and having a baselin…" — ytc_UgwK2EDEt…
- "I mean if the non-bias AI said to be racist then who am I to go against its word…" — ytc_UgwBiG6Nd…
- "They are not as smart as people like this think. That is why they have had to r…" — ytc_UgxYn39mv…
- "I keep comming back to this video and i always laugh at these ai promters LOL "I…" — ytc_Ugw2EJqZt…
Comment
i thought about this deeply. Since each developer's velocity increases - there are 2 ways companies can go, either reduce team size's in which case openings will go down and salaries will take a hit or second - Re-use the people to ship even more features or re-utilize them elsewhere.
Why i think 2nd is more likely is - Companies need to show growth in revenue as well and not only profit. Sitting on a lot of profit with no plans to grow will drag the stock value down. Hence whenver profit increases they start thinking of new areas where they can use the profit to generate more revenue.
This could lead to either more products coming out from a company, more frequent updates etc.
In case we get AGI, i think it should lead to humanity and companies focussing more on research since implementation will be taken care of by AI.This should increase a lot of job in this place but you might need a phd even before getting hired.
youtube · AI Jobs · 2026-02-13T18:2… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytr_UgxZVXVIH9lb67aNZfp4AaABAg.ATAZu4h90xEAU36OP5S8qB","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytr_UgwCaxoDLm7C6vpdcqd4AaABAg.AT9igVJ8iImATApZnsx6la","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_UgwCaxoDLm7C6vpdcqd4AaABAg.AT9igVJ8iImATAszwCXcJh","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytr_Ugwoh4HQOVoG_0fQMtl4AaABAg.AT9aVFx2UIrATJ8r2y0FbP","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytr_Ugwoh4HQOVoG_0fQMtl4AaABAg.AT9aVFx2UIrATJnwnqqdq9","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytr_UgyzD3xsQBauZkngTRR4AaABAg.AT9SPLHh_CnATAjbebu5Rt","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytr_UgyzD3xsQBauZkngTRR4AaABAg.AT9SPLHh_CnATBJ2-A4bmb","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytr_UgwZgkNI23vYe3UBFUR4AaABAg.AT9GQuKqVzIATAd5xZk_x-","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytr_UgwZgkNI23vYe3UBFUR4AaABAg.AT9GQuKqVzIATAeM14lGZg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytr_UgxnQoKudpmfK4DtxTB4AaABAg.AT98VYtbfAxATDsbFgrX3a","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
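A minimal sketch of how a response like the one above could be parsed and indexed by comment ID to drive the "Look up by comment ID" view. The field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) and the two sample IDs are taken from the raw response shown above; the function and variable names are illustrative, not the tool's actual implementation.

```python
import json

# Excerpt of a raw LLM response: a JSON array of coded comments.
# The two records below are copied verbatim from the response above.
raw_response = """
[
  {"id": "ytr_UgxZVXVIH9lb67aNZfp4AaABAg.ATAZu4h90xEAU36OP5S8qB",
   "responsibility": "developer", "reasoning": "consequentialist",
   "policy": "none", "emotion": "resignation"},
  {"id": "ytr_UgwCaxoDLm7C6vpdcqd4AaABAg.AT9igVJ8iImATApZnsx6la",
   "responsibility": "company", "reasoning": "consequentialist",
   "policy": "none", "emotion": "indifference"}
]
"""

# The four coding dimensions shown in the "Coding Result" table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_by_id(raw: str) -> dict:
    """Parse model output and index each coded record by its comment ID.

    Raises ValueError on a record missing the ID or any dimension, so
    malformed model output is surfaced instead of silently dropped.
    """
    coded = {}
    for rec in json.loads(raw):
        if "id" not in rec or any(d not in rec for d in DIMENSIONS):
            raise ValueError(f"malformed record: {rec!r}")
        coded[rec["id"]] = {d: rec[d] for d in DIMENSIONS}
    return coded

coded = index_by_id(raw_response)
lookup = coded["ytr_UgwCaxoDLm7C6vpdcqd4AaABAg.AT9igVJ8iImATApZnsx6la"]
print(lookup["emotion"])  # indifference
```

Keying the dictionary on the comment ID makes the lookup O(1) per query, and the validation step catches the common failure mode where the model drops a field or returns non-JSON text.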