Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Comment
The problem isn't whether or not coders are needed - the problem is whether or not the high level execs hiring and firing people understand whether or not they can use AI in place of humans. When faced by the tempting possibility of saving money (and stuffing those savings in their own pockets) by firing people and using mysterious software they do not understand but THINK will do the same thing even better, which option do you think some will choose? Then again, those sort of failed business people may very well weed themselves out from the job market and quit annoying people with cheap "get-rich" schemes.
youtube · AI Jobs · 2025-03-08T18:1… · ♥ 9
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id": "ytc_Ugy5QtoZPL3YtqJyEkN4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_Ugyeigm13ArDgwC8yiN4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgyTyKaGL3DYymPOFv94AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_Ugwp_END8zTZ2T972rV4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgzmRhrMBz8c4MheJpV4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_Ugwr5dmQ8mHLlg8ZOgR4AaABAg", "responsibility": "none", "reasoning": "virtue", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgyF6UwnoOEtdQD4aTR4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgydFDH7_yatCYaVrel4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgynXOBY0tFyIhWDNK14AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgyiY9AIIxrDmKQUYV54AaABAg", "responsibility": "user", "reasoning": "virtue", "policy": "none", "emotion": "approval"}
]
```
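Looking up a single comment's coding inside a raw batch response like the one above can be sketched in a few lines. This is a minimal example, not the tool's actual implementation: it assumes only that the response is a JSON array of objects keyed by `id`, as shown above; the `index_by_id` helper name is illustrative, and the two records are copied from the sample output.

```python
import json

# A raw LLM batch response in the shape shown above (two records for brevity;
# IDs and values copied verbatim from the sample output).
raw_response = '''[
  {"id": "ytc_Ugy5QtoZPL3YtqJyEkN4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgydFDH7_yatCYaVrel4AaABAg", "responsibility": "user",
   "reasoning": "deontological", "policy": "none", "emotion": "fear"}
]'''

def index_by_id(raw: str) -> dict:
    """Parse a raw batch response and index each coding record by comment ID."""
    return {record["id"]: record for record in json.loads(raw)}

codings = index_by_id(raw_response)
print(codings["ytc_UgydFDH7_yatCYaVrel4AaABAg"]["emotion"])  # fear
```

Indexing by `id` mirrors the "Look up by comment ID" behaviour of the inspector: once the batch is parsed, retrieving any coded comment is a constant-time dictionary lookup.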