Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples (click to inspect):
- `ytc_Ugz2Q0N2v…`: they seem to think people make art because they feel a need to, and not because …
- `ytc_Ugx0QmG0q…`: Chicago is the worst place to use AI with politics. They are so inhumane they sh…
- `ytr_UgzyGijws…`: @jiffylou98 you are looking at this on the most shallow level of comprehension. …
- `ytc_UgyAwV8aC…`: As a professor, I have NEVER used AI for anything. That would just be lazy.…
- `rdc_d7krwyi`: As an aside, student loans would not be calculated in this study as they do not …
- `ytc_UgwdErKH9…`: Even I’m smart enough to not invent something that can kill me. Turn that crap o…
- `ytc_UgzYA_EBC…`: You meant the vehicle that saved the world is out to kill person an AI does not …
- `ytc_UgwndL1Y7…`: This is great timing. I recently had a conversation with ChatGPT about personhoo…
Comment
> The analysis on whether your job will be replaced by AI is dependent on one thing (that Roman doesn't clearly lay out). The preferences of the demand side of the market.
>
> If you would prefer to receive therapy from a human using AI as an assistant, that is likely to be a job that is conserved. The market will be transformed, and people with less resources will probably opt for a cheap AI therapist but the point remains that if people have human preferences vs AI preferences, those preferences are what will be met.
>
> If you wouldn't care about being driven by a self driving car given it's cheaper and you don't have to deal with the risk of a talkative driver or smelly car, this is what the market will transform into.
youtube · AI Governance · 2025-10-12T11:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwA6IxJVAwlqyz1OyR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgxNbKR6sG4PDXpuoLR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugz4YtpnhQeMGKFtNIp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugy1NKUBoF5KK4WCC8R4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgzLbKF-p16A0L6ye_94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugy_KTxp_ZtiMfTYhsl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgxBAR9_YlSpTiO2X5l4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgzxBbchGyWvZYZ4KJJ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyaHh07RzqhF8QlQbJ4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgzIjoXYcDTuDJyMR414AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"}
]
```
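A batch response like the one above is only usable downstream if every record carries an `id` and each dimension holds a known code. The sketch below is a minimal validator, assuming the allowed values are exactly those visible in this sample (the real codebook may define more categories, and the function name is ours, not the tool's):

```python
import json

# Allowed codes per dimension, inferred from the sample output above
# (assumption: the actual codebook may include additional values).
ALLOWED = {
    "responsibility": {"none", "ai_itself", "developer", "user", "government"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"none", "regulate", "ban", "liability", "unclear"},
    "emotion": {"outrage", "fear", "indifference", "mixed"},
}

def validate_coded_batch(raw: str) -> list:
    """Parse a raw LLM response and reject records with missing ids
    or codes outside the known vocabulary."""
    records = json.loads(raw)
    for rec in records:
        if "id" not in rec:
            raise ValueError(f"record missing id: {rec}")
        for dim, allowed in ALLOWED.items():
            value = rec.get(dim)
            if value not in allowed:
                raise ValueError(f"{rec['id']}: unexpected {dim}={value!r}")
    return records

sample = '[{"id":"ytc_x","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}]'
print(len(validate_coded_batch(sample)))  # → 1
```

Validating before storing the "Coded at" row makes hallucinated categories fail loudly instead of silently polluting the coded dataset.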