Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "AI wont be paying taxes and those companies wont replace the fund coming from th…" (ytc_UgzlrbbuW…)
- "who thought it was a good idea to give machines control of weapons? 🤔 even if th…" (ytc_UgwOc2tfL…)
- "I have two years of emergency money saved because I got really scared by AI in l…" (rdc_oi3lak5)
- "AI will be great for the super wealthy, large corporations, and bad for the rest…" (ytc_Ugw5bbQJg…)
- "Like any technological advancement, the result has always been less jobs and job…" (ytc_UgyESh0EA…)
- "So in the future if we are worried about an AI system that becomes smarter than …" (ytc_UgzPtPvkr…)
- "You touched on a valid point. The hallucination problem of LLMs come from being …" (ytc_UgxN6Y34g…)
- "AI is 🔥 like fire — useful if you cook it, dangerous if you set it on fire.…" (ytc_UgwmiACJs…)
Comment
You don’t care. Not one politician cares about AI and how it will take over 90% of jobs. This has been a concern since the 90s and now it’s too late. The government just authorized unrestricted AI development. You also don’t want new manufacturing here or else you would have done it and voted for it when funding was allocated by the last two presidents. Stop acting like you care.
youtube · AI Jobs · 2025-10-14T12:3… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | deontological |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
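Each row in this table is read straight from the matching entry in the raw model output shown below; only the "Coded at" timestamp is added by the pipeline. As a minimal sketch (assuming a Python pipeline; the class and function names here are hypothetical, the field names come from the raw response), one parsed JSON entry maps onto such a record:

```python
from dataclasses import dataclass
from datetime import datetime, timezone


@dataclass
class CodingResult:
    """One coded comment: the four coding dimensions plus a coding timestamp."""
    comment_id: str
    responsibility: str
    reasoning: str
    policy: str
    emotion: str
    coded_at: str


def to_coding_result(entry: dict) -> CodingResult:
    """Build a CodingResult from one object in the model's JSON array.

    The keys (id, responsibility, reasoning, policy, emotion) match the
    raw response shown below; coded_at records when the coding was stored.
    """
    return CodingResult(
        comment_id=entry["id"],
        responsibility=entry["responsibility"],
        reasoning=entry["reasoning"],
        policy=entry["policy"],
        emotion=entry["emotion"],
        coded_at=datetime.now(timezone.utc).isoformat(),
    )
```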
Raw LLM Response
[{"id":"ytc_UgwQdKqnwNdwKJiIChR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugy2RKNzbuafQSWDwp94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwrHpSyd-kFXvFHiWJ4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyomiJ1PaGx-XZUhHt4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxkluglzcRll9pvfp94AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgyaKEUZLGKykQhZKgJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_Ugym4zZcAXNTFB1fq754AaABAg","responsibility":"government","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyjF8kJLht_-8jiA-F4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugw7HI5jeSrq7Cj3vdF4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugxi5Yt-zjfeHvn8m8d4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"}]