Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- ytc_UgwICj-xy…: "I'm sorry, but are we here pretending authors aren't inspired by each other? No …"
- ytc_Ugz5La7vo…: "4:53 if an AI can learn as a human, well first off it needs to learn some damn m…"
- ytr_UgzmOBEEA…: "I call BS on that comment, hundreds of millions of people in the USA driving eve…"
- ytc_UgzaWaO_G…: "AI saves me at least 5 hrs a week — can’t imagine working without it now ⚡️…"
- ytc_Ugz1Qnsst…: "Most people who like AI are 100% not going to be in the small group of the AI ow…"
- ytc_UgwMwJexY…: "Humans, motivated by money, greed, and power. And AI already knows this and if i…"
- ytc_Ugx9DTK-n…: "Is A.I quantum computers? who gives them the right? so called serves humanity, …"
- ytr_UgzIVbiib…: "No doubt it will clear a lot of red tape. (which is not necessarily a bad thing)…"
Comment
It’s a technological revolution just like the industrial revolution. This is where UBI comes into play. Funny thing is I had to ask PI (AI assistant) to remind me about our conversation on UBI. He reminded me, but wanted to know why I was interested in it again. All I told him was that I was listening to your video and he automatically figured out what video I was watching and said he agreed with you. Think about this and how scary it is, that an AI assistant agreed with you and feels the same way! I’m talking about an app on my phone…It’s a computer program!
Source: youtube · AI Jobs · 2025-12-02T05:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | regulate |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugyqk8ig2LWpdVECa6h4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgyNLvdVFHwfT7_JV2V4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugxuy9jLjrrdc6CU4a94AaABAg","responsibility":"none","reasoning":"mixed","policy":"regulate","emotion":"approval"},
  {"id":"ytc_Ugw06mkD_L7ol4HfLYR4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwyaXFJk38RAwut4eh4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"},
  {"id":"ytc_UgziGs77EL1dVPz9uvt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugw7eEu2qW2tvKVT8NR4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_Ugzh3xDSP9pAjH3KkYF4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ugy_3SfNPoMPq-HljVh4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"liability","emotion":"mixed"},
  {"id":"ytc_UgyPoNiSjihwwl3TxyR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"outrage"}
]
```
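The raw response above is a JSON array with one object per comment, carrying the comment ID plus the four coded dimensions. A minimal sketch of how such a batch response could be parsed and validated is shown below; the function name, and the category sets (inferred only from the values visible on this page, not from the tool's actual codebook), are assumptions for illustration:

```python
import json

# Allowed values per dimension, inferred from the responses shown above.
# The real codebook may define additional categories.
SCHEMA = {
    "responsibility": {"company", "none", "distributed", "government", "ai_itself", "developer"},
    "reasoning": {"consequentialist", "mixed", "deontological", "contractualist", "virtue"},
    "policy": {"liability", "none", "regulate", "ban"},
    "emotion": {"fear", "indifference", "approval", "resignation", "outrage", "mixed"},
}

def parse_llm_response(raw: str) -> dict:
    """Parse a raw LLM coding response and index it by comment ID.

    Raises ValueError if an entry is missing a dimension or uses a
    value outside the expected category set.
    """
    coded = {}
    for entry in json.loads(raw):
        cid = entry["id"]
        for dim, allowed in SCHEMA.items():
            if entry.get(dim) not in allowed:
                raise ValueError(f"{cid}: bad value for {dim!r}: {entry.get(dim)!r}")
        coded[cid] = {dim: entry[dim] for dim in SCHEMA}
    return coded

# Hypothetical single-entry response, for demonstration only.
raw = '[{"id":"ytc_x","responsibility":"none","reasoning":"mixed","policy":"regulate","emotion":"approval"}]'
print(parse_llm_response(raw)["ytc_x"]["policy"])  # regulate
```

Indexing by ID makes the "Look up by comment ID" view a plain dictionary lookup, and the validation step catches any response where the model drifts outside the codebook.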