Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- @rpk321 The video title is literally "If Ai Takes All of Our Jobs... Who's Going… (ytr_Ugwedde_o…)
- I fully support Elon on this! Unfortunately, profit maximization is the only thi… (ytc_UgzdyDIlb…)
- I stopped glazing my own art because after messing around with certain AI genera… (ytc_Ugwu1L4Iu…)
- On surface level, a lot of AI art looks good, but if you pay attention, you see … (ytc_UgyaPmb2b…)
- @darkwitnesslxx yes but i would be putting my own effort and knowledge into it a… (ytr_UgzbeR0Hr…)
- Learning to use AI tools feels empowering until you realize what Selwyn Raithe's… (ytc_Ugwr7-7if…)
- This paper looks poorly executed. They're saying that ChatGPT adds formatting in… (rdc_jskabl2)
- liberal sellouts on youtube telling people not to use youtube because it promote… (ytc_Ugz78huPy…)
Comment
What about just the way humans are compared to robots human logic, human emotion, I think people rather deal with humans than dealing with AI. I think most people will draw a line. Maybe not the 2% that doesn’t care about anything but just constantly producing an absurd amount of money all the time at any cost. But the 98% that makes up the rest of the world. I don’t believe we will take a stand for themselves, which makes up the majority of the world and I believe they will take a stand until laws are passed to protect them from AI in the future.
youtube · AI Governance · 2025-09-13T15:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | virtue |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id": "ytc_UgwBjyXokjclB9P0KOx4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgyC9B4Iildx4x1mkX94AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgzsHrLs4O9CCR_3E414AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugxpu7f8tbc6lev_ReN4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgyyOA0ZjlJMpHWC1L14AaABAg", "responsibility": "company", "reasoning": "virtue", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgyD5ZeCUPXjauompkx4AaABAg", "responsibility": "government", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgxUNWoCjh3nlPHlc_F4AaABAg", "responsibility": "government", "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_Ugw-tfBJMl6lNQdRUkB4AaABAg", "responsibility": "developer", "reasoning": "virtue", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_Ugy3Gad_e-AJdeh2oOt4AaABAg", "responsibility": "distributed", "reasoning": "contractualist", "policy": "regulate", "emotion": "approval"},
  {"id": "ytc_UgyKDfuh0XoQFlmpeKV4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"}
]
```
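The raw response is a flat JSON array of per-comment records keyed by `id`, so a lookup by comment ID reduces to parsing the batch and indexing it. A minimal sketch of that step is below; the function name, the abbreviated sample records, and the skip-malformed-records policy are illustrative assumptions, not part of the tool itself.

```python
import json

# Abbreviated stand-in for a raw batch response like the one shown above:
# a JSON array of per-comment codes, one object per comment ID.
raw_response = """[
  {"id": "ytc_UgwBjyXokjclB9P0KOx4AaABAg", "responsibility": "unclear",
   "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgyyOA0ZjlJMpHWC1L14AaABAg", "responsibility": "company",
   "reasoning": "virtue", "policy": "unclear", "emotion": "mixed"}
]"""

# The four coding dimensions visible in the result table above.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codes(raw: str) -> dict:
    """Parse a batch response and index the coded dimensions by comment ID."""
    indexed = {}
    for rec in json.loads(raw):
        # Skip records missing the ID or any dimension rather than
        # failing the whole batch (an assumed error-handling choice).
        if not all(key in rec for key in ("id", *DIMENSIONS)):
            continue
        indexed[rec["id"]] = {key: rec[key] for key in DIMENSIONS}
    return indexed

codes = index_codes(raw_response)
print(codes["ytc_UgyyOA0ZjlJMpHWC1L14AaABAg"]["responsibility"])  # company
```

Indexing by ID once, rather than scanning the array per lookup, matches the "look up by comment ID" affordance above and keeps repeated inspections O(1).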