Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
Before AI gets to smart can't we plant a system to where they can destroy the AI…
ytc_Ugymt1TfH…
We, Humanity need Pro Human Laws. We need to outlaw 😈 AI from taking human form.…
ytc_UgxRLltMR…
High quality education for all is one of the most important issues of our time! …
ytc_Ugy7eXWqj…
AI is programmed, how can it possibly be intelligent? That is also impossible, b…
ytc_UgwZQ_vf-…
Musk has no moral compass. He bought the US election, gained access to confide…
ytr_UgyYH4TEm…
So human combatants can retrofit the weapon if it's autonomous frame is compromi…
ytr_UgyK3_5Xd…
Well it's 2026 in a month. The AI engineers are nowhere to be seen. Because LLMs…
ytc_UgyjTOkSi…
The irony of my first view of this video being preceded by an ad for some AI pro…
ytc_UgyO-eLTl…
Comment
In my opinion… I just don’t agree that AI is “opening up” new jobs. It’s taking away jobs rather than helping to open up jobs and helping people out with jobs. I just don’t agree with that. Specially because the people who are losing their jobs to AI are not getting helped with their finances. People need to work in order to keep their homes. To feed themselves and their families. Without work, more people can and will become homeless. That’s just the reality of it… and yes AI can be helpful and work well in some work place scenarios but it shouldn’t be replacing people..
youtube
2024-04-26T03:1…
♥ 48
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
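The coding result above follows a fixed four-dimension schema. A minimal validation sketch in Python, assuming the allowed values per dimension are exactly those that appear in the codings on this page (the actual codebook may define additional categories):

```python
# Allowed values per dimension, inferred only from the codings shown on
# this page; the real codebook may include categories not seen here.
SCHEMA = {
    "responsibility": {"company", "developer", "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"outrage", "fear", "disapproval", "mixed", "indifference"},
}

def validate(coding: dict) -> list:
    """Return a list of schema violations (empty if the coding is valid)."""
    errors = []
    for dim, allowed in SCHEMA.items():
        value = coding.get(dim)
        if value not in allowed:
            errors.append(f"{dim}: {value!r} not allowed")
    return errors

# The coding from the table above:
coding = {"responsibility": "company", "reasoning": "consequentialist",
          "policy": "regulate", "emotion": "outrage"}
print(validate(coding))  # → []
```

A check like this is useful because LLM output is not guaranteed to stay inside the codebook; flagging off-schema values before they reach the database keeps the coded corpus clean.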
Raw LLM Response
```json
[
  {"id":"ytc_UgxM-W8lE_0eCirikT14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgzgphPQJIW5fHj2BgV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugz86CgaDnfLxzWIzZ94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgykTL7m3nCzKIKe5XJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgyJ5bNLKqESC3YDoCF4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugw49MlA0HCY1saPtaZ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgxLfc3u6E5I_f4Q0NR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxG0P1GKUHa3edfSsF4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"disapproval"},
  {"id":"ytc_Ugxz0S2OC7YFftVaw0x4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgyCvVnPDMZnnIf8s1p4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
```
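The raw response is a JSON array covering a whole batch of comments, so retrieving one coding means indexing it by ID. A minimal sketch, assuming the response parses cleanly as JSON (the `index_by_id` helper is hypothetical, not part of this tool; the sample row is the first entry of the response above):

```python
import json

# Sample: the first entry from the raw batch response shown above.
raw = ('[{"id":"ytc_UgxM-W8lE_0eCirikT14AaABAg",'
       '"responsibility":"company","reasoning":"consequentialist",'
       '"policy":"regulate","emotion":"outrage"}]')

def index_by_id(raw_response: str) -> dict:
    """Parse the model's JSON array and key each coding by its comment ID."""
    return {row["id"]: row for row in json.loads(raw_response)}

codings = index_by_id(raw)
print(codings["ytc_UgxM-W8lE_0eCirikT14AaABAg"]["policy"])  # → regulate
```

This mirrors the "Look up by comment ID" workflow above: once the batch is indexed, any coded comment's dimensions can be fetched in constant time.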