Raw LLM Responses
Inspect the exact model output for any coded comment. Look up a comment by its ID, or click one of the random samples below to inspect it.
- "People will just be cast aside by employers and the bought and paid for governme…" (ytc_UgzAhqR2n…)
- "Irony: looking to governments for safety. Governments, titan corporations, oliga…" (ytc_UgxRMlkPW…)
- "Agreed, AI makes a LOT of mistakes. Also, when Jensen Huang (CEO of Nvidia) say…" (ytc_UgzlP3Xm1…)
- "Oh great. One of the first things to build into AI will be pain. That's sarcasm,…" (rdc_dy5r4tq)
- "It's not just the massive investment, the energy to run AI and water (for coolin…" (ytc_UgzMOiNM8…)
- "What if humanity didn't create AI rather ai made humans aware of a larger singul…" (ytc_UgxUXU6n9…)
- "Nobody asked for powerful ai models it's just killing jobs especially the fun cr…" (ytc_Ugyp272u4…)
- "This is just to scary here( replacing humans with robots ) This is a damn shame …" (ytc_UgxiUHzf3…)
Comment
If 40 of people with jobs lose their jobs to AI and robots without a lifeline we can assume most of them are men because 67.6 percent of the workforce is male.
We are talking about fathers with families to support, bachelors who had nothing to lose but their jobs, suitors who need an income to be seriously considered, home owners that’ll lose the property they were working their entire lives to pay off.
Listen there are more big guns in the USA than people.
The real thing to worry about is the extermination of people who’ll have nothing left. If you think the government sincerely cares about human life more than their fat stacks then you drank too much koolaide.
Source: youtube · AI Harm Incident · 2024-12-17T18:4… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyhZbphqFwl9LApV9d4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgyzMUYOHC5gt1SKwdZ4AaABAg","responsibility":"user","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugw9VgQMFCxdEYFznG94AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyLWgg3EEqEejXcvp94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugz2xW2oyQcY0Tp0zPR4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
  {"id":"ytc_UgxWx-LMjCUQm-o7swp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwlvckmRIFBfB6Ab5J4AaABAg","responsibility":"company","reasoning":"deontological","policy":"industry_self","emotion":"indifference"},
  {"id":"ytc_Ugz6RNFLfb5eolvTYiJ4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"regulate","emotion":"approval"},
  {"id":"ytc_Ugw3oeSzl24EXzqlFlt4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyOgwkC6483VHliC014AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
```
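A response like the one above can be parsed and sanity-checked before the codes are stored. The sketch below is a minimal validator, assuming each dimension only takes the values actually observed in these responses; the real codebook may define additional categories, and the `validate_response` helper and its sample input are hypothetical names for illustration.

```python
import json

# Allowed values per dimension, inferred from the raw responses shown above.
# Assumption: the actual codebook may include categories not seen here.
SCHEMA = {
    "responsibility": {"company", "user", "distributed", "none", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "mixed"},
    "policy": {"regulate", "liability", "industry_self", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "resignation", "indifference"},
}

def validate_response(raw: str) -> dict:
    """Parse a raw LLM response and index well-formed records by comment ID."""
    records = json.loads(raw)
    coded = {}
    for rec in records:
        cid = rec.get("id")
        # Reject records whose value for any dimension is outside the schema.
        bad_dims = [dim for dim, allowed in SCHEMA.items()
                    if rec.get(dim) not in allowed]
        if cid and not bad_dims:
            coded[cid] = {dim: rec[dim] for dim in SCHEMA}
    return coded

# Hypothetical one-record response, in the same shape as the output above.
raw = ('[{"id":"ytc_example","responsibility":"distributed",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]')
coded = validate_response(raw)
print(coded["ytc_example"]["policy"])  # prints "regulate"
```

Indexing by comment ID matches the lookup the page offers ("Look up by comment ID"): a coded comment such as the one displayed above can then be fetched directly from `coded`.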