Raw LLM Responses
Inspect the exact model output for any coded comment. Look up a comment by its ID, or pick one of the random samples below.
- “You missed the destruction option of manipulation. Humans a pretty stupid and ou…” (ytc_UgzjkPQfa…)
- “Ai is clearly unethical. Just fine the companies trillions of dollars, fine the …” (ytc_UgynDHaoM…)
- “if AI can design,develop, test and deploy... there is no need for an app/system …” (ytc_Ugz0pnDU1…)
- “Maybe this is why they are so desperate to colonize mars, so they can escape the…” (ytc_Ugxlqyyxd…)
- “How does mankind, even today, treat “lower life forms” that threaten its surviva…” (ytc_UgyO8ZH7I…)
- “They are AI’ing jobs. Actually Indians, Artificially Intelligence, Actually Impo…” (ytc_UgxaM7Fo6…)
- “Look at how we created our own pandemic with COVID. Worse than that we punished …” (ytc_UgyYbvWwM…)
- “Emissions caused by training AI models are negligible compared to things like he…” (ytc_UgwgkiNLF…)
Comment

> People picking products WILL eventually be replaced by robots? Why? Because people are a liability. They require vastly diverse resoirces to keep them going. With robots, all the money spent on people for things like 401k matches, subsidised health care, workmans comp, and unemployment contributions go away. Instead, you have a workforce built of technicians who maintain and repair the robots.
>
> As an employer, you dont have to worry aboit all those warm bodies who have personal problems come up. All you do is have backup robots ready to go when one robot goes down. This means maximum uptime and predictable performance out of your picker workforce.

youtube · AI Harm Incident · 2024-09-17T16:2… · ♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_UgytuZszsAi8h2fJBfN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgyRuEu7us1IRA7PdzB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyGl-drU_13J5lqBwt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzL-4-LuR-2RQw74wN4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugzeq9qOXlw6ewCh-b94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxUS1V1ljLWUa9v_vR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgyfenfDyMnLylFmfuZ4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwLJIZMx2aPhPeGpqJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugx7wSpxoQmk37kLH554AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugx1v1GP2xvROdQOBLl4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"mixed"}
]
```
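For downstream use, a raw response like this can be parsed into per-comment records and validated against the coding scheme before anything is stored. Below is a minimal sketch in Python; the `parse_batch` helper is hypothetical, and the allowed value sets are inferred from this single response and the coding-result table above, not from a published codebook.

```python
import json

# Allowed values per coding dimension. Inferred from the sample response
# and the coding-result table above; the real codebook may define more.
CODING_SCHEME = {
    "responsibility": {"company", "user", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"regulate", "liability", "none"},
    "emotion": {"outrage", "resignation", "indifference", "approval", "mixed"},
}


def parse_batch(raw: str) -> dict[str, dict]:
    """Parse one raw LLM response into records keyed by comment ID.

    Raises ValueError if the response is not a JSON array of objects
    with exactly the expected fields and in-scheme values.
    """
    records = json.loads(raw)
    if not isinstance(records, list):
        raise ValueError("expected a JSON array of coded comments")
    by_id: dict[str, dict] = {}
    for rec in records:
        expected = {"id", *CODING_SCHEME}
        if not isinstance(rec, dict) or set(rec) != expected:
            raise ValueError(f"malformed record: {rec!r}")
        for field, allowed in CODING_SCHEME.items():
            if rec[field] not in allowed:
                raise ValueError(
                    f"{rec['id']}: {field}={rec[field]!r} not in scheme"
                )
        by_id[rec["id"]] = rec
    return by_id
```

Looking up a single comment then reduces to a dict access: `parse_batch(raw)["ytc_UgwLJIZMx2aPhPeGpqJ4AaABAg"]` returns the one record in this batch whose values (company, consequentialist, none, resignation) match the coding-result table shown above.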