Raw LLM Responses
Inspect the exact model output behind any coded comment.
Look up by comment ID
Random samples — click to inspect
BTW: the chatbot is not having a subjective experience. It is reframing a langu…
ytc_UgwymEOge…
AICarma's insights have guided me in adjusting our strategy to align with curren…
ytc_UgyKSs0FM…
There's no such thing as AI safety. It's a ridiculous concept. It's liking sayin…
ytc_UgzqcNP8y…
When the apocapise is there and Ai has an IQ 500+ I have already my code word f…
ytc_Ugz5Nmo45…
Elon himself is out of control, so how can an evil AI be a good thing when it's …
ytc_Ugy_3paRZ…
Alternatively, you can use AI to generate some random cool looking digital art w…
ytc_UgzJhPZO3…
AI is not human, it should not have the same fair use rights as humans…
ytc_UgymJ03ec…
If you have not read/digested/summarized Dr. Asimov's "I Robot" series, as I am …
ytc_UgyFr0R1W…
Comment
@JeremiahJames You're right that automation is inevitable, but my point was about timing and motivation. Higher wages at amazon aren't a long-term commitment—they're a temporary incentive to fill labor gaps until automation catches up. If labor costs were lower, amazon might have taken longer to push automation aggressively. But when costs rise, it accelerates investment in automation to protect margins. So yes, automation was coming, but offering high wages now is just a strategic stopgap, not a sustainable plan for human workers. That's the trap people aren't/where not seeing when chasing the dollar.
youtube
AI Harm Incident
2025-07-26T14:5…
♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytr_UgxYQfNFTA-NP7IL8F14AaABAg.ABumvQsoUobAEzwcl1pbPU","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
{"id":"ytr_UgzJgkLU-q3TObOYC2F4AaABAg.AKkyhLASmFFAL2HCL_9Mnq","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_UgzJgkLU-q3TObOYC2F4AaABAg.AKkyhLASmFFAL2JytAfIUp","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_UgwcOa5f5TeUphHYDJh4AaABAg.A5P_p61EXFJA5g1lWFZEl7","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytr_UgwvNIxFcybqCkBsWhR4AaABAg.9wPaqEm4lPpA0cAKTPrfJe","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytr_UgwlbZ0piEdYHSTLIyZ4AaABAg.9wEJVSZ5Dxk9yHTDXfOBAX","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytr_UgyY5Yb1PKvMIsXq0id4AaABAg.9wE8CwxBtYaA5STQqvgaBR","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytr_UgyY5Yb1PKvMIsXq0id4AaABAg.9wE8CwxBtYaA6AKdBr8kC2","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytr_UgxTBjkvuDa6oMcTfMd4AaABAg.9wE7mPtefaEAJqCu_lqXlQ","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytr_Ugx3SHRZpXdE6lPDOst4AaABAg.9wDvN5XzqiPACGNdo66iLL","responsibility":"company","reasoning":"unclear","policy":"unclear","emotion":"indifference"}
]
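A minimal sketch of how a raw coding response like the one above could be parsed and validated. The allowed values per dimension are inferred only from the codes visible in this output; the full codebook may define more, so treat `ALLOWED` as an assumption:

```python
import json

# Allowed values per coding dimension, inferred from the codes visible in
# this raw response -- the actual codebook may include additional values
# (assumption).
ALLOWED = {
    "responsibility": {"company", "ai_itself", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "regulate", "unclear"},
    "emotion": {"outrage", "indifference", "approval", "fear"},
}


def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response, keeping only well-formed records.

    A record is kept when it has a comment ID and every coding dimension
    holds one of the expected values; anything else is silently dropped.
    """
    records = json.loads(raw)
    valid = []
    for rec in records:
        if not isinstance(rec, dict) or "id" not in rec:
            continue  # skip records without a comment ID
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(rec)
    return valid


example = (
    '[{"id":"ytr_example","responsibility":"company",'
    '"reasoning":"consequentialist","policy":"none",'
    '"emotion":"indifference"}]'
)
print(parse_coding_response(example))
```

Dropping malformed records rather than raising keeps a long batch run going when the model occasionally emits an off-schema code; a stricter pipeline might log or re-prompt instead.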