Raw LLM Responses
Inspect the exact model output for any coded comment. Look up a comment by its ID, or browse the random samples below.
- "AI is going to live with us in this world and they have no boundaries of space a…" (ytc_UgwXKWxvW…)
- "This man impressed me with his interview. I do believe we are getting to the …" (ytc_UgzeTJsRG…)
- "The creators are convinced Ai will destroy humanity but build it anyways. Ai wi…" (ytc_UgyaJSccc…)
- "In practice, getting unpaid interns to write stuff is way more expensive than ge…" (rdc_j6eyusr)
- "Please wake up already. I’ve got a way to keep a super AI under control TODAY. N…" (ytc_UgxuX2zB-…)
- "As someone who doses 3D art and doesn’t know how to do 2d art since I quit 2D. I…" (ytc_UgxUjEdmg…)
- "As I understand it, this "global gag rule" stops funding for abortions overseas.…" (rdc_dcx8dwd)
- "seems he didn't have enough brain cells in the first place to realize: you shoul…" (ytc_UgzbS8ubk…)
Comment

> Complaining because a machine can take your job is a pointless exercise. If your job can be replaced by a robot, that's YOUR fault for not recognizing the danger and working to neutralize it. This argument is no different from the one coming from minimum wage workers complaining they're not paid a living wage; if someone can replace you with nothing more than a week of training (or less), then YOU have made a poor choice and will be subject to the consequences of that choice, one of which is that you will never make decent money because virtually ANYONE can replace you. It's because you don't bring any value to the equation that you're underpaid or being replaced by a robot. If the only thing keeping you in your job is government regulation preventing a company from innovating a tech solution that might replace you, you're the problem, not the solution.

youtube · AI Jobs · 2025-05-28T21:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | virtue |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgyUkdSatRFPcSQM6iB4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_Ugzulx1DFoby2LTOkH54AaABAg","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgyBOj2euHePhDxBpZ94AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgxPl60eYxCGXaMm03B4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyqBKMh4ZoDNwHNzKZ4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxwH0Wx__eSPqmRsoh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugz_WVP7DbX4DE8zs6R4AaABAg","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgzgJPjRCk3ZbCrW71V4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxnnNhfE3dQt8QKhvt4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzCd309O6ul6cgHVod4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"}
]
```
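The raw response is a JSON array of per-comment records, one object per coded comment. A minimal sketch of the lookup-by-ID step, assuming only the field names visible in the response above (`id`, `responsibility`, `reasoning`, `policy`, `emotion`); the value sets shown are the ones observed in this sample, not an exhaustive schema:

```python
import json

# Values observed in the sample response above; illustrative, not a full schema.
OBSERVED_VALUES = {
    "responsibility": {"developer", "company", "user"},
    "reasoning": {"virtue", "deontological", "consequentialist"},
    "policy": {"none", "unclear", "liability", "regulate"},
}

def index_by_id(raw_response: str) -> dict:
    """Parse a raw LLM response (a JSON array of coded comments)
    and index the records by comment ID for lookup."""
    records = json.loads(raw_response)
    return {rec["id"]: rec for rec in records}

# One record copied from the response above, used as sample input.
raw = '''[
  {"id": "ytc_UgyqBKMh4ZoDNwHNzKZ4AaABAg", "responsibility": "user",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"}
]'''

coded = index_by_id(raw)
rec = coded["ytc_UgyqBKMh4ZoDNwHNzKZ4AaABAg"]
assert rec["responsibility"] in OBSERVED_VALUES["responsibility"]
print(rec["emotion"])  # -> indifference
```

Indexing by ID makes each lookup O(1), which matters when cross-referencing many coded comments against their raw responses.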