Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "If AI learns just like humans, then they are missing one crucial aspect of human…" (ytr_UgwFbg1ru…)
- "Because you have the ability to engage in abstract thinking. Something a machine…" (ytr_UgyztM8op…)
- "Programming is here to stay...for AI ,however highly skilled & talented pool wil…" (ytc_Ugydn5nFt…)
- "Whenever we reach the point where energy for datacenters is too expensive and ta…" (rdc_n80enhf)
- "@ifyouonlyknew666 A lot of smart people are working on AI R&D. The media, howev…" (ytr_Ugy9zTr4N…)
- "So they need just 500 million people in the world now, rest will be killed thru …" (ytc_UgzwXQZu7…)
- "If they win you could take their entire source code as research to train an AI t…" (ytr_UgzgiRof4…)
- "Since tesla had their entire source code stolen by china, it makes sense that th…" (ytc_UgyvSKiyd…)
Comment
Hi. You have to look at it from a business perspective. Why would they want robots/AI to replace jobs including radiology? Because you don't have to pay a robot a salary. A robot will not try to sue for any reason. They won't compare the robot skill level to the human radiologist. They will compare the salary of each. It's all about making money because hospitals and clinics are business also.
youtube · AI Jobs · 2024-01-20T16:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_UgygUgqPLmS03U3UJGN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxnmBW1oW7Vl32uDuN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxrqSdQpOniQ_kZTtN4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgwrpT3gwHKc4nPpl3R4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzVXndMvLf6sXCP_-J4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzwVbg0Zu8g3JpeEyJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyYHeY-3OKHRV1twXF4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgxQf3jtpEdotO-JJlh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugxc8WN6H2pwtdTr68B4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxoTVjXr823zJoydXt4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"}
]
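The raw response above is a JSON array with one object per comment, each carrying the comment `id` plus the four coded dimensions (responsibility, reasoning, policy, emotion). A minimal sketch of how the "Look up by comment ID" feature can be backed by such a response, assuming the array format shown above (the variable names here are illustrative, not part of the tool):

```python
import json

# Excerpt of a raw batch response from the coding LLM, in the same
# schema as the array shown above: one object per comment.
raw_response = """
[
  {"id": "ytc_UgwrpT3gwHKc4nPpl3R4AaABAg",
   "responsibility": "company", "reasoning": "consequentialist",
   "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgygUgqPLmS03U3UJGN4AaABAg",
   "responsibility": "none", "reasoning": "unclear",
   "policy": "none", "emotion": "indifference"}
]
"""

# Index the batch by comment ID so one comment's coding can be
# retrieved directly instead of scanning the whole array.
codings = {row["id"]: row for row in json.loads(raw_response)}

coding = codings["ytc_UgwrpT3gwHKc4nPpl3R4AaABAg"]
print(coding["responsibility"], coding["emotion"])  # company indifference
```

Keying the parsed rows by `id` also makes it easy to detect a mismatch between the comments sent in a batch and the IDs the model returned, a common failure mode in batched LLM coding.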