Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a comment by ID, or browse the random samples below.
- "To me it just looks an increase in sharpness, contrast, and motion smoothing 🤷♂…" — ytc_UgyxvcP6a…
- "Artificial Intelligence will work as long as nuclear power countries not decide …" — ytc_UgxjUC4NE…
- "yeahhhh no. art is a skill bro just pick up a pencil. EVEN IF YOU CANT DRAW I WO…" — ytr_UgyOPNdOB…
- "This is all false about the AI, it all depends in how the users choose to make u…" — ytc_UgzmBMiWy…
- "The agent crushed setup then fell apart on integration, that’s exactly how AICar…" — ytc_UgzXvISYo…
- "Just remember: Even if AI designs systems and infrastructures, someone must app…" — ytc_UgxVZli74…
- "I love tech, specifically hardware, & this garbage makes saying that difficult, …" — ytc_UgyZfDk8i…
- "I hope so. Being a CS rep is an awful job. It’s tedious and people are verball…" — ytc_UgwcSO3Nd…
Comment
Medicine is based on algorithms, but people are not. The nuances and subtleties of interactions between patients and doctors are essential for obtaining accurate information beyond what the patient initially provides. This requires trust, the art of medicine, and a human touch. While AI will undoubtedly assist in these processes, entirely replacing humans would be unethical, especially given the current dynamics of ownership by companies and concerns about patient privacy. When AI is involved in a situation that results in a patient's death, questions arise: Was it intentional, an accident, or a malfunction? Who is liable? These concerns will likely delay the licensing of AI in the medical field for a significant time.
youtube · AI Harm Incident · 2025-07-21T04:5… · ♥ 75
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | deontological |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgxgUlUf_zIYuHla4T14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxvtSOLZoJHe54V9-Z4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzhY0d_AcMRcWJmL4N4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwhBiaRV9E2qG6jmnR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxqrIvKAiq0gEOiU4V4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzMkLNEuWCWQJKdV_p4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugzh_UQXWTwn8ohHswd4AaABAg","responsibility":"none","reasoning":"none","policy":"none","emotion":"approval"},
{"id":"ytc_UgyvNC_nmcji5QQ4imR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwfCBjf5AHE8ka9ei14AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwpGwvdtW1dNE_tZ7N4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"mixed"}
]
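For illustration, a raw batch response like the one above can be parsed with a few lines of Python to recover the coding for a single comment ID. This is a minimal sketch, not the project's actual lookup code: the function and variable names are hypothetical, and the response string is abbreviated to the last entry of the array above.

```python
import json

# Hypothetical sketch: parse a raw batch response (a JSON array of
# per-comment codings) and look up the coding for one comment ID.
# Abbreviated here to the final entry of the response shown above.
raw_response = """[
  {"id": "ytc_UgwpGwvdtW1dNE_tZ7N4AaABAg",
   "responsibility": "ai_itself", "reasoning": "deontological",
   "policy": "none", "emotion": "mixed"}
]"""

DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def lookup_coding(raw, comment_id):
    """Return the coded dimensions for comment_id, or None if it is absent."""
    for entry in json.loads(raw):
        if entry.get("id") == comment_id:
            return {dim: entry.get(dim, "none") for dim in DIMENSIONS}
    return None

print(lookup_coding(raw_response, "ytc_UgwpGwvdtW1dNE_tZ7N4AaABAg"))
# -> {'responsibility': 'ai_itself', 'reasoning': 'deontological',
#     'policy': 'none', 'emotion': 'mixed'}
```

For the comment shown above, the returned dictionary matches the values in the Coding Result table.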