Raw LLM Responses
Inspect the exact model output for any coded comment, or look one up directly by its comment ID.
Random samples

- "AI simply doesn't think like humans do and because of that it can't be inspired …" (ytc_UgzfmpQwh…)
- "I'm not scared of AI I am scared of psychopathic religious billionaires controll…" (ytc_UgzNE_awi…)
- "Sorry but, is that any different from whats happening now and always has? Its ju…" (ytc_UgyEaqF_R…)
- "KAMA_THE_ONE Alright, let's rephrase it then. Artists can create art, and make i…" (ytr_Ugx54SeS7…)
- "I found out a weird thing abt chatgpt. First of all, say tell me a joke about wo…" (ytc_UgwTjrU_x…)
- "If you truly are deluded to think chatgpt is good enough for taking over jobs at…" (ytc_Ugwuc8tQU…)
- "A.I. trying to calculate our next evolutionary step and then implementing a plan…" (ytc_UgxTPte62…)
- "if ai art makes him feel artistically satisfied why is he nearly crying when arg…" (ytc_Ugz4bBWOf…)
Comment
AI has been used in the medical community for years to cross reference all current medications available for side effects. The purpose of this cross referencing is to implement current drugs for patients' diagnoses that have not been responding to any of the typical medications prescribed for that condition. Rather, the drug is used for its side effects. It takes 15 years and approximately 15 million dollars, often more, to develop a new drug, so the safety records are there, as clinicians wait for new drugs to be developed.
Also, I have doctors, literally Google questions during my appointments. Not really any different. AI just compiles what others have input into it.
Source: youtube · AI Jobs · 2024-04-20T01:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_UgwflHHtP-pece5mKRx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyIViff7y7rzEwXwYB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwjN3diKje_XIuWoK94AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgyF6VJdpyhDNzFsotR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwB4be1eWtiAFoch2d4AaABAg","responsibility":"none","reasoning":"unclear","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugw7dA7vNRY1m76qk_d4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugzj0AuApOz8kah0hNx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugwy9fgiKs_kESqSpXZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzeX2yb6T2rGybxeVR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwDkVPBfA_1tZfk5TV4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
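The "look up by comment ID" step above can be sketched in Python: parse the raw JSON response and index the records by their `id` field. The two records below are copied from the response above; the function name `index_by_comment_id` is a hypothetical helper, not part of the tool itself.

```python
import json

# Raw LLM response in the format shown above: a JSON array of coded comments.
# (Two records copied from the response above, for illustration.)
raw_response = """
[
  {"id": "ytc_UgwflHHtP-pece5mKRx4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgwDkVPBfA_1tZfk5TV4AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "liability", "emotion": "outrage"}
]
"""

def index_by_comment_id(response_text: str) -> dict:
    """Parse a raw coding response and index each record by its comment ID."""
    records = json.loads(response_text)
    return {record["id"]: record for record in records}

coded = index_by_comment_id(raw_response)
print(coded["ytc_UgwDkVPBfA_1tZfk5TV4AaABAg"]["emotion"])  # → outrage
```

A dict keyed on `id` makes the lookup O(1) per comment, which matters once a batch response holds thousands of coded records rather than the ten shown here.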