Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples:

- "If AI takes over virtually everything, please tell me what is the point of sendi…" (ytc_UgyannqBU…)
- "We found the sane person in every comment section. A AI is made to just mimic so…" (ytr_Ugy7V_DN_…)
- "By harnessing real-time data processing capabilities at the edge, farmers can ma…" (ytc_UgzJsARrG…)
- "It’s just repetitive. I don’t really care if it’s human made or AI. Plus, if you…" (ytc_UgxVr_b6_…)
- "The answer (that this guy won't tell you), Theo - is that a few will become very…" (ytc_UgxN-yuWo…)
- "I love how humans need each other, to thrive. But countries, governments, corpor…" (ytc_Ugyv0iEr8…)
- "Just tell everyone out of a job to buy a couple shares of GameStop problem solve…" (rdc_gkq8vk5)
- "And i'm telling you RIGHT NOW! That "woman" RIGHT THERE IS NOT REAL! AND I DONT…" (ytc_UgxGPJShm…)
Comment
people need to calm down with the AI panic. AI will take care of the tedious work, the memorization, even part of analysis. but for as long as your end user / patient / consumer is a Human Being, you will ALWAYS need a Human Interface, that's what future human jobs will be. a patient can always read a diagnosis and goodness knows how well they understand (or don't understand / misunderstood) what they read, but it's always better for a patient to talk to another human being who understands what a written diagnosis is saying--hence the Healthcare Professional. also machines are machines, it's not human, machines doesn't understand what it's like to be human in a human body, that's where the added value of the Human Being Professional is gonna be badly needed. and don't worry this isn't limited to Healthcare, it applies to many existing industries (Technology, Commerce etc) where you have humans dependent on one thing or another.
Source: youtube · AI Harm Incident · 2024-05-31T15:0… · ♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyeRIUzWibLNMrEnQd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugyx60GwAmPsAACY1-R4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxHlgO2QKiD6XKFUo94AaABAg","responsibility":"company","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwFuP_2V-b8gy5017d4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwyTaQbGoahOm5Q5RV4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgyBlBOAfDnOr4_wim14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugy9b7IQyWfxsPsVeWF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugx6Nf-jAVJkaIWrX2h4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyKpp4tSrs88CGZ8zR4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugx4rC1BorCZ5sAmW814AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
```
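The raw response above is a plain JSON array of per-comment coding records, so the "look up by comment ID" step reduces to parsing it and indexing on the `id` field. A minimal sketch, using two records taken verbatim from the response shown above (the helper name `index_by_comment_id` is illustrative, not part of any tool's API):

```python
import json

# Raw LLM response: a JSON array of per-comment codings.
# The two records below are copied from the response shown above.
raw_response = """[
  {"id":"ytc_UgyeRIUzWibLNMrEnQd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugyx60GwAmPsAACY1-R4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"}
]"""

def index_by_comment_id(response_text: str) -> dict:
    """Parse the model output and index each coding record by its comment ID."""
    return {record["id"]: record for record in json.loads(response_text)}

lookup = index_by_comment_id(raw_response)
print(lookup["ytc_Ugyx60GwAmPsAACY1-R4AaABAg"]["emotion"])  # prints "fear"
```

If the model ever returns malformed JSON, `json.loads` raises `json.JSONDecodeError`, which is worth catching so a single bad batch does not halt the coding run.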