Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- `ytr_UgwXoWFVa…`: That is the same model used by Meta, for one, where links off site are depressed…
- `ytc_UgwQADV22…`: At least in the matrix they need us alive, this is way more grim, this is that r…
- `ytc_UgygA3vsI…`: hehe, the next 5 years will determine how the future is going to look like for H…
- `rdc_lgryg7h`: I don't feel like that's the complete answer here. I worry what's going to happe…
- `ytc_Ugx8fB15w…`: Ah yeah, the "elon musk fears" that drove him to create the most unhinged ai of …
- `ytc_UgxA-mTRB…`: As someone who hates ai, I honestly have to agree, I've practiced for years and …
- `ytc_UgzjFfrrO…`: Can you really judge a Robotaxi system that is just getting started? Ps: there i…
- `ytc_UgxIr8x1o…`: Yea. Didn't take long time of using it in actual reality to realize that its no …
Comment
Using AI to help with diagnosis and to suggest various treatments is an amazing idea. How many people have died from a doctor prescribing something and being unaware of a drug interaction? An AI could help to avoid those mistakes while giving more detailed analysis for the doctors to look over.
youtube · AI Harm Incident · 2024-05-31T16:5… · ♥ 44
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_Ugydut7gRuUSpcDD7Qt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxcEpRQ-CZ0fyIktXp4AaABAg","responsibility":"user","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgxPiSiWj-O2QsuiQ_h4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugw2VMygv9EGzk0tgid4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwFAfYqHm4RowZey2t4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"fear"},
{"id":"ytc_UgxujPAAka2q7HOYER94AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgyXHrPKQw5ot92xnvR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwoRMU4neec6QGWJIl4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwCf7v0utqAApG2ekB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugz_GALp9O-msg41hIZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
```
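A raw response like the one above has to be parsed and checked before the codes are stored, since an LLM can emit malformed JSON or labels outside the codebook. The sketch below shows one way to do that; the allowed values per dimension are inferred only from the samples visible on this page (an assumption), so a real codebook may include more labels.

```python
import json

# Allowed labels per dimension, inferred from the sample response above.
# Assumption: the actual codebook may define additional labels.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "user", "company", "developer"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none", "regulate"},
    "emotion": {"indifference", "approval", "outrage", "fear", "mixed"},
}

def validate_coding(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed records.

    A record must carry an "id" and a valid label for every dimension;
    anything else is silently dropped so a bad record cannot poison the table.
    """
    records = json.loads(raw)
    valid = []
    for rec in records:
        if "id" not in rec:
            continue
        if all(rec.get(dim) in labels for dim, labels in ALLOWED.items()):
            valid.append(rec)
    return valid

# Hypothetical record for illustration (the id is made up):
raw = ('[{"id":"ytc_example","responsibility":"company",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]')
print(len(validate_coding(raw)))  # → 1
```

Dropping invalid records rather than raising keeps a batch of ten codings usable even if one entry is garbled; a stricter pipeline could instead log the rejects for re-prompting.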