Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (click any sample to inspect it)
You hear that AI will take away 50% - 99% of all jobs. It’s true that automation…
ytc_Ugxld-BCg…
To be honest ai tech ceo need to stop promoting so many things ai can do. Is tru…
ytc_UgwMm7bBX…
Not disabled, but I don’t understand how can one see an ai fart as an art tool f…
ytc_UgxS7IXlt…
Did you just hear what that robot said that she's going to dominate our race I k…
ytc_Ugx6Uhu1P…
@BirdsofAccordA problem is that technology changes so fast now its hard to stee…
ytr_UgyyzMz_E…
Why not just find out where the ai chips are being built and you know ... we all…
ytc_UgxNrSCVv…
This sci-fi AI vision is pushed too fast too far from being a scientific reality…
ytc_UgxBCREkQ…
+Hannibus 42
Well thank you for this productive and smart comment! I am sure t…
ytr_UgzcjxsTW…
Comment
I dont think i agree with your later assesment. Yes, ppl do stupid things. I remember hearing abt a woman who drank a bunch of soy sauce to detoxify bcuz of someone writing a made-up Article online long b4 A.I. that being said, the onus of responsibility does not rest slowly with one party. If i design a product that has a lethal design flaw if just the right person accesses in that way, that is still my responsibility. Mr. Sam saltman literally considers those who x to it as 'testing' the software to improve it & billionaires who see human life as disposable & a means to an end are the last ppl who should be at the helm of that kind of tech.
youtube
AI Harm Incident
2025-12-21T09:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | mixed |
| Policy | liability |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_UgxszPO2B-l6oyu6sdp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyliISL1WW5q6jhTuF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzUPFVgqH-rteUWzc54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugz2_lU-Shkm57zx6rx4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugyc5PNEYwKVnO0J4qp4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzHkHlN1WjMPZ52HhV4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxYAn-0DeTUlbZAzpF4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzNdV6Ud8AWTaRCpvN4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"liability","emotion":"mixed"},
{"id":"ytc_UgyP7s11q_AdjufX9RZ4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_Ugw9R337HLZUIigkVPx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"}
]
```
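For programmatic inspection, the raw response is a JSON array with one object per comment, carrying the four coded dimensions. A minimal sketch of parsing it and looking up a coding by comment ID (the field names follow the response above; the two rows shown are copied from it, and the indexing helper is illustrative, not part of the tool):

```python
import json

# Raw LLM response: a JSON array of per-comment codings.
# These two rows are taken verbatim from the batch above.
raw = '''[
  {"id": "ytc_UgzNdV6Ud8AWTaRCpvN4AaABAg", "responsibility": "distributed",
   "reasoning": "mixed", "policy": "liability", "emotion": "mixed"},
  {"id": "ytc_UgxYAn-0DeTUlbZAzpF4AaABAg", "responsibility": "user",
   "reasoning": "deontological", "policy": "none", "emotion": "outrage"}
]'''

# Index the codings by comment ID for constant-time lookup.
codings = {row["id"]: row for row in json.loads(raw)}

# Look up the dimensions coded for one comment.
row = codings["ytc_UgzNdV6Ud8AWTaRCpvN4AaABAg"]
print(row["responsibility"])  # distributed
print(row["policy"])          # liability
```

The same dictionary can back a "look up by comment ID" search box: a missing ID simply raises `KeyError` (or returns `None` via `codings.get(comment_id)`).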