Raw LLM Responses
Inspect the exact model output for any coded comment. Look up a comment by its ID, or browse the random samples below.
Random samples — click to inspect
- "It's a fear of automation. It's only natural. Weird thing is though that it's in…" (ytr_UgyaqoaKS…)
- "This is what Moshi AI itself told me: https://www.youtube.com/watch?v=_UPDWqWAp4…" (ytc_UgyW7xgi7…)
- "I’ve found the best AI’s (sonnet 4.5 etc) is actually useful and doesn’t just ch…" (ytc_UgxpF70gh…)
- "Hope the entire AI bubble crashes like anything and all these companies who are …" (ytc_UgwVwfsNi…)
- "This reminds me of the Star Trek episode where they poisoned the Borg. Guerrilla…" (ytc_UgxTCfNe5…)
- "@CallmeBoBthe3rd artists used complained about photography until that became rec…" (ytr_UgyXYg3F7…)
- "No but for real this will lead to the collapse of civilization. Humans will pref…" (ytc_Ugxd9_mih…)
- "Thank you for illustrating the last point, comparing digital art to traditional …" (ytc_Ugy3I0eOn…)
Comment
Hasn't Tesla's autopilot already show us, even in the basic stages of autopilot, that it can see & respond to tricky situations/crashes/accidents WELL before a human driver is even AWARE that things are happening? Wouldn't it have seen the destabilization of the boxes many seconds before & started lowering its speed / figuring other options?
And wouldn't a self driving car also understand to keep a healthy distance between it and the vehicle in front of it? The video showed it trailing the truck too closely for proper driving safety.
youtube · AI Harm Incident · 2017-06-23T13:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | industry_self |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_Ugjp-pcf8PXx8HgCoAEC","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgiIjTXIJ5B6wHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgicjfJMB8sNk3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugghpi3fGQwm63gCoAEC","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UghvDWbrWZGpnngCoAEC","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugjd8L8jNpzFtHgCoAEC","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UghC4VdT2HTcDHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_Ugg4Bv06oKkXP3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UghZnL5q_KLZvHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UggPbviiDUEtwHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
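The raw response above is a JSON array with one coding object per comment, keyed by comment ID. A minimal sketch of how such an output could be parsed and indexed for the per-comment lookup shown in the table (variable names are illustrative, and only the first two entries of the array are reproduced here):

```python
import json

# Raw model output: a JSON array, one coding object per comment
# (truncated to two entries for the sketch).
raw_response = '''[
  {"id": "ytc_Ugjp-pcf8PXx8HgCoAEC", "responsibility": "user",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgiIjTXIJ5B6wHgCoAEC", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "industry_self", "emotion": "approval"}
]'''

# Index the codings by comment ID so any single comment's coded
# dimensions can be looked up directly.
codings = {entry["id"]: entry for entry in json.loads(raw_response)}

coding = codings["ytc_Ugjp-pcf8PXx8HgCoAEC"]
print(coding["emotion"])  # prints: outrage
```

In practice a parser like this would also want to validate that every returned `id` matches a comment that was actually sent in the batch, since models occasionally drop or mangle IDs.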