Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:
- "There's a weird correlation to people supporting AI art and being a bad person. …" (ytc_UgzeFtqQF…)
- "I feel like nobody is talking about this, but it's pretty obvious AI art is doom…" (ytc_UgwOqaSFP…)
- "Fox news upon AI existing for more than 1 second and posing a 0.0000000000000000…" (ytc_UgzyCAGAE…)
- "Ok but what's with all the glitches on this video? Quite fitting as John is talk…" (ytc_Ugz6n-1tA…)
- "Yep, they need to watch Terminator, The Matrix and iRobot! It’s a game right now…" (ytr_UgzXs9l2H…)
- "@ImMcGlovin ah yes... those statistics which definitely havent been edited to ma…" (ytr_UgwyhBBPU…)
- "Actually, with AI, that Jaws analogy is possible. As we know, AI can still be ba…" (ytc_UgzChtP7v…)
- "you gonna phuk around and find out that because everyone has the same access to …" (ytc_Ugw61Gjiz…)
Comment
Speaking from experience, you try to hit no one.
This is the problem with these philosophical polychotomies; they present you with false scenarios while demanding you pick one to be true.
When you're slamming the brakes and swerving to your left or right, no one in their right mind would think it's OK to hit the car next to you.
To really make this a moot question, the car should be programmed not to follow so close, just as human drivers are trained to give large trucks more space, while in this example the automated car was tailgating.
Modern philosophy... ehh.
Source: youtube · AI Harm Incident · 2015-12-12T01:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | virtue |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgjY5ZbRHpZbl3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugg6vkyHWXADQngCoAEC","responsibility":"none","reasoning":"deontological","policy":"regulate","emotion":"indifference"},
{"id":"ytc_UgiE3qm0bdtqengCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UggVGd5tRkaKZHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UggsLujeKwbCNngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"},
{"id":"ytc_Ughj07npbLjXPngCoAEC","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UggP4ePx319A6ngCoAEC","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugj-9FzhtV_B43gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UghfJlHACEBRgHgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_Uggx47tC_oo6mXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"approval"}
]
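The batch above is a JSON array of per-comment codings keyed by comment ID. A minimal sketch of how such a response could be parsed and validated in Python follows; the category vocabularies are inferred only from the rows and the coding table shown here, so the real codebook may include more values, and `validate_batch` is a hypothetical helper, not part of any tool referenced on this page.

```python
import json

# Allowed values per dimension, inferred from the responses above;
# the actual codebook may define additional categories (an assumption).
CODEBOOK = {
    "responsibility": {"none", "company", "developer"},
    "reasoning": {"consequentialist", "deontological", "virtue"},
    "policy": {"none", "regulate", "liability"},
    "emotion": {"resignation", "indifference", "approval", "outrage", "mixed", "fear"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM batch response, keeping only well-formed rows."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        # Each row must be an object with an "id" field...
        if not isinstance(row, dict) or "id" not in row:
            continue
        # ...and every dimension must carry a known codebook value.
        if all(row.get(dim) in allowed for dim, allowed in CODEBOOK.items()):
            valid.append(row)
    return valid

# Illustrative input: the second row uses an out-of-vocabulary value.
raw = (
    '[{"id":"ytc_x","responsibility":"none","reasoning":"virtue",'
    '"policy":"none","emotion":"mixed"},'
    '{"id":"ytc_y","responsibility":"alien","reasoning":"virtue",'
    '"policy":"none","emotion":"mixed"}]'
)
print(len(validate_batch(raw)))  # prints 1: the "alien" row is rejected
```

Filtering rather than raising keeps a single malformed row from discarding the whole batch; rejected rows could instead be queued for re-coding.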