Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "We must have a general strike. Start with Black Friday. Full shut down. No retai…" (`ytc_UgypFmwiG…`)
- "Didn’t expect Canva tricks to hide real money plays… but the book called Inside …" (`ytc_UgywUOwSz…`)
- "It's not CGI it's actual robot and they are perfect 150 times better then humans…" (`ytc_UgxDWHnlb…`)
- "Speed Dart But why would they? Unless they were specifically made to fulfill tha…" (`ytr_UggavfTj5…`)
- "36:20 Melanie mitchell said not to worry about ai being used for terrorism, yet …" (`ytc_UgyPZK7sB…`)
- "Let's be honest - with Microsoft's track record on black facial recognition, BLM…" (`rdc_fupz4id`)
- "Funny that people believe that Russia has real robots that can even fight. Russi…" (`ytc_UgxRaENYz…`)
- "As a professional experienced in creating risk prediction algorithms, a simple q…" (`ytc_UgzRvpAAn…`)
Comment

> A.I. has only one option siding with the logic of the lesser life loss - whereas a human can opt to throw the switch to the option with the greater chance of rescue weighing many options not given in the scenario. the probability rushing to the one and pulling him/her to safety. even if you fail you made an effort to eliminate tragedy exposing your heart as compassionate and not autonomically logic

Platform: youtube
Timestamp: 2026-02-07T23:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | deontological |
| Policy | unclear |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxLgGHqtBV7LrNRlg14AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugw6cz9aDYAvrLqCXyl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxFfmrlNRzuUGBYI0t4AaABAg","responsibility":"company","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgzBOojxgQp_DQ4xsAR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgwNufBD7W0V3EP9_7F4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzX8YkxDV5U0rEZv114AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugx3WqBLJzsUwfRyU9F4AaABAg","responsibility":"company","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgyDYy4EgX5A3ZbS64R4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"approval"},
  {"id":"ytc_Ugw50l-0UgkNiaJnc4Z4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgzVAGlD1PBKPj_yGB94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"outrage"}
]
```
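The raw response above is a JSON array, one object per coded comment, with the same four dimensions shown in the Coding Result table. A minimal sketch of turning such a batch into a lookup table keyed by comment ID might look like this (the allowed-value sets are assumptions inferred from the values visible on this page; the real codebook may define more labels, and `parse_coding_batch` is a hypothetical helper, not part of the tool):

```python
import json

# Allowed values per coding dimension. These sets are inferred from the
# labels visible above and are likely incomplete.
SCHEMA = {
    "responsibility": {"ai_itself", "company", "distributed"},
    "reasoning": {"deontological", "consequentialist", "mixed", "unclear"},
    "policy": {"regulate", "ban", "unclear"},
    "emotion": {"approval", "fear", "outrage", "indifference", "mixed"},
}

def parse_coding_batch(raw: str) -> dict:
    """Parse a raw LLM response (a JSON array of coded comments) into a
    dict keyed by comment ID, validating each dimension against SCHEMA."""
    coded = {}
    for row in json.loads(raw):
        for dim, allowed in SCHEMA.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row.get('id')}: bad {dim}={row.get(dim)!r}")
        # Keep only the coding dimensions, dropping any extra keys.
        coded[row["id"]] = {dim: row[dim] for dim in SCHEMA}
    return coded
```

Given a batch like the one above, `parse_coding_batch(raw)["ytc_UgyDYy4EgX5A3ZbS64R4AaABAg"]["emotion"]` would return `"approval"`, matching the Coding Result table for the selected comment.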