Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a record by its comment ID.
Random samples — click to inspect
- "Great point sobering ai for projects that solve human conundrum should be a prio…" (ytc_UgxvA5oY2…)
- "I broke the filter once, I’m the innocent one.. it was on a Lucifer Ai where you…" (ytc_Ugw8q8f7m…)
- "No man, I please don't ban AI / robotics weaponry! I want to live in the Termina…" (rdc_cths9iq)
- "Bro that was clearly an error. Listen to the response ‘ ok, i will destroy human…" (ytc_UgzN5urTI…)
- "Can't blame them. This was actually a stupid decision on Biden's half cutting t…" (rdc_mfg3v0w)
- "Question: How can an AI mimic real emotion in turn mimic empathy? Any human unmo…" (ytc_UgyBGWubo…)
- "Well maybe you all people need to accept, that we need good ai in future. If mor…" (ytc_UgyqCQMvK…)
- "@GWT-qt by this logic, since you mentioned logic, say you have an established a…" (ytr_UgxoJqBGi…)
Comment

> Very few people or companies talk about ethics in AI. You are still accountable for the code you are providing whether you wrote it by your hand, copy pasted from stackoverflow or generated by gen AI.

Source: youtube · Posted: 2026-03-30T09:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | liability |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugwh25fkOYdHAzIDXIt4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"approval"},
  {"id":"ytc_Ugwg1jD6jimDJPKdh3l4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugzb79kPB2oId1lP8Ox4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwyHYyiaUy0IALWtDN4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxVm3sOErQK88taeft4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ugx4EpxI7G1QxH0vsgh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgyeuWAGl7fcxLjn_Id4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzqvN1YYtxBJmBryJ14AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"indifference"},
  {"id":"ytc_Ugw3sQKpCJmriKdU0iB4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyFYNQRdU_wKAl-gFl4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"}
]
```
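The lookup-by-comment-ID workflow described above can be sketched in a few lines: parse the raw batch response (a JSON array of coded records, one per comment) and index it by the `id` field. This is a minimal illustration, not the dashboard's actual implementation; `index_by_id` is a hypothetical helper, and the sample payload below reuses two records from the raw response shown above.

```python
import json

# Hypothetical raw batch response: a JSON array of coded records,
# one per comment (shape taken from the dump above).
raw_response = """
[
  {"id": "ytc_Ugwh25fkOYdHAzIDXIt4AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "liability", "emotion": "approval"},
  {"id": "ytc_Ugwg1jD6jimDJPKdh3l4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"}
]
"""

def index_by_id(payload: str) -> dict:
    """Parse a raw batch response and index its records by comment ID."""
    records = json.loads(payload)
    return {rec["id"]: rec for rec in records}

# Look up the coded dimensions for a single comment by its ID.
codes = index_by_id(raw_response)
print(codes["ytc_Ugwh25fkOYdHAzIDXIt4AaABAg"]["policy"])  # prints "liability"
```

Indexing once into a dict makes each subsequent ID lookup O(1), which matters when the same raw response is inspected for many different comments.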