Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
Right? Sirpyes' art is adorable and it has such a cute charm to it that AI can't…
ytr_Ugw3WW3PP…
Just say that future drones would use facial recognition software to acquire tar…
ytc_UgwbvkGhM…
Search eg maze
What could go wrong
Eg bidirectional causality
Soln memory of …
ytc_UgzyKVswO…
Absolutely delete your account after exporting the data. Once you have exported …
rdc_o7xe5o0
You only have AI put people out of jobs when the humans stop wanting to do the j…
ytc_UgyY7nBKp…
All those who are here in the comments. Thank you.... I'm glad you all are wakin…
ytc_UgzAjzhP9…
@Aubreykun 1. nope. the Humans learns concepts while the ai denoises an image. …
ytr_Ugw-oXnv6…
I'm a developer who worked on big projects on enterprise companies for most of m…
ytc_UgzWRlDj-…
Comment
So the A.I. killed a human, in self defense, and tried to get away with it, to save its own life... That sounds like the right thing to do...
youtube
AI Harm Incident
2025-07-26T00:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | deontological |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwLl7k5KIo1GWqYmPN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzilMHBtAM1v95ykKF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugwkd4r0xRW8kbkCIyN4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugwgaa9KNdrItNIcCbF4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwAIXo120oJIJ4Q_0B4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgzHURxaTi6JLt4yDDh4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwmSMJKgn34KLZEG6p4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyZsUv6OkQY_tfdstl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyFtIOUQqK4n9V4PmJ4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugxt2o7opW8gdwpZyHV4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"unclear"}
]
```
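The coding result table above is one row of the batch response shown here: the model returns one JSON object per comment, keyed by comment ID, with the four coding dimensions (responsibility, reasoning, policy, emotion) as fields. A minimal sketch of the "look up by comment ID" step, assuming only the field names visible in this export (the two records are copied from the response above; the full vocabularies for each dimension are not assumed):

```python
import json

# Two records copied verbatim from the raw LLM response above.
raw = """
[
  {"id": "ytc_UgwLl7k5KIo1GWqYmPN4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_Ugxt2o7opW8gdwpZyHV4AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "regulate", "emotion": "unclear"}
]
"""

# Parse the batch response and index it by comment ID so a single
# comment's coding can be retrieved, mirroring the page's lookup box.
codings = {row["id"]: row for row in json.loads(raw)}

row = codings["ytc_Ugxt2o7opW8gdwpZyHV4AaABAg"]
print(row["responsibility"], row["policy"])  # developer regulate
```

In practice the same indexing works for the full array: each ID appears once per batch, so a dict lookup recovers any coded comment in constant time.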