Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a coding by its comment ID, or inspect one of the random samples below.
- "But AI is built in a way that it needs HUMAN INPUT to do something. A request or…" (ytc_UgzUXMkSB…)
- "I remember when people saying digital art doesnt make you an artist. Movies have…" (ytr_Ugy-KEbRb…)
- "After my recent conversations with Gemini, I absolutely understand the answers. …" (ytc_Ugwj_Rn2c…)
- "0:26 if AI fails, many people will lose their jobs. So what? Those people were o…" (ytc_UgxMhERf6…)
- "ai is not supposed to generate art and videos bro, they are supposed to be like …" (ytc_UgzicM-e8…)
- "I support ai . ITS cheaper and less problem than working with artist. Ai Will t…" (ytc_Ugy-0tD2f…)
- "work daily document: rob and dave was a robot that are working in the factory, …" (ytc_Ugx94JLqs…)
- "i love how the ai bros comment a lot on the videos who say its bad, but when the…" (ytc_UgzmVq5_H…)
Comment
Author: Gh0st G0D
> Minors know I-Robot, and the elders still love SARAH CONNOR!!! ;)
> Just kidding, it's the plot. In Terminator Skynet was a self teaching system who realized that earth will be a better place without humanity. In I robot a usual robot was abused as assassin. I don't wanna spoil the movie so.. primary it was samys function.

Source: youtube · Video: AI Moral Status · Posted: 2017-09-10T12:4… · ♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytr_UgjRAdI8CBX503gCoAEC.8RYbZRMVV388TmJdfB7ljb","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytr_UggAfP70fulRKngCoAEC.8RPWKyaMVSz8XKbAmV0XCx","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytr_Ugjq1LHczAfLQXgCoAEC.8ROJRwP4TCO8RUM9JsqJZd","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
{"id":"ytr_Ugjq1LHczAfLQXgCoAEC.8ROJRwP4TCO8Rn-k8Xkol8","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
{"id":"ytr_Ugjq1LHczAfLQXgCoAEC.8ROJRwP4TCO8Rp-HE3HUI3","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytr_UgiqO9WO8_dT53gCoAEC.8RLiNk-a3Sz8RWUUtZLxO7","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytr_UgiqO9WO8_dT53gCoAEC.8RLiNk-a3Sz8RYYmdBWDZF","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"approval"},
{"id":"ytr_UggdqnahcVPoZngCoAEC.8QhjYzvmuzV8REdhftyO_7","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytr_UgzrE6a7g5XiyUy0DId4AaABAg.8QfyH6h_Eri8QxAOVMTLVE","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"mixed"},
{"id":"ytr_UgwDIAtXVMIeXnpBj4x4AaABAg.8Qfc-DHQD6Z8RO4mG0b91U","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"}
]
```
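The raw response is a JSON array of coding records, one per comment, keyed by `id`. A minimal sketch of how the ID lookup above could work, assuming only the field names visible in the response (the `lookup` helper name is hypothetical; the array here is truncated to two of the records shown for brevity):

```python
import json

# Two records copied from the raw LLM response above.
raw = '''
[
 {"id":"ytr_UggAfP70fulRKngCoAEC.8RPWKyaMVSz8XKbAmV0XCx","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
 {"id":"ytr_UgiqO9WO8_dT53gCoAEC.8RLiNk-a3Sz8RWUUtZLxO7","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"outrage"}
]
'''

def lookup(records, comment_id):
    """Return the coding record matching a comment ID, or None if absent."""
    return next((r for r in records if r["id"] == comment_id), None)

records = json.loads(raw)
coding = lookup(records, "ytr_UggAfP70fulRKngCoAEC.8RPWKyaMVSz8XKbAmV0XCx")
print(coding["reasoning"])  # consequentialist
```

The linear scan is fine for a batch of ten records; for the full dataset a dict mapping `id` to record would make the lookup constant-time.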