Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:

- "The thing about just slowing down and regulating ai is u can't... ai is just lin…" (ytc_UgwFaixwC…)
- "There will be mass ai hacking just you wait and see only the internet computer p…" (ytc_Ugx2z03sW…)
- "Automation has been taking jobs for centuries. I believe one of the Queens of E…" (ytc_Ugy1CoKfi…)
- "At first I thought id already watched this video but then i realised it was ANOT…" (ytc_UgzZWT68d…)
- "It’s crazy how people type in a few words and get art like this, then lie and sa…" (ytc_Ugx-3laKm…)
- "No jobs mean no consumers and no taxes. This is the problem in Africa and the re…" (ytc_UgxDHkhw5…)
- "This is a recurring question on these kinds of posts and just kind of ignores th…" (rdc_mral524)
- "@RR_reunificationRights another robot: \"I can see Mr.Cotton's Address right her…" (ytr_UgwEut0s9…)
Selected comment (source: youtube; topic: AI Moral Status; posted 2025-07-09T18:2…):

> @nextgenai-m2b I believe that what is created for good intent can also be replicated by those with ill-intent. The internet is a prime example. It is something that needs safety standards put in place so we also don't create situations such as the movie iRobot, not just referencing The Terminator. Humans are flawed at best, and so will our creations be. Nevertheless, inventions have achieved great things! Ai should be explored and learned about. Learning should always be pursued. We just need to make sure it is controlled so we don't invite trouble, or that it makes it hard for trouble to come of it.
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytr_UgzsK41_qtIW7gwOsmN4AaABAg.AKL6uUujWDTAKMR-mxA9Dk","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytr_UgzcwWq-SjZnyEO6dOh4AaABAg.AK6iLSm7btHAKMv2IRe5-G","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytr_UgzcwWq-SjZnyEO6dOh4AaABAg.AK6iLSm7btHALjtTRxuOKl","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
  {"id":"ytr_Ugzvvt6XfTofa4WIuat4AaABAg.AJwVsPYic_bAK1MrgheFl7","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytr_UgyR6z53fD_QVjawcWl4AaABAg.AJgcYuwoistAK1NEhkFFYA","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytr_UgwT3QTwN_ajO6udkhF4AaABAg.9tEW6r6NKHI9tEWFFxQvGU","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytr_Ugzx0VIXRkoQWIbFvzF4AaABAg.9rohkKCbtIn9rpWVOLtjth","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"},
  {"id":"ytr_Ugypa-kLF9uJQ46xfgR4AaABAg.9rnwUEc0Rrd9rqPdJyWb-A","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytr_UgztIuveZr2z9_S4wap4AaABAg.9rnjC_jLEY_9rvvcLokV8A","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"resignation"},
  {"id":"ytr_UgzbR6ZYGqSuQyYhBe54AaABAg.9rn3EJfLhGR9rxAJ7ibdiv","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
```
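Downstream code can consume a batch like the one above by parsing the JSON array and checking each record against the codebook before ingesting it. A minimal sketch follows; the allowed value sets are assumptions inferred from the values visible in this page, not the tool's actual codebook, and `parse_batch` is a hypothetical helper name.

```python
import json

# Assumed allowed values per coding dimension, inferred from the output
# shown above; the real codebook may define more categories.
ALLOWED = {
    "responsibility": {"developer", "company", "user", "ai_itself", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "indifference", "approval",
                "resignation", "mixed", "unclear"},
}

def parse_batch(raw: str) -> list[dict]:
    """Parse one raw LLM batch response and validate every coded record."""
    records = json.loads(raw)
    for rec in records:
        if "id" not in rec:
            raise ValueError(f"record missing comment id: {rec!r}")
        for dim, allowed in ALLOWED.items():
            value = rec.get(dim)
            if value not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim!r} value {value!r}")
    return records

# One record copied from the batch above, used as a smoke test.
raw = ('[{"id":"ytr_UgwT3QTwN_ajO6udkhF4AaABAg.9tEW6r6NKHI9tEWFFxQvGU",'
       '"responsibility":"user","reasoning":"deontological",'
       '"policy":"liability","emotion":"outrage"}]')
coded = parse_batch(raw)
print(coded[0]["emotion"])  # outrage
```

Validating up front keeps a single malformed or off-codebook model response from silently corrupting the coded dataset.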