Raw LLM Responses
Inspect the exact model output for any coded comment, looked up by comment ID.
Random samples (truncated previews with their comment IDs):

- Providing AI with the traumatic reality of being conscious, this will be the rea… (ytc_UgyJbNLM6…)
- Oh, so we'll have more human interactions... with who? If 70% of the jobs are re… (ytc_Ugy2nqfVA…)
- If a car is self driving theres no reason to own it. The cost could be 150grand … (rdc_dbz4wnx)
- we are too an object of our perspective on things. the way we look on life, shap… (ytc_Ugw0_Wwgu…)
- Are you gonna cover the Larian studio situation of using AI for their conceptual… (ytr_UgyOkkYvd…)
- I wonder who wants to sacrifice their family member/s for automated trucks.Just … (ytc_Ugy0gR30P…)
- i don't believe that robot can become like a human it's a impossible because it'… (ytc_Ugxfe0pSF…)
- Ai is destroying the environment, so Ai slop bros are right about one thing: We … (ytc_Ugzg8MToI…)
Comment
Umm I honestly do think that robots are gonna take over the world. If you belive there's gonna be a zombie apocalypse (comment) if you belive theres gonna be a robot apocalypse (like)
👇🏽 👇🏽
Platform: youtube · Topic: AI Responsibility · Posted: 2024-07-14T04:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[{"id":"ytc_Ugw0KrwPBWoRnrX5vCV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwNQMaLELCV3kEqdfl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxBs-OJPp32jSw5yCx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"},
{"id":"ytc_Ugz8ZNWgNj6_zGb4owJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugw5dlBvhgbOtmE8VZB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw2HzMhsZ_POe8tYBl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxNWtyTETXD9ssUYR14AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyxcGe-j_Pguk58yu54AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwTABjsYDhhQEi6K4d4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugyyi6XtMftQ7EOoKaR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}]