Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a comment directly by its ID.
Random samples

- "Before I ask something in detail of lengthy I give the AI a persona relevant to …" (ytc_UgzKwGH48…)
- "They are not using the AI right. Chain of inquiry. Inductive and deductive reaso…" (ytc_Ugz4q_KMK…)
- "This was one of the replies in that article, it's an absolute gem. 'I paid for …" (ytc_Ugx9aRBei…)
- "Thanks for your questions! Sophia's appearance, including any sweating, is part …" (ytr_Ugz0zEzJv…)
- "Why convince a robot to work on the basis that it will receive pain if it does n…" (ytc_UgxkgXw-9…)
- "AI actually not the cause of layoffs... Belive me after some years this AI only …" (ytc_UgwkgX14O…)
- "At one point, we will be so much broken that we will have only choice to be one …" (ytc_Ugx83_GhI…)
- "In my (limited) experience, A.I. very often gives incorrect answers. Yet, people…" (ytc_Ugx9FtfdL…)
Comment

"I don't think we should take the idea that robots will make more mistakes than people or that the robots could be unsafely programmed as a fact. One of the criticisms of the robot Google cars was the exact same thing and the only crashes they have been involved with have been caused by human error on the part of other drivers. I think we should consider that the robots may actually net prevent more loss of life than human GI's"

Platform: youtube
Posted: 2012-11-24T05:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[{"id":"ytc_UgwPNJWWzSIH79l64vN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxSMMJM6rnm4N1egx94AaABAg","responsibility":"user","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgxiopH9Vfh_97idj6Z4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzBu3ZmFRC2yrWJXL94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgwT-c0C8STpeKHGiVp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzYxYtjK4BVvevNrtR4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzQ-JR8vpyZMutie7l4AaABAg","responsibility":"user","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugz9wOEwP-AlP5qNF1B4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgxlphsdKc_WmTZS0IB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxV5UloosgFOLrnVll4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"})
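Note that the raw response above closes its JSON array with a stray `)` instead of `]`, so it does not parse, which is consistent with every dimension in the coding result reading "unclear". As a minimal sketch of how such a response might be resolved into the table, here is a hypothetical `code_comment` helper (an assumption for illustration, not the pipeline's actual code) that falls back to "unclear" on every dimension when the response fails to parse or the comment ID is absent:

```python
import json

# The four coded dimensions shown in the result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")


def code_comment(raw_response: str, comment_id: str) -> dict:
    """Extract the coded dimensions for one comment from a raw LLM response.

    Returns "unclear" for every dimension when the response is not valid
    JSON (e.g. a stray ')' closing the array) or the comment ID is missing.
    """
    try:
        records = json.loads(raw_response)
    except json.JSONDecodeError:
        records = []  # malformed output: nothing can be attributed
    for record in records:
        if record.get("id") == comment_id:
            return {dim: record.get(dim, "unclear") for dim in DIMENSIONS}
    return {dim: "unclear" for dim in DIMENSIONS}


# A well-formed response resolves normally (hypothetical ID):
good = ('[{"id":"ytc_X","responsibility":"user","reasoning":"deontological",'
        '"policy":"ban","emotion":"outrage"}]')
# Replacing the closing ']' with ')' breaks parsing, so everything is "unclear":
bad = good[:-1] + ')'
```

The fallback, rather than an exception, is one plausible way a parse failure would surface as a row of "unclear" values in the coding table instead of an error.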