Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- `rdc_fwh05bo`: "honestly the UK should have just done what we have done in sweden a lot of the t…"
- `ytc_Ugw7zbq3B…`: "Does anyone remember the movie The Forbin Project. That is basically the future…"
- `ytc_UghCffqvR…`: "A self driving car would not be driving close enough to not be able to stop.…"
- `ytc_UgwMtbhrx…`: "These autonomous weapons should be able to be camouflaged in the battlefield in …"
- `ytc_UgxCK_U5q…`: "A prime example is the man who lived inside an iron lung who drew since he was s…"
- `ytc_Ugx-6UNKC…`: "Population decline and aging population. Yes they really need Ai and robot to fu…"
- `ytc_Ugya4fmIv…`: "4:26 Even if the worker does do the work in managing the AI centers, all that ex…"
- `ytc_UgwijeJLw…`: "also on the deepfake pov of things im a #GenX so I know its FAKE and see it a su…"
Comment (quoted verbatim)

> A.I is all about mimicking human behavior. We should be afraid not when A.I will not want to be deleted rather we should be alarmed when a day will come when a software will understand what is it existence and will not be able to bear it's existence, then nothing that humans will try to do will not stop our software we really on from stop computing.

Source: youtube · AI Moral Status · 2025-08-25T08:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
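The table above records one value per coding dimension. As a sanity check, a coded record can be validated against the value sets that appear elsewhere on this page. This is a minimal sketch: the allowed-value lists below are assumptions inferred from the samples shown here, not the project's official codebook, and `invalid_fields` is a hypothetical helper name.

```python
# Allowed values per dimension, inferred from the coded samples on this
# page (an assumption, not the authoritative codebook).
ALLOWED = {
    "responsibility": {"developer", "company", "user", "ai_itself", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "liability", "ban", "industry_self", "none", "unclear"},
    "emotion": {"fear", "outrage", "indifference", "mixed", "unclear"},
}

def invalid_fields(record: dict) -> list:
    """Return the dimension names whose coded value is missing or outside
    the allowed set for that dimension."""
    return [dim for dim, values in ALLOWED.items()
            if record.get(dim) not in values]

# The coding result shown above, as a record.
record = {"responsibility": "ai_itself", "reasoning": "consequentialist",
          "policy": "unclear", "emotion": "fear"}
print(invalid_fields(record))  # → []
```

A record with an unknown value (or a missing dimension) would surface in the returned list, which makes drift in the model's output easy to catch before it reaches the database.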
Raw LLM Response
```json
[
  {"id":"ytc_UgzpJM16cXJj7RdyXZF4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugz-ibpMyHjVvQQO3q94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgyYMElQFK6FE36adpF4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
  {"id":"ytc_UgyJUJBKo0y7BTUDMSp4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugz5RyHpWlofN_hPC5F4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgztGHFa-aWRqMLBHox4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugxbm1908YYWU7gAFAp4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
  {"id":"ytc_UgwKA_eEdDGVVN0c9fZ4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugw8IKq9MY13R9kK3hF4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgyWeeS7pNNRc7Jvq6N4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"}
]
```
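The "look up by comment ID" step above can be sketched as parsing this raw response (a JSON array of coded records) and indexing it by `id`. The field names and ID format match the response shown; `index_by_id` is a hypothetical helper name, and the two-record payload below is a trimmed excerpt for illustration.

```python
import json

# A trimmed excerpt of the raw LLM response shown above.
raw_response = '''[
  {"id":"ytc_Ugz-ibpMyHjVvQQO3q94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgztGHFa-aWRqMLBHox4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"}
]'''

def index_by_id(payload: str) -> dict:
    """Parse the model output and key each coded record by its comment ID,
    so a single comment's coding can be retrieved in O(1)."""
    return {record["id"]: record for record in json.loads(payload)}

coded = index_by_id(raw_response)
print(coded["ytc_Ugz-ibpMyHjVvQQO3q94AaABAg"]["emotion"])  # → fear
```

Because the model returns one JSON object per comment, a dictionary keyed by `id` is enough to drive both the per-comment inspection view and the ID lookup box.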