Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
I am an artist who fully disagrees with how most people are seeing AI.
Super sim…
ytc_UgxTuRsYd…
Sunita is going to ask AI for your address and come and get you when AI descides…
ytc_UgywqAV-R…
It's fun and all to think about consciousness with AI's. But I think right now i…
ytc_UgywwIuqN…
if we get ai that are at the state of (seems real) and we give them bodies to do…
ytc_UgyNmXQN5…
Like all tools, I think AI can have a positive impact. However, it's not the sam…
ytc_Ugy6Dnkkp…
Are we God for AI?
We are missing God on earth!
Is AI will also missing us.?…
ytc_Ugw9cAQMw…
This guy admitted hes trying to get robots to mimic human behaviors when the rob…
ytc_UgxHpyLVg…
They should take notes from Japanese and Chinese schools where they get taught h…
ytc_Ugy2s5q1P…
Comment
1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
2. A robot must obey the orders given to it by human beings, except where such orders would conflict with the First Law.
3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Laws.
youtube
2012-11-23T18:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | deontological |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgwbvkGhMJrrH_F2bu54AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgwfYxheMfLR3CgFwpx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugw4q5nJine-zAkNryF4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugxl8KKXeWiz-h5zeqR4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzO2rKI7_VIVHcU8gB4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzbfF5VibJWpZZh1CV4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgyIuBGp4K_iiQWyVlt4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwsfgKptJ7dhmqBh-14AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugy_Ug24xi6RFcKHHyB4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugwo3-nhtFZ1ZSCRzn14AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"}
]
```
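The raw response is a JSON array of records keyed by comment ID, which is what makes the "look up by comment ID" step above possible. A minimal sketch of that lookup, using two records copied from the response shown (the `by_id` index name is illustrative, not part of the tool):

```python
import json

# Two records copied verbatim from the raw LLM response above.
raw = '''[
  {"id": "ytc_Ugxl8KKXeWiz-h5zeqR4AaABAg", "responsibility": "ai_itself",
   "reasoning": "deontological", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgzO2rKI7_VIVHcU8gB4AaABAg", "responsibility": "user",
   "reasoning": "consequentialist", "policy": "none", "emotion": "approval"}
]'''

records = json.loads(raw)

# Index records by comment ID for constant-time lookup.
by_id = {rec["id"]: rec for rec in records}

# Fetch the coded dimensions for one comment.
rec = by_id["ytc_Ugxl8KKXeWiz-h5zeqR4AaABAg"]
print(rec["reasoning"])  # deontological
```

The printed record matches the "Coding Result" table above: the same four dimensions (responsibility, reasoning, policy, emotion) appear per comment ID.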