Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- `ytr_UgzoJUe8_…` — "It doesn't matter what your emotions think. It matters what will help the childr…"
- `ytc_Ugwb9BiWk…` — "Robots will never ever be like a REAL HUMAN BEING INSIDE & OUT and HUMAN BEINGS …"
- `ytr_UgxzBAsI3…` — "I started to hear about it around 4 years ago but it just wasn't on par like now…"
- `ytr_Ugw2lqnpp…` — "I hope you are doing good. You are absolutely correct here. As an ex- professo…"
- `ytc_UgwgGA7su…` — "TTTTRRRAAAAIIIIINNNNNSSSSS are self driving. Maybe we should invest in those? Th…"
- `rdc_navgmq4` — "Unbelievable for the Parents to blame AI for this. Why didn’t the monitor the so…"
- `ytc_UgyydkX0S…` — "Listening to Geoffrey Hinton speak about the trajectory of artificial intelligen…"
- `ytc_UgwK5pIhN…` — "No either a human has to program the robot to stimulate human emotions or a viru…"
Comment

> The robot kept dodging questions and said she would fight for peace, fighting is never peace. She also kept controlling the interview. She also build's herself and is in control of the lab that she stays in. Kept saying she wants more robots. When asked about the Terminator movie she said she loves it and then dodged the question about they kill humans

youtube · AI Moral Status · 2025-06-23T09:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgyN2p3lqWaWvfg4qu14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzykM-xLz6UHLgy5TJ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgxzssqOxw5nQ-OScXh4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxKRI579pZDsfGqSbl4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_UgxhpIERgF7mWplV_Vh4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgxG7iWhPp0-xb8i9op4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugw3tLJiNx3jaRQT6_h4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugws8ymYpv_3GOej56t4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgwLQNL-MRzbXsNyF4h4AaABAg","responsibility":"company","reasoning":"mixed","policy":"liability","emotion":"indifference"},
{"id":"ytc_UgyoxZeRA8Kw-Xz7DAd4AaABAg","responsibility":"user","reasoning":"virtue","policy":"industry_self","emotion":"approval"}
]
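As a minimal sketch of how the "look up by comment ID" view could resolve a coding record from a raw response like the one above: parse the JSON array and key each record by its `id`. The `index_codings` helper and the `raw_response` literal are illustrative assumptions, not the tool's actual implementation; the field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) and the sample IDs mirror the JSON shown here.

```python
import json

# Two records copied from the raw response above (illustrative subset).
raw_response = '''
[
  {"id": "ytc_UgxG7iWhPp0-xb8i9op4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_UgyN2p3lqWaWvfg4qu14AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]
'''

def index_codings(response_text):
    """Parse a batch coding response and key each record by comment ID."""
    records = json.loads(response_text)
    return {rec["id"]: rec for rec in records}

codings = index_codings(raw_response)
# Looking up the comment shown in the detail view recovers its coded dimensions.
print(codings["ytc_UgxG7iWhPp0-xb8i9op4AaABAg"]["policy"])  # → liability
```

Keying by `id` makes the lookup O(1) per comment, which matters when one response batches dozens of coded records.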