Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
I got some a.i's to crack a bit , i do not think they are sentient, yet , i beli…
ytc_Ugy6vpaVe…
~1:14 Neil touched on this, but there's a big problem with alignment of interest…
ytc_UgxidZt7E…
Thank you for your comment! It's important to continuously improve and upgrade o…
ytr_UgyJAr77l…
Most dangerous man on the planet. If an ai take over happens I can guarantee it’…
ytc_UgxeBp2SY…
Also, use Version History in Google docs to show all your revisions. If OP actu…
rdc_kgqbtbh
I’ve heard that LLM companies are actually suppressing the rates of Chatbot psyc…
ytc_Ugzaa847C…
Thank you, Cash Investigation. Now the world knows how they exploit the pet…
ytc_UgzW3clf2…
I knew robots was taking over the world when the I Robot movie came out 😂…
ytc_UgyKuNVsy…
Comment
The future in action and it looks like it will be a great and bright and very happy one. There is too many people who think just because some movie showed them the robots killing humans that it will be the same in real life with the new AI robots that were shown in the video. I have to tell you that the movies were made to have you afraid of what could happen to make you want to see the next one that director made and help them make more money. The robots that are being made will help humanity, and will not be anything like it is shown in those action movies.
youtube
AI Harm Incident
2023-12-28T02:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[{"id":"ytc_UgwlWXqCvCfloYXT9JZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwEqoqPpr2DNn2fG1N4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgxcqaJCLtzt1iJOQMh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwPHVRbJ_eVBihodTN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgxbOce0JxHMQ-2Cih94AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_UgzPUp2LEZOBCax33_l4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugxe8VG_6PjonJ1Fxhd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyHftbS4jDqKwjvNOh4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgybM6Xy0MwsZh7yBJ94AaABAg","responsibility":"none","reasoning":"contractualist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxL21e0OYVw1QxZu514AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"}]
```
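The raw response is a JSON array with one object per comment ID, carrying the four coded dimensions shown in the table above. A minimal Python sketch for parsing and sanity-checking such a response might look like the following; the allowed value sets are inferred only from the sample shown here, and the full codebook may define additional codes:

```python
import json

# Allowed values per dimension, inferred from the sample response above;
# the full codebook may include codes not seen in this sample.
ALLOWED = {
    "responsibility": {"none", "developer", "ai_itself", "user"},
    "reasoning": {"consequentialist", "deontological", "virtue", "contractualist"},
    "policy": {"none", "unclear", "ban", "liability", "regulate"},
    "emotion": {"approval", "outrage", "indifference", "fear", "resignation"},
}

def validate_codings(raw: str) -> dict:
    """Parse a raw LLM response and index valid codings by comment ID."""
    rows = json.loads(raw)
    coded = {}
    for row in rows:
        cid = row.get("id")
        if not cid:
            raise ValueError(f"missing comment id in row: {row!r}")
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{cid}: unexpected {dim} value: {row.get(dim)!r}")
        coded[cid] = {dim: row[dim] for dim in ALLOWED}
    return coded

sample = (
    '[{"id":"ytc_UgwlWXqCvCfloYXT9JZ4AaABAg","responsibility":"none",'
    '"reasoning":"consequentialist","policy":"none","emotion":"approval"}]'
)
coded = validate_codings(sample)
```

Rejecting rows with out-of-vocabulary values, rather than silently keeping them, makes it easy to spot when the model drifts from the coding scheme.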