Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- "Hi @mrEC, thanks for your comment! I appreciate your concern about letting his g…" (ytr_UgwyOLbRz…)
- "I don't believe they should stop it before it reaches its limit. stopping it is …" (ytc_Ugw-06Sa1…)
- "Just like in the movie, someone heading up "predictive policing" is actually the…" (ytr_UgwJxZ-VC…)
- "@alko_xo no, did you know a self driving car ran over a women and then dragged h…" (ytr_UgzTtgXSN…)
- "Haha, that would be quite a sight to see! Who knows what the future holds with A…" (ytr_UgywoO4KD…)
- "More to note, he mentioned auto pilot, which is the one used on highways, while …" (ytr_UgyPDXJwT…)
- "I code with all best models and they all have the same "personality". In essence…" (ytc_UgwI7vyR4…)
- "I know ibisPaintX (mobile ap) has an AI disturbance mode that you can use on you…" (ytc_Ugw0IAQwT…)
Comment

> I thought world wars of AI would start at about 50 years but after hearing father of AI again ,this world war of AI will start within 10-15 years at max. I am perusing my studies of AI and machine learning and it's quite frightning while thing of rouge AI vs the AI with humanity. And there will not be any religions in next 20 years. Just united humanity versus the rouge AI. We are about to see wars of AI. Hope my studies will help humanity.

| Field | Value |
|---|---|
| Platform | youtube |
| Topic | AI Governance |
| Posted | 2025-08-05T17:3… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugw02b6PRCbInE9jwLt4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgwDVRni90FjmBjBNWZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwD3cUB5kq4UofC1it4AaABAg","responsibility":"company","reasoning":"virtue","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgziXJIYjupUIGPpAX14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzQrGuN0Yas3-JP2P14AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgxEY0gH0MNufBkQGn94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugwv0NJsBl0XFfYSmb54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyIhSslAV0Oyp5QEPt4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgzENXED8EwYWeB0iXt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxMoOtGMiqSJgrGuY14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
```
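A raw response like the one above can be checked programmatically before the codes are stored. The sketch below parses the JSON array and validates each record against the four coding dimensions; the allowed value sets are inferred only from the samples shown on this page (the full codebook may define more values), and the `validate_coding` helper name is hypothetical, not part of the tool.

```python
import json

# Allowed values per coding dimension, inferred from the samples above.
# Assumption: the real codebook may include additional values.
SCHEMA = {
    "responsibility": {"ai_itself", "company", "developer", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"ban", "regulate", "liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "resignation", "indifference", "mixed"},
}

def validate_coding(raw: str) -> list[dict]:
    """Parse a raw LLM response and check every record against the schema."""
    records = json.loads(raw)
    for rec in records:
        # Comment IDs in this dataset start with ytc_ (comment) or ytr_ (reply).
        if not rec.get("id", "").startswith(("ytc_", "ytr_")):
            raise ValueError(f"unexpected comment id: {rec.get('id')!r}")
        for dim, allowed in SCHEMA.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim} value {rec.get(dim)!r}")
    return records

raw = ('[{"id":"ytc_Ugw02b6PRCbInE9jwLt4AaABAg","responsibility":"ai_itself",'
       '"reasoning":"deontological","policy":"ban","emotion":"outrage"}]')
records = validate_coding(raw)
print(len(records), records[0]["policy"])  # 1 ban
```

Rejecting malformed records at parse time keeps a single bad generation from silently corrupting the coded dataset.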