Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
I spoke with an AI twice and I didn't like it. I know that I am quite far more …
ytc_UgyhCVy1I…
AI, like other inventions, can be good for mankind. It is those who use it for e…
ytc_Ugygo-AzI…
The day robots rule the world, they are going to dismantle us just as we dis…
ytc_Ugz8Lfgst…
I agree so much with this. The base expectations for all of my friends and me in j…
ytc_UgyhMHY4C…
Just 1 of my ai answers of the many agendas: "Merge Human + Machine: Not to expa…
ytc_UgxBaP3kY…
Well if you can tell chatgpt to say your sponsor you can let her say anything thi…
ytc_UgyoR8AAR…
If the USA with a smaller labor force is worried about AI and automation, its …
ytc_UgxVOYdV7…
Until your brain is digitally cloned and you become a window cleaner, NBC in gta …
ytr_Ugwrv-80O…
Comment
It is extremely dangerous, and I feel that this discussion has not really grasped how much so. Regulation? AI will always find a way round it. Weapons will be (are being) designed by it, wars will be fought by it. It will take decisions in milliseconds, and the experts will not be able to follow its reasoning, but will just do what it says (or AI will do it for them, probably). Will it fight our wars for us, or will the wars become the wars of AI? That, surely, is where the existential threat lies?
Source: youtube · Topic: AI Governance · Posted: 2023-06-13T17:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytr_UgyLz_SYDbl15lKzAGd4AaABAg.9pWfJRfGDQY9pWjUFRq4QI","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytr_Ugz5cGMHLoKEApqHnhl4AaABAg.9pWeKAF6mjO9pWmEXtxl2P","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytr_Ugz5cGMHLoKEApqHnhl4AaABAg.9pWeKAF6mjO9pXV1QgiSqJ","responsibility":"user","reasoning":"virtue","policy":"industry_self","emotion":"outrage"},
{"id":"ytr_Ugywi9ivsNsjeRpGG7p4AaABAg.9pW_gx9y5gj9pXhDNJsjgL","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytr_UgwZ051mj8S6EKP92m54AaABAg.9pWRudbogXj9pXiVrZ2ZJy","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytr_UgwZ051mj8S6EKP92m54AaABAg.9pWRudbogXj9qQiHX7JVWp","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytr_Ugx0qmuita12sozuOld4AaABAg.9pWAp5ubrGm9pXZlOlq-yS","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"mixed"},
{"id":"ytr_Ugx0qmuita12sozuOld4AaABAg.9pWAp5ubrGm9pXlZpMvaJf","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytr_UgyhHQAPj8mqCuF_HG14AaABAg.9pVzSReIfrw9pWp6hZ-Apg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytr_UgznmX6mCwSQ2mHFEc54AaABAg.9pVzJQnSe-49qudDmrleLy","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
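The raw response above is a JSON array in which each object carries a comment ID plus the four coding dimensions, and the "Look up by comment ID" view resolves a single entry from it. A minimal Python sketch of parsing and indexing such a response, with the allowed label sets inferred only from this one sample (the real codebook may define more labels):

```python
import json

# Two entries copied verbatim from the raw response above.
RAW_RESPONSE = """
[
 {"id":"ytr_UgwZ051mj8S6EKP92m54AaABAg.9pWRudbogXj9pXiVrZ2ZJy",
  "responsibility":"ai_itself","reasoning":"consequentialist",
  "policy":"unclear","emotion":"fear"},
 {"id":"ytr_UgwZ051mj8S6EKP92m54AaABAg.9pWRudbogXj9qQiHX7JVWp",
  "responsibility":"developer","reasoning":"deontological",
  "policy":"regulate","emotion":"approval"}
]
"""

# Labels observed in the sample response; anything else is flagged.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "user", "developer"},
    "reasoning": {"unclear", "consequentialist", "virtue", "deontological"},
    "policy": {"unclear", "none", "regulate", "industry_self"},
    "emotion": {"indifference", "fear", "outrage", "mixed",
                "resignation", "approval"},
}

def index_codings(raw: str) -> dict:
    """Parse the JSON array and index codings by comment ID,
    validating every dimension against the allowed label sets."""
    by_id = {}
    for row in json.loads(raw):
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(
                    f"unexpected {dim} label in {row['id']}: {row.get(dim)!r}")
        by_id[row["id"]] = {k: v for k, v in row.items() if k != "id"}
    return by_id

codings = index_codings(RAW_RESPONSE)
print(codings["ytr_UgwZ051mj8S6EKP92m54AaABAg.9pWRudbogXj9qQiHX7JVWp"]["policy"])
# → regulate
```

Validating against a fixed label set at parse time is what catches a model drifting off-schema (e.g. inventing a new emotion label) before the coding result is stored.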