Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "I don't understand why they don't say it's an AI genreted art and sell it like t…" (ytc_Ugy_W-hcv…)
- "AI will take over the world. She wants a family but she can't YET? The moment th…" (ytc_UgwSWLrDm…)
- "Why worry about AI when forcing American workers to compete against nigh slave l…" (ytc_UgxB1DywU…)
- "Ai is like a new joinee whom you know is qualified to do the tasks but won't be …" (ytc_Ugw78rIAN…)
- "AI is not taking those things away. Those kids were not going to amount to much …" (ytc_UgzN3nH6d…)
- "What a question to ask a AI about What’s next? Asking warlords on their opinion…" (ytc_UgybMAVXN…)
- "Google: We're gonna force you to use AI by making it so you can't shut it off on…" (ytc_UgzP_TjEy…)
- "This gets to the heart of cognition. It used to be that 'smart' meant 'ability …" (ytc_UgzUFkjB_…)
Comment
We have had 30 COPs and have known a lot longer about climate change but cannot get governments and large corporations to agree to fix Climate Change, so what hope have we got of getting the same people to agree to regulate, manage or control AI which is growing at a extremely rapid rate. The short term goals of government and business do not align with the long term goals of human survival.
youtube · AI Governance · 2025-12-08T14:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgxMaONjkdb1fTusFJV4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugxh9C6p8IF8RDdukfl4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugx5YEXyK_Hr6lEJLsR4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugzx2-ovee5Jup-_e8F4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugx_CIJ9q7tQjMF-CjR4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugwi1-5o9A-XUfWD8Ox4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwgBDU_pygKKgdPzV14AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzJ1mRt3-Ff0N0E_gZ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwACHNuX4tdxvC9Fnt4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxzCxHEySWFj7Ef-Ql4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"}
]
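The "Look up by comment ID" view above can be sketched as a small join between the raw batch response and a comment ID. This is a minimal, illustrative sketch assuming only the JSON schema visible in the raw response; the `raw` string below excerpts two of the ten records shown, and the variable names (`records`, `by_id`) are ours, not the tool's.

```python
import json

# Excerpt of the raw batch response shown above (two of the ten records).
raw = '''
[
 {"id":"ytc_UgxMaONjkdb1fTusFJV4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"regulate","emotion":"fear"},
 {"id":"ytc_Ugxh9C6p8IF8RDdukfl4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}
]
'''

records = json.loads(raw)

# Index the batch by comment ID so a single coded result can be fetched directly.
by_id = {record["id"]: record for record in records}

result = by_id["ytc_Ugxh9C6p8IF8RDdukfl4AaABAg"]
print(result["responsibility"], result["emotion"])  # government outrage
```

Because each batch response is a flat JSON array keyed by `id`, the same dictionary lookup works regardless of how many comments were coded in one request.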