Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- Hear hear. Love or hate Sen Sanders, this is a stark and serious warning. A warn… (ytc_UgwTQYjcT…)
- What is your argument here? The invention of machinery for factories during the … (ytr_UgwftQqWI…)
- 'The rise of Artificial intelligence', well prehaps we should consider a slightl… (ytc_UgwsnBZMP…)
- Wow, I was actually ahead of my time. About twenty-five years ago the parking st… (ytc_UgxTezW5h…)
- AI is already reducing the amount of developers companies need to develop an app… (ytc_UgywUe8rf…)
- 1:19:47 How do i lubricate my sex robot? I love a scientist with a sense of humo… (ytc_UgyZBKaVq…)
- Both can be true about good effects in AI and negatives . Scale of each scenario… (ytc_Ugw5cmG_X…)
- I'm more inclined to believe LLM will be a dead end in the path toward AGI, thes… (ytc_UgyTolRgY…)
Comment
i see the danger in that we will get pampled by AI too much. AI will ger smarter and smarter and we will be getting dumber and dumber. We will be leaning on AI too much and we will have no understanding of how AI works and no understanding how the things work that AI came up with.
You will get an answer to whatever you want right away. People will stop second guessing the answers given, or even think about the problem at all. You can ask for a picture/ a book / a movie.. there will be no need for us to work, to think, to create, to grow. Maybe there wont be a need for humans at all.
youtube · AI Governance · 2025-06-24T13:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgyolKgzen8ewYmRVg14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugy6alxdRnqQ1YvAk9F4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugz-2CcJGtGyGMNVgXZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwerruDXJiXyR6nTEF4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugzxjqv5GYYxWjLswZN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgyGjyK9dW5IcR3nRrt4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugzknxbakj5ngyG4oOx4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"industry_self","emotion":"mixed"},
{"id":"ytc_UgyvqZi5XV3wEAE2CU94AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugy674Yux2-5xrsDW254AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyUyZ5dL--3vEmddFR4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
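The batch response above is a JSON array of per-comment coding rows, each keyed by the platform comment ID. A minimal sketch of the "look up by comment ID" step might parse the raw model output and index it by ID; the `lookup` helper name is illustrative, not part of the tool, and only two rows from the sample above are included:

```python
import json

# Raw batch output from the coding model: one JSON object per comment,
# carrying the coded dimensions (responsibility, reasoning, policy, emotion).
raw_response = """[
  {"id":"ytc_Ugz-2CcJGtGyGMNVgXZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgyGjyK9dW5IcR3nRrt4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]"""

# Index the coded rows by comment ID for constant-time lookup.
coded = {row["id"]: row for row in json.loads(raw_response)}

def lookup(comment_id):
    """Return the coded dimensions for a comment, or None if it was not coded."""
    return coded.get(comment_id)

print(lookup("ytc_Ugz-2CcJGtGyGMNVgXZ4AaABAg")["emotion"])  # fear
```

In practice the raw string would come from the stored model response rather than a literal, but the parse-then-index pattern is the same.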