Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (truncated previews with comment IDs):

- "chatgpt told me i'm a genius. who am I to argue with it? It's a genius! - WW…" (ytc_UgzH_X1qo…)
- "I am so happy over 2 million people are subbed to you. I despise AI. You are a w…" (ytc_UgwJtdmjy…)
- "The guy thinks that drawing is like an RPG perk that automatically gives you the…" (ytc_Ugyqpu73N…)
- "Umm, did you read what he wrote? He is still offering to give feedback to anyone…" (rdc_nu0u68n)
- "Wait until the government catches up, so much of the bureaucracy is subject to A…" (ytc_UgzcmAP6-…)
- "No, AI will *not* take all the jobs. Not even close. Everybody just needs to tak…" (ytc_Ugy6gYAYW…)
- "u can have recession u can t have mass wipeout ....why.....cos no matter how muc…" (ytc_UgyEaDCxG…)
- "In costumer service we are specifically told that what we say reflects on the co…" (ytr_Ugzt6Pph8…)
Comment
The predictions of what ASI will do in the future are meaningless. The hubris of humanity thinking we are alone in the universe inserts itself at this point. We are not alone, we are not the first to face this daunting future, we are also not going to be allowed to create an intelligence that becomes a digital 'God'. The total amount of human capital that has already been put into AI up to this point is greater, by far, than any other endeavor mankind has participated in (other than WWII which we will eclipse financially next year). As a people, why are we so desperate to give our lives over to some 'alien' intelligence? Hint: think tower of Babel.
youtube · AI Governance · 2025-09-05T00:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyCb-uk-VCqg-vhNiN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgyXljJJTVVfRx-MfJN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwvW3Nld41qgIhCcQd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxHT5EdCGJZwGXhCgh4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwzBDv8Aded4KZbLEh4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgztMHfoipBgtu9cb654AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugx4FUrrI04c5LO6r1R4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwMqMFSjjxOQFoGsrJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugx1kh2ue-UQ8crXe_R4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgzadK_GolPKxGUoPDF4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"}
]
```
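The raw response is a JSON array with one object per comment, carrying the four coding dimensions shown in the table above. A minimal parsing-and-validation sketch, assuming this structure; the allowed value sets below are inferred only from the labels visible in this response and may be a subset of the full code book:

```python
import json

# Allowed values per dimension, inferred from the raw response above
# (assumption: the real code book may define additional categories).
ALLOWED = {
    "responsibility": {"none", "company", "ai_itself", "distributed"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed"},
    "policy": {"none", "regulate", "liability", "unclear"},
    "emotion": {"outrage", "indifference", "fear", "approval", "resignation"},
}

def parse_codes(raw: str) -> dict:
    """Parse a raw LLM coding response into {comment_id: codes},
    rejecting any value outside the expected categories."""
    out = {}
    for row in json.loads(raw):
        cid = row["id"]
        for dim, allowed in ALLOWED.items():
            if row[dim] not in allowed:
                raise ValueError(f"{cid}: unexpected {dim} value {row[dim]!r}")
        out[cid] = {dim: row[dim] for dim in ALLOWED}
    return out

# Usage with a hypothetical one-row response:
raw = ('[{"id":"ytc_example","responsibility":"none","reasoning":"mixed",'
       '"policy":"none","emotion":"resignation"}]')
codes = parse_codes(raw)
print(codes["ytc_example"]["emotion"])  # resignation
```

Validating against a fixed value set catches the most common failure mode with coded LLM output: the model inventing an off-schema label that would otherwise pass silently into the dataset.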