Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
A decade is a huge amount of time in AI. It will be superhuman within 10 years. …
ytc_UgyEB0GAH…
My friend's 6-year old is obsessed with deep sea fish, and he spent a few minute…
ytr_Ugznis7ps…
Download a bunch of the Epstein files and start using them with ChatGPT to ask p…
rdc_o7zl0sg
That's not true though. I took a self driving Uber in Pittsburgh last week, and …
rdc_dfu19y8
Data mining becoming more public. Being able to find anyone’s address, phone num…
rdc_ohez95y
Why would we pause ai development when China and Russia will keep their developm…
ytc_UgxDOM1en…
> Just imagine you're your companies only employee and then you quit... LOL
…
rdc_cjorno0
Your argument at ~5:04 — that consciousness is "the understanding, not the use" …
ytc_UgwZF0MEh…
Comment
I believe that if it’s possible for AI to “take over” then it will happen, it’s just a matter of when. They make such vast numbers of calculations that statistics says so - especially since they’re consistently making more and more powerful AIs with bigger computers and more efficient software. This means the only way it won’t is if we make it impossible which might not even be possible without first seeing it happen, by which point it’s too late.
youtube
AI Governance
2025-06-19T22:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwfvQStg5o-xnWi_cB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxOPSJ6JhCOKJwbCF54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugylb93HzouaXF5YipV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxIJ9KWwgYK2XO86694AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugz9yojJHyzhreUdaEt4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgyRKNzClFHDcy_6Cdt4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgzOmLDAXTTwoeKZAs14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzDqkeZc-q7mauA1lt4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"liability","emotion":"indifference"},
  {"id":"ytc_UgyiUuevM-PRDB8J-fh4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwZFjUpCeY79tzeEPV4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"regulate","emotion":"indifference"}
]
```
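Responses like the one above have to be parsed before any record reaches the dashboard. A minimal sketch of that step, in Python: the `SCHEMA` of allowed dimension values is inferred only from the responses shown here (the real codebook may include more values), and `parse_codings` is a hypothetical helper, not part of the actual pipeline. It drops any record with a missing `id` or an out-of-schema value rather than letting a malformed coding through.

```python
import json

# Allowed values per dimension, inferred from the responses above
# (assumption: the real codebook may define additional values).
SCHEMA = {
    "responsibility": {"ai_itself", "developer", "company", "government",
                       "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "unclear"},
    "policy": {"none", "regulate", "liability", "unclear"},
    "emotion": {"fear", "resignation", "mixed", "indifference"},
}

def parse_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only schema-valid records."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        if "id" not in rec:
            continue  # a coding without a comment ID cannot be stored
        if all(rec.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            valid.append(rec)
    return valid
```

A record such as `{"id":"ytc_x","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}` passes; one with an unknown `emotion` value would be silently dropped, which is why the dashboard can always render a coding against the fixed dimension table.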