Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples — click to inspect
- "This is just the industrial revolution all over again. Just let the AI prolifera…" (ytc_Ugy6eK9ND…)
- "What can we do: 1. Move to a 32 hour work week with no loss in pay. (We are 400…" (ytc_Ugx4VTDpv…)
- "What? Thats not why people dont want their job replaced by AI because THEY DONT …" (ytc_UgwEm7pRQ…)
- "A robot drove itself around a wearhouse picking up heavy pallets of water and lo…" (ytc_Ugjy99-cl…)
- "Wasnt he community noted in twitter for his tirade against India and later just …" (rdc_o0n32ge)
- "id love to see any sort of ai install dry wall into groved doors and windows in…" (ytc_UgxzMMfSW…)
- "I love the \"oops, that's a clip of me doing a Google image search\" cause that's …" (ytc_UgzqRSzq7…)
- "@DrQuadrivium and let’s not discount the possibility that separate AI systems wi…" (ytr_UgyE5B-3t…)
Comment (youtube · AI Governance · 2026-04-18T16:4…)

> I'm stilling hanging in after 1:12:00. At this point, when Dr. Roman says that all we have to do is find the genome to eventually determine that we are capable of living forever -- AND his wanting to do so! -- I changed my perspective of the man. He's definitely highly intelligent and remarkably interesting, but to live forever is something that I consider highly unmanageable for eternity. Sometimes, I think that overly intelligent people spend to much thinking on trying to outthink the human spirit and its unknown and miraculous potential. Just my two cents... And, I'm all for AI safety! SB asks brilliant questions.
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | virtue |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyPnIRyehKYH1nrHeJ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxMLgWi2f--UFr5Bf94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwXuKeSV1J64c-_fYZ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_UgwcXAcKWmRrEi3cxnt4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzSC4CwJdPHnwDgnVd4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugz6-22jquack5R_qoV4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgxPcO6VlquF7xCh_d54AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgwexbnhbffvJOgSSkR4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugwd3Ht_tZVElpCQ2PJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
  {"id":"ytc_UgzDGweZcivDtEf12qR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
```
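A raw response like the one above can be checked before its records are accepted into the coding table. The sketch below is one way to do that, assuming the label sets inferred from the values visible on this page (the real codebook may allow more values); the function name `validate_codings` is hypothetical.

```python
import json

# Allowed labels per dimension, inferred from the values shown on this
# dashboard (an assumption -- the actual codebook may differ).
SCHEMA = {
    "responsibility": {"developer", "company", "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "contractualist"},
    "policy": {"regulate", "ban", "liability", "industry_self", "none"},
    "emotion": {"outrage", "fear", "mixed", "approval", "resignation", "indifference"},
}

def validate_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only records whose labels
    all fall inside the allowed sets."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        if all(rec.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            valid.append(rec)
    return valid

# Example: one well-formed record is accepted.
sample = '[{"id":"ytc_x","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"resignation"}]'
print(len(validate_codings(sample)))  # 1
```

Records that fail validation can then be queued for a retry or for manual coding rather than silently entering the dataset.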