Raw LLM Responses
Inspect the exact model output for any coded comment: look it up by comment ID, or pick one of the random samples below.
Random samples — click to inspect
- "What a great relief to listen to such a brilliant young lady with a lot of know…" (ytc_UgwqCxoZ_…)
- "yes I also noticed the efficiency of AI coding agents. I'm actually playing aro…" (ytr_Ugzv5PK9J…)
- "Ba ba Which is why this is a human-caused error, not a reflection of how good o…" (ytr_UgzXd3ZWg…)
- "The dangers (plural) of AI include the more complete knowledge of what motivates…" (ytc_UgxM30UQS…)
- "Made by Artist is better. made by AI kinda have no meaning also so ugly…" (ytc_UgzRO9WCw…)
- "I really think ai images should have something like a watermark that can't be ta…" (ytc_UgygVRoyB…)
- "Jesus is coming back, AI was outdated before it was even conceived and is no mat…" (ytc_Ugy5EEMgU…)
- "The creator of AI now warns about its dangers. Likewise, the discovery of metals…" (ytc_UgwaJLb1O…)
Comment
50:40 before Ilya and Roman calling out major issues with ASI safety, there was Hugo de Garis. Hugo was interviewed in the 2009 Ray Kurzweil documentary “Transcendent Man”. Hugo was the dissenter to Kurzweil’s optimistic Ai future. Hugo predicted the death of billions, primarily thru world alignments for and against ASI. The result is a devastating world war. Hugo coined the term Artilect which he thought was better than saying Ai, but it never caught on. He did write a sci-fi book called the “The Artilect War”
| Source | Topic | Posted | ♥ |
|---|---|---|---|
| youtube | AI Governance | 2025-09-05T03:1… | 1 |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
  {"id": "ytc_UgyOrZ_M_PMVAo0JyKN4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgyTuGSBUl5h69IAHJZ4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugypk2pN6fTst_VSLDp4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugw0P5hzgah8tf99fgV4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_Ugx10RkPeGg9cWqBpkV4AaABAg", "responsibility": "user", "reasoning": "virtue", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgwlRdhppltx_IqKU4p4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgzkTDkQ-awHVmJ7hP54AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "liability", "emotion": "approval"},
  {"id": "ytc_Ugy3T7MnAgI9agOfqtB4AaABAg", "responsibility": "developer", "reasoning": "virtue", "policy": "none", "emotion": "fear"},
  {"id": "ytc_Ugxj2WNCaKkfb6AyEEN4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgxcpfWkD8iyRWA3rW54AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"}
]
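The raw response is a JSON array keyed by comment `id`, which is what makes lookup by comment ID possible. A minimal sketch of that lookup in Python, assuming the exact field names shown in the response (only two of the ten records are reproduced inline for brevity):

```python
import json

# The model returns a JSON array of per-comment codes; two records from
# the response above are reproduced here for illustration.
raw = (
    '[{"id":"ytc_UgyOrZ_M_PMVAo0JyKN4AaABAg","responsibility":"none",'
    '"reasoning":"consequentialist","policy":"none","emotion":"indifference"},'
    '{"id":"ytc_UgwlRdhppltx_IqKU4p4AaABAg","responsibility":"none",'
    '"reasoning":"consequentialist","policy":"none","emotion":"fear"}]'
)

# Index the coded records by comment ID for constant-time lookup.
codes = {rec["id"]: rec for rec in json.loads(raw)}

record = codes["ytc_UgwlRdhppltx_IqKU4p4AaABAg"]
print(record["emotion"])  # -> fear
```

The same index supports all four coded dimensions (`responsibility`, `reasoning`, `policy`, `emotion`), so one lookup recovers the full coding-result row for a comment.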