Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- If we can guide it no problem, the problem is that people assume that A.I will ha… (ytr_Ugxu69rP8…)
- AI is being developed as a way to increase wealth disparity at the expense of th… (ytc_UgyGzuwAA…)
- AI is a huge bubble, they already can't find enough chips and memory to power it… (ytc_UgyqA3Q96…)
- 13:14 about this point: I started drawing on my dad's old magazines, and I n… (ytc_Ugywm5Ad0…)
- From within any company it looks different. You build something, but it requires… (ytc_Ugx0weQiX…)
- To everyone with a depressed outlook. We're right at the start of dealing with t… (rdc_emoamhs)
- How do AI bros literally talk and act like cartoon villains and have zero self a… (ytc_UgymHluDn…)
- just adding so as to not edit my original comment, "stealing the art back" doesn… (ytr_UgylJuJ56…)
Comment
Exceptional guest, but his logic fell apart when he started critiquing life simulation based on things only learnt inside it, and also trying to understand what he already said is beyond his or anyone's comprehension. The use of words like "real" and "fake" doesn't help much, because these words don't hold much meaning when referring to simulations while under the influence of the inherent human view that simulations are automatically fake.
youtube
AI Governance
2025-11-26T15:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgxC7E94bwHBdemP61V4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxFbEUZmSdUJzWVSpx4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgziLbbrXmFpL_KBx0B4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy60y1g8fgxQZPmS0x4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugy7UzEtk_vrVTxuN4B4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgxtqBh6ANlBYpr_dTF4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgzXPFSvmxdY1Fu1OWp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzPlz2yoRXBa__aEvR4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgzwiLQYI1Y2OPUMNV14AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_Ugx0BL08YPa9UAkpMxF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
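The raw response above is a JSON array of per-comment codings. A minimal sketch of how such a payload might be parsed and sanity-checked is below; the allowed values per dimension are only those observed in this sample (the full codebook may contain more codes), and the function name is hypothetical.

```python
import json

# Codes observed in this sample response; the actual codebook
# may define additional values (assumption).
OBSERVED = {
    "responsibility": {"none", "user", "government", "developer", "ai_itself"},
    "reasoning": {"virtue", "consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none", "regulate", "industry_self", "unclear"},
    "emotion": {"fear", "approval", "indifference", "mixed"},
}

def parse_codings(raw: str) -> dict:
    """Parse a raw LLM coding response into {comment_id: {dimension: value}}."""
    out = {}
    for rec in json.loads(raw):
        cid = rec.pop("id")
        for dim, value in rec.items():
            # Flag any value outside the observed code set for review.
            if dim in OBSERVED and value not in OBSERVED[dim]:
                raise ValueError(f"{cid}: unexpected {dim}={value!r}")
        out[cid] = rec
    return out

raw = ('[{"id":"ytc_UgxC7E94bwHBdemP61V4AaABAg","responsibility":"user",'
       '"reasoning":"virtue","policy":"none","emotion":"indifference"}]')
codings = parse_codings(raw)
print(codings["ytc_UgxC7E94bwHBdemP61V4AaABAg"]["emotion"])  # indifference
```

Raising on an unknown code (rather than silently accepting it) makes malformed or hallucinated LLM output visible at ingest time instead of corrupting downstream tallies.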