# Raw LLM Responses

Inspect the exact model output for any coded comment, or look a comment up directly by its ID.
## Random samples

- "It’s not AI it’s idiot CEOs having no fundamental real business acumen or human …" (ytc_Ugy8HVxH-…)
- "I'm calling it: AI "art" in all fields (not just visual arts) is gonna entirely …" (ytc_UgxoZVFyT…)
- "@BugIBottomtopfr it's always the same one video freaking out over AI art withou…" (ytr_Ugw64-FXV…)
- "At least it helps to prioritize the CXR, so Radiologist reads and reports the p…" (ytc_Ugyh0D53s…)
- "This is done intentionally so that people can be programmed to believe AI videos…" (ytc_UgyB_ECWl…)
- "Stop...anyone who thinks AI will take over in a few years or only 5 jobs that wi…" (ytc_UgxR6Yhcu…)
- "AI companies need a story to tell investors about why they lost all their money.…" (rdc_nip9n2y)
- "If you think that the 1% won't end up owning even more of the global wealth beca…" (ytr_Ugwhp5CF0…)
## Comment

> We (and by this I mean, those who are paying attention) knew the Singularity would/will arrive at some not too distant day. The real issue is the day after it arrives. Exponentially, humanity will lose control of ai at that point. This is why Elon has been pushing ai/human interface technology. The only way for humanity to survive the Super Mind is to meld with it. The question now is, does Neurolink have enough time to prepare?

- Source: youtube
- Topic: AI Governance
- Posted: 2023-03-30T12:1…
## Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
## Raw LLM Response

```json
[
{"id":"ytc_Ugw2fk2Kp2ZS__IEk3d4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwklHhmSyIKki-dGxd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"},
{"id":"ytc_UgzFl5Qnvd4D2e-9RJV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxBbjHdNEXCBjidVfZ4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzOEgFpGb1O8aFy9l14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugyn8vRMgBLNpaTFoeJ4AaABAg","responsibility":"government","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxcQDx4AdRgBphhK2V4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwpIwmMStqU8q1B78x4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwKdCGtVWu6CoEWmrl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzWqGaASr6elrTgeeR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"}
]
```