Raw LLM Responses
Inspect the exact model output for any coded comment: look up a comment by its ID, or browse the random samples below.
Random samples
- ytr_UgxFDHOjr… — @stewartjensen2253 look at the robot feet the robot feet is not touch in the …
- ytc_UghNEDqYU… — the video was cool, but no clickbait was needed... it could have just been: "an …
- ytc_UgxVC5bP0… — I need fellow artists to understand that people genuinely think they're entitled…
- ytc_UgxcK6r2o… — You 100% would need to have a driver in these trucks even if the truck did most …
- ytc_UgwSet7r0… — I feel like AI is increasing the skill floor of a lot of fields. The best are st…
- ytr_UgwYpOvdB… — @4doorsmorewhores34 yeah I'm well aware. It's a lot easier to stick around a…
- ytc_UgxoTkFV7… — The first five minutes did an infinitely better job describing the potential dan…
- ytc_UgyomhAJd… — Speaking as an AI expert who has worked for a market leader...I can say that his…
Comment
Agentic AI, HERE NOW, is already artificial super intelligence. You have one agent to manage the work group, then you have several other agents doing the work, that thing in the aggregate is superintelligence. each agent can be of average human intelligence. if you have half a dozen people in a room, their collective intelligence is more than any individual's intelligence. We already have agents who surpass human intelligence in their target field. gather them together in a group. ASI. Suppose your field is playing GO, or playing CHESS, or taking standard math exams to name a few. ASI is already here.
youtube · AI Governance · 2025-11-15T00:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgyNQriovYRslFK1KXh4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugym5BL5pVKNuuaMzm94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyN0py9lKHKssW0J9J4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgytatUjUliz3BuEC3p4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzJUz-m9Bv9vFRhVE54AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxhR1xLelCKmvY8TGh4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzS4AXw33mQstsa1-Z4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgxZ-6VzPHArZxxLz7x4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgzFJ2WI0XVX2-WcaWV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx_j2neY0Vhwk5_hWB4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"}
]
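The coding table and the raw response above jointly imply a fixed schema: each record carries an `id` plus four categorical dimensions. A minimal sketch of validating such a response in Python, assuming the allowed labels are exactly those observed in this sample (the real codebook may define more):

```python
import json

# Allowed labels per dimension, inferred from the sample output above
# (assumption: the actual codebook may contain labels not seen here).
SCHEMA = {
    "responsibility": {"developer", "company", "distributed", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "liability", "ban", "none"},
    "emotion": {"fear", "outrage", "approval", "indifference", "mixed"},
}

def validate_coding(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed records.

    A record is kept if it is a dict with an "id" field and every
    schema dimension holds one of the allowed labels.
    """
    records = json.loads(raw)
    valid = []
    for rec in records:
        if not isinstance(rec, dict) or "id" not in rec:
            continue
        if all(rec.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            valid.append(rec)
    return valid

# Example with one well-formed record and one off-schema record:
raw = (
    '[{"id":"a","responsibility":"none","reasoning":"unclear",'
    '"policy":"none","emotion":"indifference"},'
    '{"id":"b","responsibility":"alien","reasoning":"unclear",'
    '"policy":"none","emotion":"mixed"}]'
)
kept = validate_coding(raw)
print([rec["id"] for rec in kept])  # only the on-schema record survives
```

Filtering rather than raising keeps a single malformed record from discarding an entire batch, which matters when responses are coded in bulk as they are here.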