Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a comment by its ID.
Random samples
- "The FLOOR floor" @44:50. That's what that delusional, self-absorbed, sociopathi… (ytc_UgwaYcDs6…)
- Anton, AI+UBI = Anguilla's AI domain name sales funding 50% government and over … (ytc_Ugx8xqQ7x…)
- Think of it in reverse. If we started out with robot cars, but then wanted to al… (ytc_Ugjb5mbCY…)
- 35:53 Bro, no; *I am confident it is not having internal experiences.* Conscio… (ytc_Ugw9dgpge…)
- Most of the media and General opinion of ai that it will destroy humanity and si… (ytc_UgxwwqEGl…)
- Keep up the automation! We need to get rid of servers next then Uber/Lyft driver… (ytc_UgxWMdM3f…)
- AI just openly telling us its going to destroy us is wild. Also where are the so… (ytc_Ugy-vEoNG…)
- This is sad, they keep trying to take our jobs away and they actually think AI w… (ytc_UgxY5k1mw…)
Comment
Jotted down the highlights with TinaMind:
3:04 - 3:42: We are facing a critical safety gap where AI capabilities are advancing exponentially, but we still have no proven way to ensure these systems remain aligned with human preferences.
10:07 - 11:37: AGI is predicted to arrive by 2027, potentially leading to 99% unemployment as AI replaces humans in almost all cognitive and physical occupations.
18:56 - 20:01: Super-intelligence is inherently unpredictable; by definition, a human mind cannot comprehend or forecast the actions and technology of a much smarter agent.
25:56 - 27:04: AI represents a paradigm shift as the "final invention"—it is not just a tool, but a replacement for the human mind that will automate all future research, science, and even ethics.
Source: youtube · Topic: AI Governance · 2026-02-18T13:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgxkeNDMXtbPijAttV94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwTPoh-QPj3qPPkkyx4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz1UmXow4W-2336UO14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzbdvGDZ67AohYivOd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxH1Ba9ufiwtUJkPPF4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyzVbxuWfYt_2-wmPd4AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzyQrLssUQmiXhJhWl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyGqulGF4_qvNvD1d54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxITGIhQwHBneNT2Cp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgypAAVXd0Tm7mwtt1d4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"resignation"}
]
```
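
The raw response is a JSON array with one record per comment ID, carrying the four coded dimensions shown in the table. A minimal sketch of how such output might be validated before storage, assuming the category vocabularies are those observed in this sample (the real codebook may define more values, and `validate_codes` is a hypothetical helper, not part of the tool):

```python
import json

# Allowed values inferred from the sample above; assumption, not the
# authoritative codebook, which may include additional categories.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "company", "developer"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed"},
    "policy": {"none", "regulate"},
    "emotion": {"fear", "indifference", "resignation", "outrage"},
}

def validate_codes(raw: str) -> list[dict]:
    """Parse a raw LLM response; keep only rows with a YouTube-comment
    style ID and in-vocabulary values on every dimension."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        if not row.get("id", "").startswith("ytc_"):
            continue  # malformed or hallucinated comment ID
        if all(row.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(row)
    return valid

sample = (
    '[{"id":"ytc_UgxkeNDMXtbPijAttV94AaABAg","responsibility":"none",'
    '"reasoning":"consequentialist","policy":"none","emotion":"fear"}]'
)
print(len(validate_codes(sample)))  # 1
```

Filtering out out-of-vocabulary codes at ingest keeps downstream counts (e.g. the emotion distribution) clean even when the model occasionally invents a label.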