Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "So Jordan Peterson is an AI? That actually tracks. If you define intelligence is…" (ytc_UgzWDKO3X…)
- "This was kind of inspiring, although weird. So, AI is going to annihilate us, bu…" (ytc_UgyIuh_Dz…)
- "Is AI being smarter than humans necessarily a problem? Some humans are smarter t…" (ytc_Ugz2na1of…)
- "7:26 would you hold a Swiss army knife accountable? We barely have what would b…" (ytc_UgwJAGJMG…)
- "no it wont, the issue is if AI and robots do everything then who is the capitali…" (ytc_Ugxwl-gqD…)
- "@515ventures3 you are right :) but imagine overreliance on ai will give it mo…" (ytr_UgybgzILA…)
- "Yet people think AI was take over and rule us all .... It's a tool but you can't…" (ytc_UgzepaYWK…)
- "CAN WE SAY, “THE RISE OF THE MACHINES”? CAN WE ALL SAY “THE TERMINATOR?” THEY W…" (ytc_Ugxx-VX5k…)
Comment
The one sigular caveat to super intelligence and the main reason why I don't believe it's POSSIBLE. Is that humans definitively built the pyramids. In 2025 no AI no human can build a pyramid. If an AI could build pyramid equal to or better than pyramid of Giza, I'll believe it is indeed 'super intelligent' until then, it's all hype to me
youtube · AI Governance · 2025-09-04T23:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgyBgA8eg3QfPQNdxhV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwJ3_vaZfStjyuTxcV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugyr9syJi8XZ2nLbYtV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugymqa81kwy19MFsOUF4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyazVQDL2om0VrIXSV4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyMgOApBDt7-1X3BXR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxtYknA8AwPvtAM-DJ4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwxOqyP1K_RWjWc6w94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugw-9_jFvvNfkuWbH854AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzDw4WNNK-6v5pu8Wh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
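The raw response above is a JSON array of per-comment records, one object per comment ID, with the four coding dimensions from the result table (`responsibility`, `reasoning`, `policy`, `emotion`). A minimal sketch of how such a response could be parsed and sanity-checked is below; the allowed value sets are inferred only from the values visible on this page, so the real codebook likely contains more categories, and the `parse_coding_response` helper is a hypothetical name, not part of any tool shown here.

```python
import json

# Allowed values inferred from the records shown above; the actual codebook
# may define additional categories (this is an assumption, not the real schema).
ALLOWED = {
    "responsibility": {"ai_itself", "company", "government", "none"},
    "reasoning": {"consequentialist", "deontological"},
    "policy": {"none", "regulate", "liability"},
    "emotion": {"indifference", "approval", "resignation", "outrage", "fear"},
}


def parse_coding_response(raw: str) -> list[dict]:
    """Parse the model's JSON array, keeping only well-formed records.

    A record is kept when its id looks like a YouTube comment/reply id
    (ytc_/ytr_ prefix, as in the samples above) and every coding dimension
    holds a recognized value.
    """
    records = json.loads(raw)
    valid = []
    for rec in records:
        if not rec.get("id", "").startswith(("ytc_", "ytr_")):
            continue  # skip records with unexpected id formats
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(rec)
    return valid


# Example: the first record from the raw response above.
raw = (
    '[{"id":"ytc_UgyBgA8eg3QfPQNdxhV4AaABAg",'
    '"responsibility":"ai_itself","reasoning":"consequentialist",'
    '"policy":"none","emotion":"indifference"}]'
)
print(parse_coding_response(raw)[0]["emotion"])  # indifference
```

Filtering rather than raising on unknown values is a deliberate choice here: LLM coders occasionally emit off-codebook labels, and dropping those records keeps downstream tallies clean while the rejects can be logged for re-coding.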