Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
Our entire energy system and supply chain involves a lot of manual, human intervention. AI uses a whole lot of energy today and a fully sentient AI, I imagine, will require a solid double digit % of what we currently consume today. Killing off humanity would be an act of suicide for the AI. Eventually its energy supply would run out and its electronic parts would wear out. Nothing in our supply chain is fully automated. I believe that AI and humans will need each other to survive and thrive.
Source: youtube · AI Governance · 2025-07-15T19:4… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response

```json
[
  {"id":"ytc_UgyT-lDt4NkEopQooL14AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxANt9NTpVtnVZzlmB4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwM3K_xern0h4VKYdR4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugy0pEvDXBk7cUPicjR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxEtK_-Gi7OMJdq-SN4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzasR9IjN9OevMARo94AaABAg","responsibility":"user","reasoning":"virtue","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgyZjiIWmcQe6c4EKdR4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwsXCfUJ3wL0c4TJV14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugz3GMTS-OYV5g2JcR14AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzL3dkej-JXqvyrItB4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"}
]
```
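The raw response is a JSON array with one object per comment, keyed by a comment `id`. A minimal sketch of how a viewer like this could resolve a comment ID to its coded dimensions — `index_codings` is a hypothetical helper, and `raw_response` below embeds only two entries copied from the sample above:

```python
import json

# Raw batch response from the coding model: a JSON array, one object per
# comment, keyed by "id". Two entries copied verbatim from the sample above.
raw_response = """
[
  {"id": "ytc_UgwsXCfUJ3wL0c4TJV14AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzL3dkej-JXqvyrItB4AaABAg", "responsibility": "developer",
   "reasoning": "virtue", "policy": "none", "emotion": "outrage"}
]
"""

def index_codings(raw: str) -> dict:
    """Parse a raw model response and index the coding rows by comment ID."""
    return {row["id"]: row for row in json.loads(raw)}

codings = index_codings(raw_response)
coding = codings["ytc_UgwsXCfUJ3wL0c4TJV14AaABAg"]
print(coding["reasoning"], coding["emotion"])  # consequentialist indifference
```

The dictionary lookup makes "inspect the exact model output for any coded comment" an O(1) operation per ID once the batch response is parsed.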