Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:
- Note that the car moving over was signaling. Tesla FSD cannot even detect a turn… (ytc_UgyweeO4A…)
- Due to Robotics and Automation businessmen have become selfish and ruthless endi… (ytc_Ugw4HawHf…)
- Actually its not behaving like its not human but repeating what humans have said… (ytc_UgzYxHJCC…)
- Honestly most people can learn how to do just about anything simple. A lot of pe… (ytc_Ugza0ON_y…)
- Bridge players think we are better than AI at the highest level. Which is a litt… (ytc_UgwH3egAx…)
- I feel like I'm the only person that thinks silicone takes away from the illusio… (ytc_UgwVP3N3l…)
- Searches on Youtube: what is an algorithm? Youtube: Guy searches on google what … (ytc_Ugx3e8BSt…)
- Just remember this is the guy playing around with AI. Ever think he has nuerolin… (ytc_UgwKMuHVv…)
Comment
For the sake of argument, let's say there is "super intel" AI. In order for that AI to make things, it will need to mine resources. So the simple solution to defeat the AI is for humans to destroy mines so humanoid robots can't operate them. Do that enough times until balance of power shifts back to humans. There will always be ways to defeat AI machines because AI machines need resources to make things.
youtube · AI Governance · 2025-09-04T12:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_Ugws31M3yD0g2EFR0P94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwLZPY3H2YXFksQhIF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzOEgg0zzVdo7nnzmh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgzThdn6MpE-uHgBS894AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxVAr5aLPlfmskOtyp4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy_vBkG3jEzs-0bvGB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwRTMEnWIrea5X3qXx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugx28batO97SO5wvOER4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"industry_self","emotion":"outrage"},
{"id":"ytc_UgxiL69Si4te5t04z994AaABAg","responsibility":"unclear","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UgytFzT8EEKi8R_Gh1F4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
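A batch response in this shape can be indexed by comment ID for lookup, which is what the inspector above does per comment. A minimal sketch, assuming the response is a JSON array of objects keyed by `id` as shown; the `index_codings` helper and the truncated two-entry `raw_response` are illustrative, not part of the actual tool:

```python
import json

# Illustrative raw response: the first two entries from the batch above.
raw_response = """
[
  {"id": "ytc_Ugws31M3yD0g2EFR0P94AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgwLZPY3H2YXFksQhIF4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "mixed"}
]
"""

def index_codings(raw: str) -> dict:
    """Parse a batch coding response and index each entry by its comment ID."""
    return {entry["id"]: entry for entry in json.loads(raw)}

codings = index_codings(raw_response)
coding = codings["ytc_Ugws31M3yD0g2EFR0P94AaABAg"]
print(coding["responsibility"], coding["emotion"])  # ai_itself fear
```

Indexing once into a dict makes each subsequent ID lookup O(1), rather than rescanning the array per query.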