Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples:

- `ytc_UgyMuuNnS…` — "Computer robots and AI will "Farm" people. Work, school, home. Absolute slavery…"
- `ytc_UgzAZe3ev…` — "Lav. I almost never comment but it breaks my heart to see people insult trying t…"
- `ytc_UgzkiBzD5…` — "UBI ? AI Alignment ? Anguilla AI is the accidental UBI Prototype MVP, domain sal…"
- `ytc_UgwjoOZfZ…` — "Every micromovement and presentation is programmed. AI will only "take over the …"
- `ytc_UgzHubQ8J…` — "Yea, so we will now have the AI trained on a binary dataset, one with images tha…"
- `ytc_Ugz1CwGmk…` — "Who cares if it turns out ineffective, the whole edu system is already ineffecti…"
- `ytc_Ugx416POu…` — "The police and everything else would come after me if they saw my AI chats becau…"
- `rdc_oi2ewaf` — "Haven't you noticed the AI chatbot that takes up half the screen in every goddam…"
Comment
@SuperTemich2005 F-22's still need human ground crew to fuel and rearm them though, so while they'd be a threat for the brief time they're taken over via remote hack (if that's even possible), they would run out of either fuel or ammo in short order and become a non-threat. Again, without humans doing almost all of the work nothing can get done in the world as it currently is, so even if an ASI did come into being, it would not survive without us, much less be able to operate physically in the world for a prolonged period of time. I don't think we have to worry about ASI while we still don't have 100% automation as an option for anything industrial or military 😊
youtube · AI Governance · 2025-08-27T01:2… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:59.937377 |
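A coded record like the one above can be checked against the coding scheme before it is stored. The sketch below is illustrative only: the allowed label sets are assumptions inferred from the values visible on this page, not an authoritative schema, and `validate_record` is a hypothetical helper.

```python
# Minimal validation sketch. ALLOWED is an assumption reconstructed from the
# dimension values visible in this page's coding results, not a real schema.
ALLOWED = {
    "responsibility": {"ai_itself", "government", "company", "developer", "unclear"},
    "reasoning": {"consequentialist", "mixed", "unclear"},
    "policy": {"none", "regulate", "liability", "unclear"},
    "emotion": {"fear", "mixed", "outrage", "resignation", "indifference"},
}

def validate_record(record: dict) -> list[str]:
    """Return a list of problems; an empty list means the record looks valid."""
    problems = []
    if "id" not in record:
        problems.append("missing id")
    for dim, allowed in ALLOWED.items():
        value = record.get(dim)
        if value not in allowed:
            problems.append(f"{dim}={value!r} not in allowed set")
    return problems
```

Running this on the record shown in the table (`ai_itself` / `consequentialist` / `none` / `resignation`) returns an empty list.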
Raw LLM Response
```json
[
{"id":"ytr_UgzVV7sqMNfyHNBE9HN4AaABAg.AMISDRGbPv6AMITfx7m3x-","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytr_UgzVV7sqMNfyHNBE9HN4AaABAg.AMISDRGbPv6AMIo5wbtxaH","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytr_UgzVV7sqMNfyHNBE9HN4AaABAg.AMISDRGbPv6AMIr3GHg6Jp","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytr_UgwZ0YqMbvvnWd3dP8h4AaABAg.AMIQn_XuqVbAMJDM_W6vcz","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytr_UgwZ0YqMbvvnWd3dP8h4AaABAg.AMIQn_XuqVbAMJFs4HL8V0","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytr_UgwZ0YqMbvvnWd3dP8h4AaABAg.AMIQn_XuqVbAMK8SU4PBIK","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytr_Ugws38Dp8DfrtcM7b194AaABAg.AMIPmsGfXAnAMIl3p_dSVF","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytr_UgwBvlYr2UcOhyTnDKF4AaABAg.AMIP9ac57itAMIp_absgEu","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytr_UgxnIFUI0T_CoYa8uwN4AaABAg.AMIOPUHnCsVAMKmXBitGjR","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytr_UgxhQ46PziHBHmXw99t4AaABAg.AMIMxqpcIu5AMUOXRxziZi","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"}
]
```