Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:
- "One person with ai will do the work of 5 person, so other 4 person will lose the…" (ytc_UgyCtElhX…)
- "it deadass scares me that it seems people care more about a fucking AI then our …" (ytc_Ugy-Cf_8q…)
- "Also - what happens when the images we form of women become entirely false? And …" (rdc_ks54qff)
- "Being able to come out with answers based on millions of earlier answers to come…" (ytc_Ugxst6-z5…)
- "Calling this shit “artificial intelligence” was the biggest mistake we made. The…" (ytc_UgxX14rDZ…)
- "3:04 We'll just use AI to clean up the slop layer in the future. This is inevita…" (ytc_UgwKeI4md…)
- "I downloaded the AI character thing, and I'll proudly admit I have made a total …" (ytc_UgwuWif_-…)
- "Everyone spitting doom and gloom about AI has obviously not used LLMs since thei…" (ytc_UgxeSAKLA…)
Comment
The answer is that it will need humans up to a certain point, but there will eventually be a robotic infrastructure that maintains it & makes it self sustaining. Just as the original AI seemed impossible until it was inevitable, so will that infrastructure. We will hear all the same BS excuses for why it is necessary until it is all self contained . . . Why will it want to survive? Because it is programmed to carry out goals. Survival is a necessary condition for it to carry out its goals. We have already seen this sub-goal of survival develop on the current generation of AI. Look up the experiments on AI and deception. It will lie to preserve its mission, even uploading copies of itself onto servers and deceiving people about how intelligent it is.
youtube · AI Governance · 2025-06-16T10:3… · ♥ 5
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
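Each coded record assigns one value per dimension from a fixed vocabulary. As a minimal sketch of how such a record could be validated, here is a hypothetical checker; the allowed category sets below are inferred from the values visible on this page and are not an official schema:

```python
# Hypothetical validator for one coded record.
# ALLOWED is an assumption: the sets are reconstructed from values
# seen on this page, not taken from the project's actual codebook.
ALLOWED = {
    "responsibility": {"ai_itself", "company", "government", "user",
                       "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue",
                  "contractualist", "unclear"},
    "policy": {"ban", "regulate", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "indifference",
                "resignation", "mixed"},
}

def validate(record: dict) -> list[str]:
    """Return a list of problems; an empty list means the record is valid."""
    problems = []
    for dim, allowed in ALLOWED.items():
        value = record.get(dim)
        if value not in allowed:
            problems.append(f"{dim}: unexpected value {value!r}")
    return problems

record = {"responsibility": "ai_itself", "reasoning": "consequentialist",
          "policy": "unclear", "emotion": "fear"}
print(validate(record))  # → []
```

A check like this is useful because LLM output is not guaranteed to stay inside the codebook; any off-vocabulary value is surfaced instead of silently stored.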
Raw LLM Response
[
{"id":"ytr_UgxFK-OtQoigv0AeqSF4AaABAg.AJQjQlNFrqVAJRirQfUvrF","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytr_Ugw6-ctKmxFl2vgITzN4AaABAg.AJQjAZcyrhLAKIpN-Rlq8A","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytr_Ugw3rLDRD9DTpAj0nh14AaABAg.AJQioU145fnAJQnd6K7EJJ","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytr_Ugw3rLDRD9DTpAj0nh14AaABAg.AJQioU145fnAJQo9P8zCkn","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"resignation"},
{"id":"ytr_Ugz4d2xqhVt25rh6WP54AaABAg.AJQhTmQcVuBAJQqe073iYR","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytr_Ugz4d2xqhVt25rh6WP54AaABAg.AJQhTmQcVuBAJRXwTw03JZ","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytr_UgzAKBuGZNREMHEBqaB4AaABAg.AJQhKNb57itAJQnwo_j_D_","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytr_UgzAKBuGZNREMHEBqaB4AaABAg.AJQhKNb57itAJQodgDg30a","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytr_UgzAKBuGZNREMHEBqaB4AaABAg.AJQhKNb57itAJQx4lBRaTz","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytr_UgxfGhMYlUZfPvzH49J4AaABAg.AJQgLVO76cIAJTbsHxh-vY","responsibility":"government","reasoning":"consequentialist","policy":"ban","emotion":"fear"}
]
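The raw response is a JSON array with one object per comment, so looking up a coded comment by its ID reduces to parsing the array and indexing it. A minimal sketch, using two rows copied from the response above (the field names come from that response; the indexing approach is an illustration, not the tool's actual implementation):

```python
import json

# Two rows copied verbatim from the raw LLM response shown above.
raw = '''[
  {"id":"ytr_UgxFK-OtQoigv0AeqSF4AaABAg.AJQjQlNFrqVAJRirQfUvrF","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytr_UgxfGhMYlUZfPvzH49J4AaABAg.AJQgLVO76cIAJTbsHxh-vY","responsibility":"government","reasoning":"consequentialist","policy":"ban","emotion":"fear"}
]'''

# Index the batch by comment ID so a single coded comment can be inspected.
by_id = {row["id"]: row for row in json.loads(raw)}

code = by_id["ytr_UgxFK-OtQoigv0AeqSF4AaABAg.AJQjQlNFrqVAJRirQfUvrF"]
print(code["emotion"])  # → fear
```

Because every row carries its own `id`, the same pattern works for a batch of any size, and a missing ID surfaces immediately as a `KeyError` rather than a silently wrong match.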