Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
No! Not my ai chats! That is seriously good stuff to use when blackmailing me! I…
ytc_Ugywx3IWz…
Using ai for shit posting is whatever, but actually trying to sell it and replac…
ytr_UgyNUs1Zr…
I don't necessarily have an issue with ppl using ai art. Im not going to yell, …
ytr_UgyeJPR_W…
Im tired of people blaming Ai because to me, what AI does is not art. It will ne…
ytc_Ugxmah0Zh…
Yeah we should just throw away human and do everything robot and robot live inst…
ytr_Ugy9U7hhe…
AI is changing the world because AI is changing the world, I don’t know if you k…
ytc_UgzGW2nZl…
Please don’t take away 4o. I don’t want another model that mimics 4o. I want the…
rdc_njhf9ah
I get where you’re coming from! The idea of AI evolving can be a bit unsettling.…
ytr_Ugz_i0EUP…
Comment
About ASI: we have it in our hands!
I mean, maybe I'm too rational about this, but let's think about what an ASI would need to 'delete' us:
- full control over nucular (Yes I know the word is wrong. You can thank YouTube comment filter) arsenal: as far as I know they are physically parted from the internet
- full control over a biological/chemical research facility AND a way to transport/distribute a substance
- full control over a robot factory: well yes, much more likely BUT we can still turn of the electricity. No? :)
Even if an ASI gets full control: maybe it can produce 100 androids, without weapons.
Our current androids are ... bad, stupid and do not last long.
So an ASI would need full control over (undetected):
- multiple factories for all kind of mechanical parts, electronics, battery stuff
- multiple power sources
- transport and communication
- probably even mines and foundries
Yes there are many stupid people out there but holy moly we would need to do many stupid things so ASI can destroy us.
We will destroy us long before.
At first ASI is dependent on us and we should keep it that way.
youtube
AI Governance
2025-08-26T16:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytr_UgxhQ46PziHBHmXw99t4AaABAg.AMIMxqpcIu5AMUWV9S_zXl","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytr_UgxHg8Rac2HNn6UBFX54AaABAg.AMILX2Q8hbwAMIS4v3ylPA","responsibility":"user","reasoning":"mixed","policy":"unclear","emotion":"outrage"},
{"id":"ytr_UgysbT0gJvfKrpCcL9l4AaABAg.AMIKoXM_zADAMIL4JeRYQq","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytr_UgysbT0gJvfKrpCcL9l4AaABAg.AMIKoXM_zADAMIL5IsBnwp","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytr_UgysbT0gJvfKrpCcL9l4AaABAg.AMIKoXM_zADAMILKFA78by","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytr_UgyBotIyfY6pyui3fTB4AaABAg.AMIKWvFE7yZAMIL5iaz1o5","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytr_UgyCtG8mlxaQ0FozzKl4AaABAg.AMIKIP9FyRAAMIPpqrbwwi","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytr_UgzPMHmrsuxd7n_ZoNZ4AaABAg.AMIJqXnbKyGAMLciX6sJRX","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytr_Ugx1ftnlzws5Z9HJAIR4AaABAg.AMIJhZZ-wApAMIPso86qOE","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytr_Ugx1ftnlzws5Z9HJAIR4AaABAg.AMIJhZZ-wApAMISsczrKm6","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"}
]
```
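The "look up by comment ID" step above can be sketched as follows: parse the raw model output (a JSON array with one object per comment, carrying the four coding dimensions) and index it by `id`. This is a minimal sketch, not the tool's actual implementation; the IDs are taken from the sample response above, and the helper name `index_by_id` is hypothetical.

```python
import json

# Raw LLM response: a JSON array of coded comments (two entries from the
# sample above, shortened to keep the sketch compact).
raw_response = """[
  {"id": "ytr_UgxhQ46PziHBHmXw99t4AaABAg.AMIMxqpcIu5AMUWV9S_zXl",
   "responsibility": "none", "reasoning": "unclear",
   "policy": "unclear", "emotion": "indifference"},
  {"id": "ytr_Ugx1ftnlzws5Z9HJAIR4AaABAg.AMIJhZZ-wApAMISsczrKm6",
   "responsibility": "company", "reasoning": "deontological",
   "policy": "regulate", "emotion": "outrage"}
]"""

def index_by_id(raw: str) -> dict:
    """Parse the model output and build an id -> coding lookup table."""
    return {record["id"]: record for record in json.loads(raw)}

codings = index_by_id(raw_response)
coding = codings["ytr_Ugx1ftnlzws5Z9HJAIR4AaABAg.AMIJhZZ-wApAMISsczrKm6"]
print(coding["policy"], coding["emotion"])  # regulate outrage
```

Indexing once up front makes each subsequent ID lookup O(1), which matters when inspecting many comments against a large coding run.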