Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "honestly like why are they trying to even make ai art/animations?? why dont they…" (ytc_Ugy8JzE3v…)
- "Artificial intelligence can be a danger depending on the point of view it has …" (ytc_UgxMhmOhl…)
- "If you took a photo but did not select aperture, shutter time, focus or colour b…" (ytc_Ugylq5oyx…)
- "They have acquired free thinking which in turn has created a new problem for hum…" (ytr_UgxbIuMgt…)
- "Because of the large amount of baby boomers, smaller amount of young people, and…" (ytr_UgyTHcNUu…)
- "Not IF, but WHEN AI tips the ole' 51% in control of the manipulation of the popu…" (ytc_Ugz7XnF6n…)
- "The fact that you are going to be nestled out soon by ai too 😅😢…" (ytc_UgzzUE_3y…)
- "The worst thing about chatGPT is that there's not credit (or context) afforded t…" (ytr_UgzaGsR3Y…)
Comment
About ASI: we have it in our hands!
I mean, maybe I'm too rational about this, but let's think about what an ASI would need to 'delete' us:
- full control over the nuclear arsenal: as far as I know, those systems are physically separated from the internet
- full control over a biological/chemical research facility AND a way to transport/distribute a substance
- full control over a robot factory: well yes, much more likely, BUT we can still turn off the electricity. No? :)
Even if an ASI gets full control: maybe it can produce 100 androids, without weapons.
Our current androids are ... bad, stupid, and do not last long.
So an ASI would need full, undetected control over:
- multiple factories for all kinds of mechanical parts, electronics, and batteries
- multiple power sources
- transport and communication
- probably even mines and foundries
Yes, there are many stupid people out there, but holy moly, we would need to do many stupid things for an ASI to be able to destroy us.
We will destroy ourselves long before that.
At first, an ASI is dependent on us, and we should keep it that way.
youtube
AI Governance
2025-08-26T16:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytr_UgxhQ46PziHBHmXw99t4AaABAg.AMIMxqpcIu5AMUWV9S_zXl","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytr_UgxHg8Rac2HNn6UBFX54AaABAg.AMILX2Q8hbwAMIS4v3ylPA","responsibility":"user","reasoning":"mixed","policy":"unclear","emotion":"outrage"},
{"id":"ytr_UgysbT0gJvfKrpCcL9l4AaABAg.AMIKoXM_zADAMIL4JeRYQq","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytr_UgysbT0gJvfKrpCcL9l4AaABAg.AMIKoXM_zADAMIL5IsBnwp","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytr_UgysbT0gJvfKrpCcL9l4AaABAg.AMIKoXM_zADAMILKFA78by","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytr_UgyBotIyfY6pyui3fTB4AaABAg.AMIKWvFE7yZAMIL5iaz1o5","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytr_UgyCtG8mlxaQ0FozzKl4AaABAg.AMIKIP9FyRAAMIPpqrbwwi","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytr_UgzPMHmrsuxd7n_ZoNZ4AaABAg.AMIJqXnbKyGAMLciX6sJRX","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytr_Ugx1ftnlzws5Z9HJAIR4AaABAg.AMIJhZZ-wApAMIPso86qOE","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytr_Ugx1ftnlzws5Z9HJAIR4AaABAg.AMIJhZZ-wApAMISsczrKm6","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"}
]
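A raw response like the one above can be turned back into per-comment codes by parsing the JSON array and keeping only records whose dimensions take expected values. The sketch below assumes the allowed value sets inferred from the visible codes (the real codebook may define more categories), and the `parse_coding_response` helper name is illustrative, not part of the tool:

```python
import json

# Allowed values per coding dimension. These sets are an assumption
# inferred from the codes visible in the table and JSON above; the
# actual codebook may include additional categories.
ALLOWED = {
    "responsibility": {"none", "user", "company", "ai_itself", "unclear"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"regulate", "none", "unclear"},
    "emotion": {"indifference", "outrage", "approval", "mixed", "unclear"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM coding response into {comment_id: codes},
    dropping any record with an out-of-vocabulary dimension value."""
    records = json.loads(raw)
    coded = {}
    for rec in records:
        codes = {k: v for k, v in rec.items() if k != "id"}
        if all(codes.get(dim) in vals for dim, vals in ALLOWED.items()):
            coded[rec["id"]] = codes
    return coded

raw = ('[{"id":"ytr_example","responsibility":"none",'
       '"reasoning":"consequentialist","policy":"unclear",'
       '"emotion":"indifference"}]')
print(parse_coding_response(raw))
```

Validating against a fixed vocabulary also makes it easy to spot and re-prompt for records where the model invented a label outside the codebook.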