Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
| Comment ID | Snippet |
|---|---|
| `ytc_Ugzf7Jvuq…` | art was always primarily about expression and what you are trying to say with th… |
| `ytc_UgxdXf3K_…` | I dont think reference to how LLMs work by itself is an argument against against… |
| `ytc_Ugwck4jiJ…` | In AI, we are creating an alien life form. An intelligence that we cannot truly … |
| `ytc_UgxYVP3u8…` | Here's the thing: mass people getting unemployed, who will buy those products/se… |
| `rdc_ckqccrg` | Hahaha right, Liberia is a failed state, there's no effective government. Whoeve… |
| `ytc_Ugygprcxu…` | I wouldn't call the incorrect response to the question about the number of count… |
| `ytc_UgwaN_Jsx…` | none of this makes any sense to me, THERE IS NO WAY TO POWER ALL THIS AUTOMATION… |
| `ytc_Ugy0qNzmB…` | To be fair a lot of abstract art requires just as little effort as AI art… |
Comment
@wombat77-m8t To be completely honest with you, I have no idea! I believe that AGI and superintelligence is 100% possible, and we are probably nearing some form of AGI in the future. But what I don't understand is the amount of power required to actually run. I live in Australia, so it's a completely difference kettle of fish over here as we don't have a hat in the ring (as far as I'm aware). But seriously, where the F is the US or China going to draw their power from? Surely this is a bottleneck, or are they going to dramatically improve their computing power. I don't think this is stop AGI entirely, but surely slow down progress. Then again, is it as simple as he who controls energy controls AI. Also, where is our point of no return? I feel like it's when we can no longer predict or ditact what the AI can do. It really baffles me and feel like I should be wearing a tin foil hat lol. I've been looking at buying land and having a hobby farm (homestead) for the last 6 years or so. Maybe it's time to bite the bullet.
youtube · AI Governance · 2026-01-05T09:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytr_Ugycs0vlkpvp4yWI4RV4AaABAg.ASCR55K2qg0ASCRs1EcdJV","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytr_UgwngieKyaXQU5YzRp54AaABAg.ARiRVxdHZh3ARiZUNAoaSD","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_UgzyGv3l2lFhn7G7YAl4AaABAg.ARd_p7lnKiiARdactt_O_-","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytr_Ugxtns9pqYJxYVGuRGV4AaABAg.ARXYU_dbiKZARaOl23UMLS","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytr_Ugxtns9pqYJxYVGuRGV4AaABAg.ARXYU_dbiKZARaOxCjUl1H","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"mixed"},
{"id":"ytr_UgwNVgG1hBwL1SPtrvl4AaABAg.ARW04EPhtPgARWTRIuVIqr","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"amusement"},
{"id":"ytr_Ugzh14VAdLGuCxUAy3V4AaABAg.ARMRQr3FwzeARz-BntzfYw","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytr_Ugzu99n6-zCdId7dEUR4AaABAg.ARJXe3DDEveARNcqWBlG_J","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytr_Ugzu99n6-zCdId7dEUR4AaABAg.ARJXe3DDEveARj8YZsunxk","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_UgxyXh8wXgWPgHo7vuF4AaABAg.AR9_poZFlC5ARBGjbqrwak","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"}
]
```
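Each row in the raw response codes one comment on four dimensions (responsibility, reasoning, policy, emotion). A minimal validation sketch for such output, assuming allowed value sets inferred only from the values visible in this sample (the real codebook may define more categories) and a hypothetical `validate_codings` helper:

```python
import json

# Assumed allowed values per dimension, inferred from the sample output
# above; the actual codebook may include additional categories.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "company", "developer"},
    "reasoning": {"unclear", "mixed", "consequentialist", "deontological", "virtue"},
    "policy": {"none"},
    "emotion": {"indifference", "approval", "mixed", "amusement", "fear", "outrage"},
}

def validate_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only rows whose dimension
    values are all in the allowed sets; report the rest."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        bad = [dim for dim, ok in ALLOWED.items() if row.get(dim) not in ok]
        if bad:
            print(f"{row.get('id', '?')}: invalid value for {', '.join(bad)}")
        else:
            valid.append(row)
    return valid
```

A check like this catches the common failure mode of LLM coders drifting outside the codebook (e.g. inventing a new emotion label), before the rows are written to the results table.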