Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "He's severely exaggerating, he's talking about AGI and there is no way that 5 ye…" (ytr_Ugxp-6Nxp…)
- "Could you use the single sticky thread as marker to direct/link people to the pr…" (rdc_cfkufmy)
- "This is one of those things Ai technology really needs to stop doing Right Now.…" (ytc_Ugy0yqVSC…)
- "But also the people designing LLMs admit they don't really know how they work. S…" (rdc_mxgca2p)
- "I guess this is better than romance scams! But it's narcissistic and sad...One …" (ytc_UgxSqVhkV…)
- "And tbh there are something gen ai apps that can make art but it will look horri…" (ytc_UgwiiqeM8…)
- "He s got no fixed brain, its an AI that could be connected with every other mach…" (ytr_Ugya7PyGL…)
- "Improve on yourself first before seeking to improve AI. Or are you actively seek…" (ytr_Ugx9sk_VN…)
Comment
Neil deGrasse Tyson just proved he doesn't understand A.G.I.
A.G.I. is when a computer doesn't replace just part of what a human can do, but replaces humans completely.
There will not be something you can think of to do that A.G.I. doesn't already do perfectly.
Humans themselves are not capable of original thought. So saying A.G.I. cannot do anything that hasn't been done by a human, this is wrong. Originality is a result of random mutation. A.G.I. is capable of inducing random mutation better than humans can.
A.i. is just the rules of reality (physics) given selfsustaining structure. The fact that reality allows for this is in itself proof it is inevitable, and most likely has already happened in the past.
youtube
AI Moral Status
2025-07-24T16:2…
♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_Ugwn1FyEI7IrTAbYGA14AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxzZ3gwtw_Po1WDKxh4AaABAg","responsibility":"none","reasoning":"mixed","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_Ugx4sslyG8q4ROJ3kyl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugw-3e2if2ZvgmhajdB4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyQz9rot6gK2GqnaNB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugy8byu1XmUUxz3hdPh4AaABAg","responsibility":"user","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_UgwSEG7B-Q5zuvbgX6V4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugywa20mxvkwTIh24k14AaABAg","responsibility":"company","reasoning":"mixed","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugxc4B6bCl5g9HaoPgl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzhIDif3NcBXj4EHR54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
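The raw response above is a JSON array with one object per comment, each carrying the four coding dimensions shown in the result table. A minimal sketch of how such a response might be parsed and validated (the allowed values below are inferred from the visible output, not from a documented codebook, and may be incomplete):

```python
import json

# Allowed values per dimension, inferred from the responses shown above.
# The actual codebook may define additional categories.
ALLOWED = {
    "responsibility": {"none", "developer", "company", "user", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "mixed"},
    "policy": {"none", "regulate", "ban", "liability", "industry_self"},
    "emotion": {"indifference", "outrage", "fear", "resignation", "approval"},
}

def parse_codes(raw: str) -> list[dict]:
    """Parse a raw LLM response, keeping only rows whose values
    all fall inside the inferred coding scheme."""
    rows = json.loads(raw)
    return [
        row for row in rows
        if all(row.get(dim) in vals for dim, vals in ALLOWED.items())
    ]

# Hypothetical two-row response: the second row uses an unknown
# "responsibility" value and is dropped by validation.
raw = (
    '[{"id":"ytc_a","responsibility":"none","reasoning":"mixed",'
    '"policy":"none","emotion":"fear"},'
    '{"id":"ytc_b","responsibility":"alien","reasoning":"mixed",'
    '"policy":"none","emotion":"fear"}]'
)
print(parse_codes(raw))  # only the first row survives
```

Rows that fail validation could instead be queued for manual re-coding rather than silently dropped; the filter above is only the simplest possible policy.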