Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "people who dont understand art or dont have actual human emotions (psychopaths a…" (ytc_UgzgMU3UF…)
- "I think one of the contributing factors to this uptick in AI usage is (aside fro…" (ytc_UgxZSOIkH…)
- "thats true, people know that AI will destroy humanity yet i dont understand why …" (ytr_UgyfG1LjQ…)
- "Ai should be banned we've got by for so long with out it it's going to cause mor…" (ytc_UgypLYjZF…)
- "Ai becoming concious is such a stupid thing to worry about wtf. The rest is vali…" (ytc_Ugy961k_A…)
- "I have never emulated someone's specific art style with AI and I never intend to…" (ytc_UgwsS64fG…)
- "If conciousness is possible in a computer then its also possible in any turing m…" (ytc_UgwJb15DO…)
- "They should read Daniel O’Conner’s boo on AI and aliens. But most people don’t …" (ytc_UgzXBfqlf…)
Comment
> Dumb fearmongering video. LLM will never be AGI. It cant be, because it require predecessor model to do 1 single specific task, So that alone mean, its not Inteligence, and its not general.
> For AGI to happend, they need to build up Bottom Up AI model instead of training LLM model. This AI dumbness gonna be another Great bubble in last decade.
> If mf cant even differentiate between LLM and AGI, they will never reach AGI.
> IT WILL FAIL, those stupid boomer doesnt even understand how LLM works. they think its magick.
> The problem however, Those stinky dirt boomer doesnt care how much destruction it cause while trying to reach it. It will fail, but we have to pay the price they stupid fk causes.
Source: youtube · Video: Viral AI Reaction · Posted: 2025-11-23T08:3… · ♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
  {"id":"ytc_UgzMSewwHqZ0qi0t-6J4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugz_DaYAdDO4qkCzD-d4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyOPXifqIlOC4uN8mF4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyUvpcuqLugnFxHPAx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxcCbPwl8JFdhLLTFx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzzpZB44zE5yWAe6Ax4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugzm56LuWtZRilwT-D14AaABAg","responsibility":"company","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgxnLa6uwEf0cVi7hpx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugw2zGcMYwxGVd0WPEZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_Ugzx5zgFhhhKhv8SInV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
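A response like the one above can be turned into the comment-ID lookup the tool uses with a short sketch. This is a minimal illustration, not the tool's actual code: the field names (`responsibility`, `reasoning`, `policy`, `emotion`) come from the JSON shown, but `index_by_id` is a hypothetical helper and the two-record sample below is an excerpt, not the full array.

```python
import json

# Hypothetical two-record excerpt of a raw LLM response (field names and
# values taken from the JSON array shown above).
raw = """[
  {"id": "ytc_UgzMSewwHqZ0qi0t-6J4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugzm56LuWtZRilwT-D14AaABAg", "responsibility": "company",
   "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"}
]"""

# The four coding dimensions shown in the "Coding Result" table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_by_id(raw_json: str) -> dict:
    """Parse the raw response and build a comment-ID -> coding lookup,
    skipping any record that is missing one of the dimensions."""
    records = json.loads(raw_json)
    return {
        r["id"]: {d: r[d] for d in DIMENSIONS}
        for r in records
        if all(d in r for d in DIMENSIONS)
    }

codings = index_by_id(raw)
print(codings["ytc_UgzMSewwHqZ0qi0t-6J4AaABAg"]["emotion"])  # outrage
```

With an index like this, "Look up by comment ID" is a single dictionary access, and malformed records are dropped rather than crashing the page.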