Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
It takes less than half a brain to drive a car. We have been hearing about driverless cars for a decade now and they are still no where near ready to rolled out under any road conditions.
Something as basic as driving can’t be learnt by a set of code you really expect AGI or a chat bot to replace knowledge workers or end the world? It’s total gaslighting.
It’s a way to get more funding, it’s a way to justify the ridiculous spend.
Even the latest chat bot can’t answer some simple questions properly. AGI is a very very far!
youtube
2026-02-15T18:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response

```json
[
  {"id":"ytc_UgwyRvPEw2SbPojttx94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugzyh_4iEd7TydqUL6l4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwuHc8LJqCDEDeDNK54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwhJF-XhR6ojpwHTL14AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxelKkSpX1xw5Cdlmt4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugy_WBWTRJeLl3KxLP54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzgAKJtZJwYHVjKVTl4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugx5WLouwDAiAj8789F4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgzIl4_KmSn6WdGV5A14AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyJG3RwvyuvHytdxNF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"}
]
```
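A response like the one above has to be parsed and sanity-checked before the codes land in the Coding Result table. The sketch below is a minimal, hypothetical validator: the allowed values for each dimension are inferred only from the examples shown on this page (the real code book may contain more categories), and the function names are illustrative, not part of the actual pipeline.

```python
import json

# Dimension vocabularies assumed from the coded examples above;
# the real code book may define additional categories.
ALLOWED = {
    "responsibility": {"company", "government", "user", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"none", "liability"},
    "emotion": {"outrage", "fear", "approval", "indifference"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed records.

    A record survives if it is a dict with an "id" field and every
    coding dimension holds a value from the assumed vocabulary.
    """
    records = json.loads(raw)
    valid = []
    for rec in records:
        if not isinstance(rec, dict) or "id" not in rec:
            continue
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(rec)
    return valid

# Example: one record copied from the raw response above.
raw = ('[{"id":"ytc_UgwuHc8LJqCDEDeDNK54AaABAg",'
       '"responsibility":"company","reasoning":"consequentialist",'
       '"policy":"none","emotion":"outrage"}]')
print(len(validate_batch(raw)))  # → 1
```

Records that fail validation are dropped rather than repaired here; a production pipeline would more likely log them and re-prompt the model for the affected comment IDs.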