Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a comment by its ID, or start from one of the random samples below. (A minimal offline lookup is sketched after the list.)
- "I don't know any AI prompter bro, but as somebody who models on Blockbench as a …" (ytc_Ugxjpr_Bm…)
- "Yes, AI is a tool, but a tool like lead pipes were in roman times. Just wait a l…" (ytc_UgxjUkf5m…)
- "The striking advantage of a cybernetic organism—or an AI—is simple: it knows wha…" (ytc_UgzWu_-Ap…)
- "Why do people hate ai so much. The reason ai art was even fucking invented was…" (ytc_UgzeeEdpT…)
- "Oh i found some interesting key factors quite similar with Facebooks Meta ai. S…" (ytc_UgxAZBhdi…)
- "Typically all copyrighted material is allowed for fair use as in - personal lear…" (ytc_UgwQlkZcQ…)
- "1:11 no...humans can't keep pattern recognition in similiar ways ai can...bcz it…" (ytc_UgwI9JOFU…)
- "He said one thing that I can't stop thinking about. He said he never knew anyone…" (ytc_UgyUAZPnK…)
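For reference, the same lookup can be reproduced offline against stored model output. The sketch below is a minimal Python example; the file name `raw_llm_responses.jsonl` and the one-batch-per-line layout are assumptions for illustration, not necessarily the tool's actual storage.

```python
import json

def find_coding(comment_id: str, path: str = "raw_llm_responses.jsonl") -> dict | None:
    """Return the coding record for one comment, scanning batched responses."""
    with open(path, encoding="utf-8") as f:
        for line in f:
            line = line.strip()
            if not line:
                continue
            # Assumption: each non-empty line holds one batch, i.e. a JSON
            # array of per-comment coding objects like the one shown below.
            for record in json.loads(line):
                if record.get("id") == comment_id:
                    return record
    return None
```

Under those assumptions, `find_coding("ytc_UgwdtUqXZfVupfjeLWp4AaABAg")` would return the first record of the batch shown under "Raw LLM Response" below.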
Comment
LOL no.
I would love to see a debate between Harari and Jaron Lanier. Lanier is close to the industry and sees right through the AI hype, which Harari seems to have been drinking by the bucket. Existential tech doomerism such as Harari's has been cooked up by the very CEOs and programmers who claim their technology is all-powerful and godlike. Neither of those positions is true. Highly recommend "There is no AI", Lanier's article on this subject in the New Yorker. Or "The AI Con" by Hannah and Bender. Or "AI Snake Oil" by Narayanan and Kapoor. All of them make the point that the doomerism is as much BS as the marketing hype itself.
Platform: youtube
Video: Viral AI Reaction
Posted: 2025-06-26T18:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[{"id":"ytc_UgwdtUqXZfVupfjeLWp4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugx4XZUIb0HrcJgaJy14AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyPlUfprIYW5GAE9vF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzcN1zC7-Q2Suuomo94AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugylrs0hmPJ9P2Oi6MJ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzVYB3amwO2GduVlpp4AaABAg","responsibility":"company","reasoning":"mixed","policy":"industry_self","emotion":"mixed"},
{"id":"ytc_UgzDfNy5chq5_OdBIsV4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwbFTev0nBz88yilpt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugwm96qrvUhZg-uHegN4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"resignation"},
{"id":"ytc_Ugyj4m-bpFNRBsTtRk94AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"mixed"})