Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- Sorry, but this is just another step in automation of our societies. I didn't se… (ytr_Ugxxzqeey…)
- Another AI doomsday prophet. If all jobs are gone who is going to pay for stuff?… (ytc_UgyNBub1m…)
- I’m somewhat working on writing a book. AI has helped me come up with character … (rdc_lz5oxg1)
- @Kenmaaaaaaaa Why tf would ChatGPT give a picture of his face as the prompt that… (ytr_UgydhzGwT…)
- As an academic researcher with papers and books published I can fully relate to … (ytc_UgySLb64C…)
- Hard no. Not because it's not a great idea, but because here is the sequence of … (ytc_UgzhsQtRC…)
- I think, until we swap to photonic CPUs, AI will be too vastly limited by energy… (ytc_UgyrvXPfB…)
- It's kinda really disturbing the amount of folks defending AI and the data scrap… (ytc_UgwlAwq6Q…)
Comment
The timing of this is wild - AI warnings from the godfather himself.
Makes me reflect on how we approach AI integration in no-code education. Like, we're teaching people to build AI chatbots, recommendation engines, all sorts of intelligent features in Bubble without them needing to understand the underlying complexity.
There's something both exciting and sobering about that. We're lowering barriers at Planet No Code, but Hinton's perspective reminds us these aren't just tools - they're potentially transformative forces.
Do you think democratizing AI through no-code amplifies both the opportunities AND the risks he's talking about?
youtube · AI Governance · 2025-06-25T13:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyS0koOZUd_55Ai5FV4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugx_DddIL2ntgRW0Iop4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyYlw6uv0cuqsYjE1l4AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgwbGBmtVR7Qs93HGE54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugwc8Bkb0zsA1NXd70t4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxbEr3e7kTpASrUUjx4AaABAg","responsibility":"user","reasoning":"deontological","policy":"industry_self","emotion":"mixed"},
  {"id":"ytc_UgyB7frX0BffdJkBXeF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwMW-qDQnMMxPu84xZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugzh1JPrQLUi0M4EtLZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzCKKCtxhVAR0TwuUB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"}
]
```
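A minimal sketch of how a raw response like the one above could be parsed and sanity-checked before the codes are stored. The dimension names come from the coding table; the allowed value sets are inferred only from the values visible on this page, not from the actual code book, so treat them as an assumption:

```python
import json

# Allowed values per dimension, inferred from the responses shown above
# (assumption: the real code book may define additional categories).
ALLOWED = {
    "responsibility": {"none", "company", "user", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"none", "liability", "regulate", "industry_self"},
    "emotion": {"mixed", "fear", "outrage", "indifference", "approval"},
}


def parse_coding(raw: str) -> list[dict]:
    """Parse a raw LLM coding response and validate every record."""
    records = json.loads(raw)
    for rec in records:
        if "id" not in rec:
            raise ValueError(f"record missing comment id: {rec!r}")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim}={rec.get(dim)!r}")
    return records


# Hypothetical single-record example in the same shape as the raw response.
raw = ('[{"id":"ytc_example","responsibility":"none",'
       '"reasoning":"unclear","policy":"none","emotion":"mixed"}]')
print(parse_coding(raw)[0]["emotion"])  # mixed
```

Rejecting out-of-vocabulary values here is deliberate: a model that drifts from the code book is easier to catch at ingest time than after the codes have been aggregated.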