Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
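The lookup described above presumably keys each coded record by its comment ID. A minimal sketch of that idea (the `build_index` helper and the toy records are illustrative, not the tool's actual code):

```python
def build_index(records):
    """Map each coded record to its comment ID for O(1) lookup."""
    return {row["id"]: row for row in records}

# Toy records in the shape of the raw LLM response shown further down.
records = [
    {"id": "ytc_abc123", "responsibility": "none", "emotion": "fear"},
    {"id": "ytr_def456", "responsibility": "company", "emotion": "outrage"},
]
index = build_index(records)
print(index["ytr_def456"]["emotion"])  # → outrage
```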
Random samples

- "IMO we need to create good AI that can fight bad AI. All forms of AI are progra…" (ytc_Ugx3xUlqi…)
- "temporary setback. Ai only needs us until it can manipulate the physical world. …" (ytr_Ugwat4WGF…)
- "i use it only to debug my code. when i just cant figure out i paste it in AI and…" (ytc_UgxkE3lCl…)
- "@Nasoko-q7d not at the level of what’s happening in the visual/graphic art domai…" (ytr_UgxYf7ZHw…)
- "Most people here is afraid of AI robots because they watch so much robotics movi…" (ytc_Ugz7YVE7d…)
- "The first and foremost job that the Aİ should replace is a job of politicians.…" (ytc_UgzQ6jnq6…)
- "\"Some companies do excel because their revenue jumped.\" Yes. Because they're sel…" (ytc_Ugy7T20vN…)
- "Being good at drawing is the same as being good at reading, or writing, or rock …" (ytc_Ugz8JhEa1…)
Comment

> Thanks for your work Liron! And thanks to your guest for adressing the issue about false «techno optimist» solutions to Climate and Eco problems. You, Leron, have a to narrow understanding about that issue. I agree with you that AI is the world ending risk situation with the shortest time horizon, but the climate and Eco problem are just as important. And the short explanation of why technofix wont do it, is Energy. I think you are energyblind(to use a phrase from Nate Hagens)

Source: youtube · AI Governance · 2026-02-25T09:3… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgyguxlSmhlIKh4gZdd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgweDRYen7rHTPUc3lR4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgywEqcCcgDCABDNsJt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwWtSy0N1tjtMhxcZd4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyVfVxsND3Ua3tNcqV4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwrISWJ7hLjSvcP1Zd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwyK5F1g2Q8-W0m5Wl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgzGSkrwJrn7aXPYS454AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzlL3VQZqpQHX0KRBN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugw0R9HqqG275eEcUxt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"}
]
```
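A raw response like the one above can be parsed and sanity-checked before the codes are stored. A minimal sketch, assuming the allowed values per dimension are those visible in this sample (the real codebook may define more; `validate_batch` is illustrative, not the pipeline's actual code):

```python
import json

# Allowed values per coding dimension, inferred from the sample output
# above -- an assumption, since the full codebook is not shown here.
SCHEMA = {
    "responsibility": {"none", "ai_itself", "developer", "company"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none", "regulate", "liability"},
    "emotion": {"outrage", "approval", "fear", "indifference", "mixed"},
}

def validate_batch(raw):
    """Parse one raw LLM response; reject malformed IDs or off-schema codes."""
    rows = json.loads(raw)
    for row in rows:
        if not row.get("id", "").startswith(("ytc_", "ytr_")):
            raise ValueError(f"bad comment id: {row.get('id')!r}")
        for dim, allowed in SCHEMA.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row['id']}: {dim}={row.get(dim)!r} not in codebook")
    return rows

raw = ('[{"id":"ytc_UgyguxlSmhlIKh4gZdd4AaABAg","responsibility":"none",'
       '"reasoning":"consequentialist","policy":"none","emotion":"outrage"}]')
print(len(validate_batch(raw)))  # → 1
```

Rejecting a whole batch on the first bad row keeps a hallucinated code value (or a fabricated comment ID) from silently entering the coded dataset.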