Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "No it can’t. The only way to deal with automation is to socialize the automated …" (ytc_UgzEb-Hf2…)
- "You still have to \"make\" the art with Midjourney...? It's not just sitting there…" (ytc_Ugy36k0We…)
- "I have an idea. What if we have specific roads for self driving cars. And those …" (ytc_Ugzfy1amW…)
- "if they want to properly create a robot `similar to a human they should stop \"mo…" (ytc_Ugzd3Ix1F…)
- "Ai can't harm anybody. Ability to learn is a skill, no need to blame ai for it. …" (ytc_UgxCCo3vZ…)
- "I wonder if AI would be able to take over a tennis coaching or jobs that involve…" (ytc_UgwNfSFyB…)
- "I drive a 44T articulated lorry in the UK which has notoriously terrible roads a…" (ytc_UgwFeYT3W…)
- "The AI companies like OpenAI and Anthropic are trying to encourage more people t…" (ytc_UgzYvr541…)
Comment
"AI" (rather, the GPUs that power various applications) are immensely useful and important. AI slop and other poor use cases are being supplied at a loss by companies because those companies want to capture the user base and profit off them in the future. The use cases that generate value now are things like biomedical and other types of research. But you don't see as much coverage about those things because the AI slop is what's in everybody's face. The point is that AI is very useful, groundbreaking tech that is largely the result of hardware advances combined with software breakthroughs that happened 10+ years ago. The real breakthrough is that the chips powerful enough to enable all these new use cases for ML and accelerated computing.
youtube · AI Governance · 2025-12-29T03:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | industry_self |
| Emotion | approval |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytr_UgycytbZiBjm7woe6Pl4AaABAg.ARIwPc8d_4LAROp2S-qDF3","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytr_UgxC3iaKSs2hvm8WpcF4AaABAg.ARIwP3fFuQ3ARJSS2HZmsG","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytr_UgygaN88WclkdnjCyTp4AaABAg.ARIq3pppu50ARJMgikCGIA","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytr_UgygaN88WclkdnjCyTp4AaABAg.ARIq3pppu50ARKLccNYFa2","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytr_Ugyf37ybCK4CJfK2KAR4AaABAg.ARIpFYljpT6ARJVwUyTdsU","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytr_Ugwvo-JZUmhDodj-Wlx4AaABAg.ARIp39u-zzFARJHttowfNg","responsibility":"government","reasoning":"contractualist","policy":"regulate","emotion":"outrage"},
{"id":"ytr_Ugw5DlPNMj6eLisIPEZ4AaABAg.ARIoEfLU1BnARIoxtzFIzr","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytr_UgzcCGaQ2gavKY6nGAJ4AaABAg.ARInZI-NqQuARJNF1bUkj6","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytr_Ugz-m24EclzeNYmfJL14AaABAg.ARIlMWlDl2hARIpdhi9PFI","responsibility":"distributed","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytr_Ugz-m24EclzeNYmfJL14AaABAg.ARIlMWlDl2hARJa__8tmQc","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"mixed"}
]
```
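A raw response like the one above can be parsed and indexed by comment ID so that any coded record can be looked up and checked for completeness. This is a minimal sketch, not the tool's actual implementation: the field names come from the response shown above, but `RAW_RESPONSE` uses hypothetical IDs and the `parse_codings` helper is an assumption for illustration.

```python
import json

# Hypothetical stand-in for a raw LLM response: a JSON array of coded
# records with the same fields as the output shown above.
RAW_RESPONSE = """[
  {"id": "ytr_example1", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "industry_self",
   "emotion": "approval"},
  {"id": "ytr_example2", "responsibility": "government",
   "reasoning": "contractualist", "policy": "regulate",
   "emotion": "outrage"}
]"""

# Every record in the responses above carries exactly these keys.
REQUIRED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def parse_codings(raw: str) -> dict:
    """Parse a raw model response and index the records by comment ID,
    rejecting any record that is missing a coding dimension."""
    by_id = {}
    for rec in json.loads(raw):
        missing = REQUIRED_KEYS - rec.keys()
        if missing:
            raise ValueError(f"record {rec.get('id')} is missing {missing}")
        by_id[rec["id"]] = rec
    return by_id

codings = parse_codings(RAW_RESPONSE)
print(codings["ytr_example1"]["policy"])  # industry_self
```

Indexing by ID mirrors the lookup-by-comment-ID flow above: once parsed, each coded comment's dimensions are one dictionary access away.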