Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
With respect to this conversation (roughly 42 minutes in): AI is not the first t…
ytc_UgzZFEbG6…
You make a great point! Wisdom indeed often comes from experience and the willin…
ytr_UgwZsQ-J3…
The argument of "at least ai art is better than that guy who sold a banana taped…
ytc_UgyY0tl3U…
in the future they will be programmed to love, because unfortunately even today, there doesn't ex…
ytc_UgykuvStr…
like real human clinicians or ai clinicians... seeing as everyone is firing peop…
ytc_UgybG9sSB…
It's important to keep in mind that AI (in this case a recurrent neural network …
ytc_Ugx5DA20V…
The irony of peddling PAID AI SERVICES over community-based volunteer work as “c…
ytc_UgxXUj4Bm…
As someone who works in insurance from a data perspective, yes self driving cars…
rdc_dmpekgg
Comment
There are risks, but not as big as Yampolskiy is making out here. We are not going to have "super intelligence", not without mainstream quantum compute. Today's LLMs and AI models are just that, models (vector databases). The biggest risk right now is mass unemployment.
youtube
AI Governance
2025-09-04T11:5…
♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgzA-WGjfSr91UFqrPV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugx612rMbXpxAD4t3jl4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwXB8MMw5x9ZcvGTLV4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugw0P4LjmT12WDdu6Ex4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgxlOVqUWDeSK6Y1Wj54AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgwBZApQCXk19_qNTm14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwVA4X_dcx354It_FZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugyy2QJmthnYIxIG_vV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgwmL77DulpfVB-WNVl4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwKzy4T2JTpSBlcIBJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}
]
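The raw response above is a JSON array with one record per coded comment, carrying the four dimensions shown in the result table (responsibility, reasoning, policy, emotion) plus the comment ID. A minimal sketch of parsing and validating such a response in Python — the function name and the strict key check are illustrative assumptions, not part of the tool itself:

```python
import json

# The expected fields, per the coding-result table above.
EXPECTED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def parse_codes(payload: str) -> list[dict]:
    """Parse a raw LLM coding response, keeping only well-formed records."""
    records = json.loads(payload)
    valid = []
    for rec in records:
        # Drop anything that is not a dict with exactly the expected keys.
        if isinstance(rec, dict) and set(rec) == EXPECTED_KEYS:
            valid.append(rec)
    return valid

# Two records copied from the raw response shown above.
raw = '''[
 {"id":"ytc_UgzA-WGjfSr91UFqrPV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
 {"id":"ytc_Ugx612rMbXpxAD4t3jl4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]'''

codes = parse_codes(raw)
print(len(codes))           # 2
print(codes[0]["emotion"])  # resignation
```

Validating the key set before use guards against the common failure mode where the model drops or renames a field in one record of an otherwise valid array.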