Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "I think they're desperate to make AI viable and not be a complete money sink. Ev…" (ytc_UgyL7y2IB…)
- "@levacarvalho Yes. Art will become a hobby but the professional work will be don…" (ytr_UgyRWS-aX…)
- "18:21 you can provide an LLM with a kernel that is proto-sapient, categorically …" (ytc_UgxwgxRxL…)
- "These are Pebots because people if we mix it with robot it will be Pebots…" (ytc_UgywUvXLM…)
- "I don’t trust self driving cars yet either, but I trust human drivers even less.…" (rdc_nsz0xob)
- "14:56 let me clear that up, the statement \"AI is inevitable\" is true in literall…" (ytc_UgwBpC02E…)
- "U know ai should be stopped and any talk on media about it it is a sick world we…" (ytc_UgwiTYjRA…)
- "Well they need people to build the robots wait robots building robots controlle…" (ytc_UgwcbbBkw…)
Comment

> I am not a fan of LeCunn's however he is spot on in saying we need to gauge the risk of AI against other existential level risks like asteroids. The risk in "someone could use AI to extinct humanity" is not a risk if AI but of humanity.

youtube · AI Governance · 2023-08-17T10:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgxT0jzYgY0XdOQ4cqh4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"},
{"id":"ytc_UgyEkCQtq92SLKPlPNl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgwLVGaFFl8nCHEepqh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugx52BnGLYa6UxbMX294AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgxAci_nguooo5v0NRB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyILhQ_KsZ-b-C-Lqx4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"mixed"},
{"id":"ytc_UgzDi4kiS-bSe3g-LhN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugx0PFkivatSns4E8xd4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxedCS7pDsuymN4QxF4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"fear"},
{"id":"ytc_UgzprZVcmX1iB91yZPp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"}
]
```
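A raw batch like the one above can be parsed and validated before the codes are stored. The sketch below is a hypothetical helper, not the tool's pipeline; the allowed values for each dimension are inferred from the responses shown here, and the real codebook may include more categories.

```python
import json

# Allowed values per coding dimension, inferred from the sample responses
# above (assumption: the real codebook may define additional categories).
ALLOWED = {
    "responsibility": {"user", "developer", "company", "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue"},
    "policy": {"none", "regulate", "ban", "liability", "industry_self"},
    "emotion": {"indifference", "fear", "outrage", "approval", "mixed", "resignation"},
}

def parse_batch(raw: str) -> list[dict]:
    """Parse one raw LLM response and keep only records that validate."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        if not isinstance(rec, dict) or "id" not in rec:
            continue  # malformed record: no comment ID to attach codes to
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(rec)
    return valid

# Illustrative batch with one valid and one invalid record (made-up IDs).
raw = ('[{"id":"ytc_X","responsibility":"user","reasoning":"virtue",'
      '"policy":"none","emotion":"fear"},'
      '{"id":"ytc_Y","responsibility":"alien","reasoning":"virtue",'
      '"policy":"none","emotion":"fear"}]')
print(parse_batch(raw))  # only the first record passes validation
```

Validating against a closed vocabulary like this catches the most common failure mode of LLM coders: a syntactically valid response that invents an off-codebook label.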