Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
AI art feels so empty. I don’t know why, it just…does. Down to the eyes, the mov…
ytc_UgykIqHeM…
The creation of ai art is just so fucking frustrating. Art has so much history a…
ytc_Ugy4YH5tC…
Isn't there one thing overlooked. We humans are social beings and will always tr…
ytc_UgwK65UEa…
I do not think Ai can wiped off working class, it only automate thinks. Similarl…
ytc_Ugyq-_ERa…
“AI” is not something a Human should ever worry about. Only the gullible are wor…
ytc_UgxLZpQLq…
skynet! there are countless movies, books, games, and comics that have AI taking…
ytc_UgznvC5i9…
God warn us to repent turn from this world government leaders to fix I'll proble…
ytc_UgxGwL_qM…
Those saying he's a loon ... why doesn't google want the AI to take the Turing t…
ytc_UgzvSwlN4…
Comment
Is your intuition that AI labs shouldn't be burned down under any circumstances, or is your intuition that it's simply not the case that AI labs are doing anything approaching the circumstances that would warrant being burned down?
In terms of Liron's Wuhan biolab analogy:
Some people would say that no biolabs should be burned down under any circumstances, and so they believe it's wrong to say "if people knew what was happening in that Wuhan lab, they'd burn it down."
Other people would say that there are indeed circumstances where burning down biolabs is reasonable and justified, but that the biolab in Wuhan wasn't doing anything wrong/dangerous enough to qualify, and so for that reason they believe it's wrong to say "if people knew what was happening in that Wuhan lab, they'd burn it down."
Which camp is driving your intuition?
youtube
AI Governance
2025-05-21T16:3…
♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | deontological |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytr_Ugzgm57jBiilQTuMkmh4AaABAg.AIOPyjPGyoVAIOYR606scY","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytr_Ugzgm57jBiilQTuMkmh4AaABAg.AIOPyjPGyoVAIT3O5xdGNS","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytr_Ugz_aYY_K34HF9TW2fZ4AaABAg.AIONHwx5AaAAITuh6dONpV","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytr_Ugy1c50OJPFZdhhWQPN4AaABAg.AIOJiSQ-cjuAIOd_dTaSAU","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytr_Ugy1c50OJPFZdhhWQPN4AaABAg.AIOJiSQ-cjuAIPgnFYT7Pb","responsibility":"user","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytr_UgxYinNoOGtKsqz9sQx4AaABAg.AIO7TMSpr0tAIOfXttp3Pi","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytr_Ugw5wfFhZmWiGl45Gsh4AaABAg.AIO-dYN-3ZYAKW7q3p90Io","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytr_UgwoGSxcV6Sb5gGI15V4AaABAg.AINpUmdZ8g1AIO-8VPHV1K","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytr_UgwoGSxcV6Sb5gGI15V4AaABAg.AINpUmdZ8g1AIO8ihVGSZd","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytr_UgwoGSxcV6Sb5gGI15V4AaABAg.AINpUmdZ8g1AIOAQipbyzu","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
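The raw response above is a JSON array of coding records, one per comment, with the same four dimensions shown in the Coding Result table. A minimal sketch of the "look up by comment ID" step (assumptions: the response always parses as such an array, and the `index_by_id` helper name is illustrative, not part of the tool):

```python
import json

# Raw LLM response: a JSON array of coding records, as shown above.
# Only the first record is reproduced here for brevity.
raw_response = """
[
  {"id": "ytr_Ugzgm57jBiilQTuMkmh4AaABAg.AIOPyjPGyoVAIOYR606scY",
   "responsibility": "none", "reasoning": "deontological",
   "policy": "none", "emotion": "indifference"}
]
"""

def index_by_id(response_text: str) -> dict:
    """Parse the JSON array and index coding records by comment ID."""
    records = json.loads(response_text)
    return {rec["id"]: rec for rec in records}

codings = index_by_id(raw_response)
rec = codings["ytr_Ugzgm57jBiilQTuMkmh4AaABAg.AIOPyjPGyoVAIOYR606scY"]
print(rec["reasoning"])  # deontological
```

Indexing by ID is what lets the dashboard map each record back to the original comment, e.g. the coded comment above, whose record (`responsibility: none`, `reasoning: deontological`, `policy: none`, `emotion: indifference`) is the first entry in the array.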