Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "He did make up most of it. The Robert McDaniel incident was the only true thing…" (ytr_UgzW09Tro…)
- "A school that prevents homelessness... they well be able to run their own food c…" (ytc_UgwTvSiu0…)
- "I called out some people actually using these ai art styles to make crappy memes…" (ytc_UgziOn0Cy…)
- "That's an interesting thought! Robots like Sophia bring a unique perspective on …" (ytr_UgwXqIVLF…)
- "10:55 when they talk about ethics they say that "humans are not the most ethical…" (ytc_UgzdfEb3c…)
- "@zroll11So true! We would for sure feel "Worth-Less" without being productive. …" (ytr_Ugwa7IsdI…)
- "GE tried fully automated manufacturing before 1990, but failed and eventually mo…" (ytc_UgxmY2u-B…)
- "if ai takes all of ur jobs we will 100% become like the people in walle, it's ju…" (ytc_UgyEO7RJv…)
Comment
Worth noting that the Y2K issue was very real and it took tremendous expenditure of capital, both human and monetary, to mitigate the effects. The pro side of the resolution, that is, Tegmark's and Bengio's, advocates for a commensurate expenditure to mitigate AI risk. The big difference of course is that Y2K threatened to destabilize civilization, not destroy it outright. It's one thing for our systems to fail, another for them to actively seek our destruction, as AI might.
youtube · AI Governance · 2023-06-27T09:2… · ♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytr_Ugz8xg_TAUp50sGdgEh4AaABAg.9rPvpEz94vU9rTx70S0Rsz","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytr_UgwRg0KJemLVpW6t2ex4AaABAg.9rPYZJbr5b39rTlC_DGHlx","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytr_UgwRg0KJemLVpW6t2ex4AaABAg.9rPYZJbr5b39rU14m4eHLF","responsibility":"user","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytr_UgwRg0KJemLVpW6t2ex4AaABAg.9rPYZJbr5b39rhCV0ZL18D","responsibility":"user","reasoning":"virtue","policy":"unclear","emotion":"approval"},
{"id":"ytr_UgzuxRs_BKrl6JIqN_B4AaABAg.9rPRpVBUzUW9rPp6KkFuGT","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytr_UgzuxRs_BKrl6JIqN_B4AaABAg.9rPRpVBUzUW9rU0BErm0H6","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytr_UgwMSBDoNzy8g3RLmlt4AaABAg.9rPH0awsbg09rj8XXtugv2","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytr_Ugwp8jS3Ka-LbhS0UCx4AaABAg.9rPEb_4SgMm9rSm6Y2E2Km","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytr_UgxB7Y9xAPQXXJSV6m94AaABAg.9rPDxNI2VJc9rQ-8TiYDMl","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytr_UgwkyKlTs7O7KBb2pCV4AaABAg.9rP5RLTN4nr9rR7sXcgyOH","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"indifference"}
]
```
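The raw response is a JSON array of one record per comment, keyed by comment ID. A minimal sketch of how such a response can be parsed and indexed for the "look up by comment ID" view; the field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) match the JSON above, while the function name and the two-record sample are illustrative:

```python
import json

# Two records copied from the raw response above, used as a small sample.
RAW_RESPONSE = """
[
 {"id":"ytr_Ugz8xg_TAUp50sGdgEh4AaABAg.9rPvpEz94vU9rTx70S0Rsz","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
 {"id":"ytr_Ugwp8jS3Ka-LbhS0UCx4AaABAg.9rPEb_4SgMm9rSm6Y2E2Km","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
"""

def index_by_comment_id(raw: str) -> dict:
    """Parse a raw coding response and index its records by comment ID."""
    records = json.loads(raw)
    return {rec["id"]: rec for rec in records}

codes = index_by_comment_id(RAW_RESPONSE)
rec = codes["ytr_Ugwp8jS3Ka-LbhS0UCx4AaABAg.9rPEb_4SgMm9rSm6Y2E2Km"]
print(rec["responsibility"], rec["policy"])  # → company regulate
```

The second record is the Y2K comment shown above; its coded values (`company`, `consequentialist`, `regulate`, `fear`) match the Coding Result table.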