Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "AI" isn't intelligent, or an artificial intelligence. It's more like a "Virtual… (`ytc_UgybxY7b9…`)
- AI have no moral and no ethic. If it have it's from someone who have beeb human.… (`ytc_UgxQtbqJe…`)
- Yeah that should be the only meduim where ai can fully be used with less restrai… (`ytr_UgzeB7St5…`)
- Calling people who generate images with ai "artists" is like calling someone who… (`ytc_Ugw0yhgxg…`)
- An experimental study reported by "The Wall Street Journal" examined the effect … (`ytc_UgxcRmEiv…`)
- 10 years ago, everyone in AI said that they had to go open source and share rese… (`rdc_m9fof1m`)
- It's good for that... To a point. I mean, I used it out of curiosity but imo the… (`ytr_Ugwx6B5ks…`)
- He may be biased to argue we should eradicate all ai but I actually think that i… (`ytc_UgyxGqxU-…`)
Comment

> self-driving cars are still programmed and you can have discrete programming for scenarios for self learning programs

Source: youtube · AI Harm Incident · 2017-01-17T01:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | mixed |
| Policy | industry_self |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytr_UgyF6F5mYMV-Pa5y5GZ4AaABAg.8hdOXOKKyqR8kbvmeVyUHC","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytr_UgwkhoJUXCOdqrrdO_F4AaABAg.8ZuDxkqG2qH98bK0NRx2AW","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"resignation"},
  {"id":"ytr_UgwE0M-nX5hQSlgJMIt4AaABAg.8ZLDyabu4ux9mLu7ZsOo7e","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytr_UghcPoA1NFGlengCoAEC.8NyX3Egi-Fc9mLtvosPf4W","responsibility":"ai_itself","reasoning":"consequentialist","policy":"industry_self","emotion":"resignation"},
  {"id":"ytr_UghcPoA1NFGlengCoAEC.8NyX3Egi-FcA1PIA3250ul","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytr_Ugiew_Ebk3iMfngCoAEC.8Nvh72FTZah8NwFFi8Pqfy","responsibility":"user","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytr_Ugiew_Ebk3iMfngCoAEC.8Nvh72FTZah8OAIyM-R3lw","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytr_Ugiew_Ebk3iMfngCoAEC.8Nvh72FTZah8OASggjEx2E","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"resignation"},
  {"id":"ytr_UgiJaxBMly9MvXgCoAEC.8Ndi_y4XrqB8NolwAfwMh6","responsibility":"developer","reasoning":"mixed","policy":"industry_self","emotion":"approval"},
  {"id":"ytr_UgiJaxBMly9MvXgCoAEC.8Ndi_y4XrqB8NpIBpJMJ3W","responsibility":"developer","reasoning":"mixed","policy":"industry_self","emotion":"indifference"}
]
```
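A raw response like the one above can be parsed and validated before the codes are stored. The sketch below is a minimal example, assuming the four dimensions shown in the Coding Result table and allowed values inferred only from the samples on this page; the real codebook may define additional values.

```python
import json

# Allowed codes per dimension, inferred from the samples on this page
# (assumption -- the full codebook may include values not seen here).
ALLOWED = {
    "responsibility": {"developer", "user", "ai_itself", "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "mixed"},
    "policy": {"regulate", "ban", "liability", "industry_self", "none", "unclear"},
    "emotion": {"fear", "resignation", "outrage", "indifference", "approval"},
}

def parse_coded_batch(raw: str) -> dict:
    """Parse one raw LLM response (a JSON array of coded comments) into
    a mapping from comment ID to its dimension codes, rejecting any
    record with a missing field or an unknown code."""
    coded = {}
    for rec in json.loads(raw):
        cid = rec["id"]  # KeyError if the model omitted the ID
        codes = {}
        for dim, allowed in ALLOWED.items():
            value = rec[dim]
            if value not in allowed:
                raise ValueError(f"{cid}: unknown {dim} code {value!r}")
            codes[dim] = value
        coded[cid] = codes
    return coded
```

Validating at parse time keeps malformed or hallucinated codes out of the dataset, so a lookup by comment ID only ever returns codes from the known vocabulary.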