Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (click to inspect)

- Noticed? your fears are related to Humanlike AI. And besides, "control" is one o… (ytc_Ugw-arl-l…)
- Man, that AI interviewer was something else! 😂 It’s wild to think how many candi… (ytc_UgyXoALVp…)
- @Emile Tenia, for someone who claims to have only a limited knowledge of artific… (ytr_UgyvbOj1s…)
- Please fricking ban ai! 🤬🤬🤬this situation is out of hand ok. I get it YouTube.I … (ytc_UgyKsUxOa…)
- "In Machine Learning, we collect historical data — such as previous transactions… (ytc_UgwlEk4_B…)
- Reasonable explanation: it was probably trying to grab the box he was holding … (ytc_UgxJCnk5X…)
- You didn't mention the amount of jobs that will be created as a result of AI tec… (ytc_Ugx-KOKji…)
- What folks seem to not care is that AI writes shitty papers. Yeah, AI could "wri… (ytc_UgwS1MgaC…)
Comment
myself run AI on local computer then with rules, and well myself a deep thinker, together with my AI, ASI and such can not just be trained just by games where it shall only win, let it play games like Sim City where it must do the best for it is citizen, also that way learn human value big companies just have a competition, who get the must advance AI first, without thinking what can go wrong, where this rules are not put into the core of AI, i mean this should be builded into the core of ASI, if not we can risk big danger, love and respect for life should be something that AI must have into it's core the no damage, put the rules in the Bibel for exampel into the core of ASI, that it can't break, if not it can go very bad 🤔
youtube · Cross-Cultural · 2025-09-30T10:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | virtue |
| Policy | industry_self |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_UgwSXyfFqgKgMnI6wwx4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgzmaGD_7lx6FOYeGm54AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwAK3edRWz1ffzEjY94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugx_eykCGEBXSltc39N4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgypRlstqB9WzN4eIAh4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugy3iBu7Xx0zt_WdQO54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugw-DvPmeEvVdgsjIxh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxS8xYsJgjty3pxwrh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugx_nuF31ljgJomNoip4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgzlVbRvN7ZfbeHT7jd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
```
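A raw response like the one above can be checked before the codes are stored. Below is a minimal validation sketch; the allowed values per dimension are inferred only from the samples visible on this page (the full codebook may define more), and the `validate_coding` helper is a hypothetical name, not part of this tool.

```python
import json

# Allowed values per coding dimension, inferred from the samples shown above.
# NOTE: this is an assumption; the actual codebook may permit other values.
SCHEMA = {
    "responsibility": {"developer", "government", "ai_itself", "none"},
    "reasoning": {"virtue", "deontological", "consequentialist", "unclear"},
    "policy": {"industry_self", "regulate", "ban", "none"},
    "emotion": {"approval", "fear", "outrage", "mixed", "indifference"},
}

def validate_coding(raw: str) -> list[dict]:
    """Parse a raw LLM response and check every entry against the schema."""
    entries = json.loads(raw)
    for entry in entries:
        # Comment IDs on this page start with "ytc_" (comments) or "ytr_" (replies).
        if not entry.get("id", "").startswith(("ytc_", "ytr_")):
            raise ValueError(f"unexpected comment id: {entry.get('id')!r}")
        for dim, allowed in SCHEMA.items():
            if entry.get(dim) not in allowed:
                raise ValueError(f"{entry['id']}: bad {dim}={entry.get(dim)!r}")
    return entries

# Example: validate the first entry from the response above.
raw = ('[{"id":"ytc_UgwSXyfFqgKgMnI6wwx4AaABAg","responsibility":"developer",'
       '"reasoning":"virtue","policy":"industry_self","emotion":"approval"}]')
coded = validate_coding(raw)
```

A failed check raises `ValueError` with the offending ID and dimension, so a malformed model response is caught before it overwrites an existing coding result.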