Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- Ai has no place in art, art is a human thing, the expression of each individual,… (`ytc_Ugwp6j_g3…`)
- Maybe the most important global government agencies have to be AI safety to prot… (`ytc_Ugw0PHyQd…`)
- What you dont understand these programs are tiny compared to companys like Googl… (`ytr_UgxXWL40Q…`)
- Why would a robot need to aim down sights? That's a human handicap due to how ou… (`ytc_UgxHptsLF…`)
- What about Waymo only operating in the City, where crashes Are much more likely … (`ytr_Ugw6lBgVH…`)
- I really fucking hope America follows, deep down I know they're too goddamn gree… (`rdc_fnxkmb4`)
- Imo these are not emotions as understood by people. AI is actually not a new th… (`ytc_Ugy-B8QM8…`)
- Fair enough.... There'd be more crashes and aircraft failures if an aircraft ins… (`ytr_UgxeOsdXT…`)
Comment

> So who designed these automated systems? A person and now the question is what did they input into the algorithm? The history of America has proven racism exists and now you have to examine these designs who were made by who, people! Sadly but true their are still those people around who control these industries who keep racism alive if not why is it still an issue in the 21st Century? Just my take on it.

Source: youtube · Posted: 2022-03-19T23:3… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | virtue |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgwGMdpyoG5x8ucG5Lh4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzmCezwNLwg0MZa6Zx4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
{"id":"ytc_Ugyr5oPe115ya_Xttct4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz4P880T1ideZAZU2t4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwjaraUkGByesfkEYB4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugwsc7P3SzTZSp3_es94AaABAg","responsibility":"developer","reasoning":"deontological","policy":"industry_self","emotion":"outrage"},
{"id":"ytc_UgyR-ChyFyv8WoNIG6J4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzjGOqr3KwZoEqoWKV4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgywhXf5t_euZC17QCZ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_Ugznf0WZahOaEh0yXtR4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"resignation"}
]
```
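A batch response in this shape can be checked programmatically. The sketch below is a minimal illustration, assuming the raw LLM response is a JSON array of records with the four dimension fields shown above (`responsibility`, `reasoning`, `policy`, `emotion`); the record IDs in the sketch are hypothetical placeholders, not real comment IDs.

```python
import json
from collections import Counter

# Hypothetical raw response in the same shape as the batch shown above;
# the IDs here are placeholders, not real comment IDs.
raw_response = """
[
  {"id": "ytc_example1", "responsibility": "developer",
   "reasoning": "virtue", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_example2", "responsibility": "unclear",
   "reasoning": "unclear", "policy": "unclear", "emotion": "unclear"}
]
"""

DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

records = json.loads(raw_response)

# Basic validation: every record must carry an ID and all four dimensions.
for rec in records:
    assert "id" in rec, f"record missing id: {rec}"
    missing = [d for d in DIMENSIONS if d not in rec]
    assert not missing, f"record {rec['id']} missing {missing}"

# Tally each dimension across the batch, e.g. to spot-check label balance
# or the share of 'unclear' codes.
tallies = {
    dim: Counter(rec[dim] for rec in records)
    for dim in DIMENSIONS
}
for dim, counts in tallies.items():
    print(dim, dict(counts))
```

This kind of tally makes it easy to see, for instance, how often the model falls back to `unclear` across a batch before the codes are merged into the main dataset.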