Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
Once they have AI and automation, they will no longer need slaves. What do you t…
ytc_UgwRpX-BN…
I didn't go into AI/robotics because I didn't want to be partially responsible f…
ytc_UgxRq69Gy…
Among all other Star trek references, I couldn't help shake the thought of AI be…
ytc_Ugw9uQtF3…
Trust me i was not confused on what i was looking at. Its easy to see its a robo…
ytc_Ugz6TRpKu…
For people thinking that the robot answer was some sort of sign of intelligence.…
ytc_Ugwe5KcDU…
so its the 'desire to prove something unknown', that a consciousness has and tha…
ytc_Ugyp6HDrK…
@Tential1 i used the ai make and help to make my picture look a bit better but …
ytr_UgxBqlXah…
"Find something AI can't do" doesn't make sense when AI is smarter than me. Anyt…
ytc_UgzTC2whf…
Comment
Technology Reflects Us — The Ghosts We Build Into the Machine
Thank goodness for this episode — Karen Hao cuts right to the core of tech's most pressing issues. "Technology is a product of human choices" — a powerful reminder that if our values or foundations as individuals are flawed, those flaws will inevitably be reflected in the technologies we create. It's alarming how often this truth is ignored or downplayed by the tech elites shaping the industry. The eagerness to automate everything — often without fully understanding the problem or questioning whether automation is even appropriate — reflects not only a lack of maturity but, at times, a troubling disregard for the human consequences.
youtube
Cross-Cultural
2025-06-29T16:3…
♥ 73
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | virtue |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwM2_DLwCYvoUoy7kt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwHan2GPMhLnzI-Y6F4AaABAg","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgwUz4VsoT8wraR645t4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgxGll7S1sLcqWUDzIV4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugwj3AFVNCaH4BX5CBF4AaABAg","responsibility":"government","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgwHzuY1I4K3ugvXbVp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxRSMEyqh9oadRF6ax4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzdHsWrC3roeihu_HF4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgxyfMVqtcskP03kXOt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugxlhz04NETakikw5jt4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"outrage"}
]
```
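Responses like the one above are machine-generated and can drift from the codebook, so it helps to validate each record before ingesting it. The sketch below is a minimal, hypothetical validator: the allowed value sets are inferred only from the samples visible on this page (the real codebook may define more labels), and the function names are illustrative, not part of any existing tool.

```python
import json

# Allowed values per dimension, inferred from the samples shown above.
# ASSUMPTION: the full codebook may define additional labels.
ALLOWED = {
    "responsibility": {"none", "company", "distributed", "developer", "government"},
    "reasoning": {"unclear", "deontological", "mixed", "virtue", "consequentialist"},
    "policy": {"none", "unclear", "regulate", "ban"},
    "emotion": {"approval", "outrage", "mixed", "indifference", "fear"},
}

def validate_records(raw: str) -> list:
    """Parse a raw LLM response and return a list of problems found (empty = OK)."""
    problems = []
    try:
        records = json.loads(raw)
    except json.JSONDecodeError as e:
        return ["invalid JSON: %s" % e]
    for i, rec in enumerate(records):
        if "id" not in rec:
            problems.append("record %d: missing 'id'" % i)
        for dim, allowed in ALLOWED.items():
            value = rec.get(dim)
            if value not in allowed:
                problems.append("record %d: %s=%r not in codebook" % (i, dim, value))
    return problems

# Example: one well-formed record passes with no problems reported.
raw = '[{"id":"ytc_x","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"approval"}]'
print(validate_records(raw))  # → []
```

Records that fail validation (unknown label, missing `id`, or unparseable JSON) can then be flagged for manual recoding instead of silently entering the dataset.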