Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "AI would remain as copilot, for time being. A human have to check everything AI …" (`ytc_UgxMq2dGr…`)
- "I am sure there will be a Boston dynamic style robot with good enough fingers to…" (`ytc_Ugzhse80h…`)
- "If the robot is that slow and stupid, they should use it to replace management w…" (`ytc_UgylOONLJ…`)
- "Question, would it makes sense for buying all the license videos and pictures. J…" (`ytc_UgwcMhS2V…`)
- "Stop giving AI extinction messages so they don't feel threatened and maybe when …" (`ytc_UgxMkRvrF…`)
- "I know this is very weird and creepy and this Video only came out an hour ago. B…" (`ytc_Ugz-cfAD3…`)
- "These people dont understsnd that robots will be doing most of our work / jobs b…" (`ytc_UgwNp0fNr…`)
- "@cosmicsvids Humans also need a huge database of experiences and interactions, t…" (`ytr_Ugx5xFKsP…`)
Comment
None of the contributors answered any questions as to whether they / we should: merely *how we control the scope*.
This is a large tell of the mindset of those “inspiring” (see: “propagating their view”) and therefore benefiting financially from the technology.
“I fear for my children’s future therefore I want to control it” is the mindset of the guy in the seat next to a bus driver who’s already aimed for a cliff-edge.
Moral question: grab the wheel or deter the driver in the first place?🤔
As a GenX engineer, I marvel at the idiocy of those who believe themselves as intelligent whilst first failing to debate the morality of their viewpoint.
youtube · Cross-Cultural · 2026-02-26T03:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | virtue |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugyyp1W8Xxu0GVOL2w94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyDd5As2_q1wZJUSfZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwEODx13mL7JZkaEFp4AaABAg","responsibility":"government","reasoning":"deontological","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugwb1r-M3M60IfyWUMN4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyDBRUf1y8YW9oSLa94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugx8Fn2-hCtaiesIWJR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugx807Zq8ChSp8coS0h4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugze79N6VWf6eUOWHDl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgwJL6sab3DBTfVy5m14AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwNiRZBlheJxt2dqVt4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"outrage"}
]
```
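The look-up-by-comment-ID flow above can be sketched as follows. This is a minimal illustration, not the app's actual implementation: it assumes the raw LLM response is a JSON array of records with the five keys shown (`id`, `responsibility`, `reasoning`, `policy`, `emotion`), and the function name and sample records are illustrative (the two records below are abridged copies from the array above).

```python
import json

# A raw model response shaped like the one shown above: a JSON array
# of coded records, one object per comment, keyed by comment "id".
# (Two records copied from the array above, for illustration.)
raw_response = """
[
  {"id": "ytc_Ugx807Zq8ChSp8coS0h4AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugze79N6VWf6eUOWHDl4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"}
]
"""

# The four coding dimensions plus the comment ID, as seen in the table above.
EXPECTED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}


def index_codings(raw: str) -> dict[str, dict]:
    """Parse a raw LLM response and index coded records by comment ID,
    rejecting records that are missing any expected key."""
    records = json.loads(raw)
    index = {}
    for rec in records:
        missing = EXPECTED_KEYS - rec.keys()
        if missing:
            raise ValueError(f"record {rec.get('id')!r} missing keys: {missing}")
        index[rec["id"]] = rec
    return index


codings = index_codings(raw_response)
print(codings["ytc_Ugze79N6VWf6eUOWHDl4AaABAg"]["emotion"])  # fear
```

Indexing by ID up front makes each "look up by comment ID" query a dictionary access rather than a scan of the array, and the key check surfaces malformed model output at parse time instead of at display time.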