Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples — click to inspect:

- "When people, like Neil here, say AI is going to take jobs but then new jobs will…" (ytc_UgwOr18gG…)
- "@mk1st Lidar doesn't work in rain or when its snowing, so that makes it useless…" (ytr_UgzKkcth2…)
- "It's not biased data it's real data like how 13% of the population commits 70% o…" (ytc_Ugz1svjKZ…)
- "using digital drawing could be easier than traditional art, but it still takes s…" (ytc_UgyonR5NH…)
- "[***This is what Vladimir Putin expressed too***](https://youtu.be/1CnyqLogH0Y)*…" (rdc_gtcu1xi)
- "Lol seems he was surprised to hear that ending. Also ppl be talking about books …" (ytc_UgwcfMftf…)
- "this lowk made me emotional. the creator of Ghibli makes all his animators draw …" (ytc_UgxZP7DdG…)
- "I checked! Asked the same questions and recived different answears than those in…" (ytc_UgwjMR_0W…)
Comment
To me it is unethical to NOT develop AI. Why shouldn't i want there to be a robot to grow my crops, build my house, drive my cab, feed my dog, clean my house, regulate my financial market's, perform surgery with milimetrical precision, attend to my every need when i go too old to wipe my own ass?
The "Job Problem" is actually not a problem. if everything in the world would be made with the eficiency of a robot with the knowledge of Google... wouldn't it be better that way? In fact, there would be no need for jobs... everything would be very cheap, and with current development in new energies and materials, in the future it could be almost offensivly cheap to sustain, mantain and entertain a single human being...how can it not be better that way?
Just because there if robots are better that humans at "working" doesnt mean that there wouldn't be no more people working, it means that the people working would be doing it for pleasure or leasure rather than out of the need to sustain their families. Much like the guy that grows tomatoes and oregano just to see it grow, or the lovely lady at the air saloon that cuts your hair for the social interaction, or the painter that paints only because he feels like it.
By definition, we all would be artists, for everything that we would create would be devoited of porpose and practicality because there would be something in the market cheaper and better, and it would only exist just for the sake of existing.
The thing that people most mix is inteligence with personality, wich are 2 very diferent and seperate things... is it ethical to build a robot able to feel emotions? Capable of love, hate, desire, despair, greed and all the other wonderful and horrible things that we are abe to feel is it "wrong"? I dont think so, but rather pointless.
For the very same reason that those are the things that make us humans, are also the exactly same things (or at least some of) that make us imperfect and lousy at doing the things that the robots are great at. I only see very few reasons to do this and are all around companionship, to serve a human need. Much like most of the things that the human race created in fact... So why is this diferent?^
Source: youtube, posted 2014-10-23T16:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugh5gFy9d-aImngCoAEC","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UghJqPLW6Vw7XngCoAEC","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_UghU5_fkiifjLXgCoAEC","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgjNDSMMJw40vHgCoAEC","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UggIJNl8KWoVkXgCoAEC","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgjXoPJGX56VAHgCoAEC","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugj2Xvf20e4n83gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugjj4tsAFi-2U3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ughk8KTVoPwfoHgCoAEC","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugjl0ZmjjMygnHgCoAEC","responsibility":"distributed","reasoning":"contractualist","policy":"none","emotion":"resignation"}
]
```
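A raw response like the one above is only usable if every record parses as JSON and each dimension takes a recognized value. The sketch below shows one way to validate a batch; the allowed value sets are inferred solely from the values visible in this page (the full codebook presumably defines the complete category lists), and the `validate_codes` helper is hypothetical, not part of the coding tool.

```python
import json

# Allowed values per dimension, inferred from the responses shown on this
# page -- the actual codebook may define additional categories (assumption).
ALLOWED = {
    "responsibility": {"ai_itself", "user", "company", "distributed", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "contractualist", "mixed"},
    "policy": {"none", "unclear", "industry_self"},
    "emotion": {"approval", "outrage", "fear", "resignation", "mixed"},
}

def validate_codes(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed records.

    A record is kept if it is a dict with an "id" field and every coded
    dimension holds one of the recognized values.
    """
    records = json.loads(raw)
    valid = []
    for rec in records:
        if not isinstance(rec, dict) or "id" not in rec:
            continue
        if all(rec.get(dim) in values for dim, values in ALLOWED.items()):
            valid.append(rec)
    return valid

raw = '[{"id":"ytc_x","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}]'
print(len(validate_codes(raw)))  # 1
```

Records with an out-of-vocabulary value are silently dropped here; a production pipeline would more likely log them and re-prompt the model for the affected comment IDs.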