Raw LLM Responses
Inspect the exact model output for any coded comment, either from the random samples below or by looking one up directly by comment ID.
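A minimal sketch of what the by-ID lookup might look like, assuming (hypothetically) that the coded comments are stored as a JSONL file with one record per comment; the file name, record layout, and helper name are illustrative, not the project's actual schema:

```python
import json

# Hypothetical lookup: scan a JSONL file of coded comments for one ID.
# File name and record layout are assumptions, not the project's schema.
def lookup_comment(comment_id: str, path: str = "coded_comments.jsonl") -> dict | None:
    with open(path, encoding="utf-8") as f:
        for line in f:
            record = json.loads(line)
            if record.get("id") == comment_id:
                return record
    return None

record = lookup_comment("ytc_UgjR2zO_1LwfgXgCoAEC")
if record is not None:
    print({dim: record.get(dim, "unclear")
           for dim in ("responsibility", "reasoning", "policy", "emotion")})
```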
Random samples

- "Bad advice. These should be treated more like advanced concepts to approach when…" (ytc_Ugwechy3T…)
- "I love how this is insinuating that the water just disappears after its intended…" (ytc_Ugx66DyEv…)
- "When people talk about the future of superintelligence, it's usually framed as p…" (ytc_Ugz6ZGJI8…)
- "Agreed, same with companies" / "Companies who have a cashflow apart from AI and a…" (rdc_nk7py41)
- "Hey man, I agree that a lot of folks are losing their jobs due to AI and it's tr…" (ytc_UgwYvUX0C…)
- "Best way to go is get a trade and not have to worry about no AI…" (ytc_UgwNeMurd…)
- "@ThePegasusAirforce I appreciate your comment and although I am not quite the lu…" (ytr_Ugxs2kHxv…)
- "We appreciate your engagement with our video. It's important to note that Sophia…" (ytr_UgyIXJai1…)
Comment
> As Computer Engineer I don't think we'll ever reach the point to have conscious IA, still science fiction. One thing is having lots and lots of data, make connections, path finding, cross relation, things that are hard for human beings but easy for computes, remember, a computer is just a tool that do very simple tasks very quickly.
>
> The most simple questions for a human being could be rather hard for an algorithm, for example, a monkey and a child, ask a computer which one is the cutest and you'll start having issues. We know the entire ADN and neural map of some worms with ~300 neurons, we know the synapses, but we still don't know why the damn worm move the way they do, why they chose to move right or whatever. Sometimes we like to think we are special and we know a lot, but we don't.
youtube · AI Moral Status · 2017-02-25T08:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[{"id":"ytc_UgjR2zO_1LwfgXgCoAEC","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UggOs3HwjLeo6HgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UggzjEvQA-SVuHgCoAEC","responsibility":"none","reasoning":"deontological","policy":"liability","emotion":"approval"},
{"id":"ytc_Ughj52dn57v5_XgCoAEC","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UghQ9UQVYlM32ngCoAEC","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgjIXkiz05yonXgCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"resignation"},
{"id":"ytc_UghVxTy-agwO-HgCoAEC","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UghVIe6nF4TwM3gCoAEC","responsibility":"none","reasoning":"deontological","policy":"liability","emotion":"mixed"},
{"id":"ytc_Ugh_UzizPwht13gCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugjn9CpVjJQB5XgCoAEC","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"mixed"})