Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
AI relies on previously analyzed information and reacts to serial queries biased by their human overlords. AGI must react to massive realtime data and re evaluate its previously injested data then react in a directed way by its human overlords. Two problems with this, both involve the speed of information. A human is very compact. Information, new and previously analyzed does not have to travel far. In contrast, AGI, both the injestion of real time data and its evaluation with previously analyzed data is spread out over acres , even millions of square miles resulting a major impact on the speed of information. So, AGI will need the 'positronic' brain 🙃
Source: youtube
Video: AI Moral Status
Published: 2025-07-29T21:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | mixed |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_Ugwpa46_EgDs-JXuaGF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyR4R44JTBzu2RiJoN4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwxsfkPGRwGsJ_Ff6F4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgygnmM1Qz9R6O7VGiF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxEU8WnSIUgIR44BDh4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_Ugx_TPmWuaHVxwWVxs14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwIf9DigCJJXofaSCN4AaABAg","responsibility":"user","reasoning":"virtue","policy":"industry_self","emotion":"outrage"},
{"id":"ytc_UgyQgtSACpTa9Ua9FUp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzjeaB-zU1EZjBGHjV4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugx4ihlTycKxoHL4xNx4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"fear"}
]
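The raw response above is a JSON array of per-comment codes, one object per comment, with the same four dimensions shown in the Coding Result table. A minimal sketch of parsing and validating such a response — the field names come from the response itself, but the allowed value sets are inferred only from the values visible here, so the real codebook may define more categories:

```python
import json

# Allowed values per dimension, inferred from the responses shown above.
# The actual codebook may include additional categories (assumption).
ALLOWED = {
    "responsibility": {"none", "developer", "company", "user", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"none", "regulate", "liability", "industry_self"},
    "emotion": {"fear", "outrage", "approval", "indifference", "mixed"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response, keeping only well-formed rows.

    A row is kept when it has a non-empty "id" and every dimension
    holds a recognized value; malformed rows are silently dropped.
    """
    valid = []
    for row in json.loads(raw):
        if not row.get("id"):
            continue
        if all(row.get(dim) in values for dim, values in ALLOWED.items()):
            valid.append(row)
    return valid

raw = (
    '[{"id":"ytc_Ugwpa46_EgDs-JXuaGF4AaABAg","responsibility":"none",'
    '"reasoning":"consequentialist","policy":"none","emotion":"fear"}]'
)
print(parse_coding_response(raw)[0]["emotion"])  # fear
```

Dropping invalid rows (rather than raising) is one reasonable choice here, since LLM coders occasionally emit off-schema labels; a stricter pipeline might log or re-prompt on such rows instead.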