Raw LLM Responses
Inspect the exact model output for any coded comment, or look a comment up directly by its comment ID.
Random samples (previews truncated; open a sample to see the full comment):

- "1:08:00 so an hour into a decent video about AI you derail it to trash Shad for …" (ytc_Ugxb4tJ5G…)
- "I am a deep learning postdoc and i do not know half of this shit, and you do not…" (ytc_UgzJFtkDv…)
- "\"Hey Google. How can I suck the land dry.\" -- \"Keep asking me questions and mak…" (ytc_Ugxa2t3Fp…)
- "Yes, there will always be accidents. In a vehicle with human driver in full cont…" (ytc_UgxC3Zui2…)
- "This video and the comment section below it is the gathering of angry people who…" (ytc_UgwmZAin7…)
- "As far as the spaceship, wow you people got a lot to learn you think we’re the o…" (ytc_Ugyx51fb0…)
- "This is a Tesla issue. Is tech “there” yet? As it was pointed out Elon is playin…" (ytr_UgzzcyQ5x…)
- "The ONLY solution is that we need a new type of economy, adapted to automated la…" (ytc_UgxMyjksd…)
Comment
unsupervised AI (UAI) is tricky unlike supervised (constrained) AI. Human intelligence can be seen as an accumelation of unconstrained learning. Machines can be programmed with basic rules to devise trial and error to realise reward and penalty. Imagine a machine is doing this 24/7/365 without the need for breaks, sleeps or holidays! theoretically humans can be extremely inefficient in comparison to an advanced UAI. Another thing, UAI will realise quickly the optimal way for collaboration unlike humans who usually tend to be protective for self interest. The next decades will very exciting, maybe economy will be run by robots, politicians won't be needed and discoveries and Nobel prizes credited to UAI! who knows what will the robotic revolution bring to us :)
Platform: youtube · Video: *AI Moral Status* · Posted: 2017-02-23T22:5… · Likes: 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
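The four coding dimensions above can be sketched as a small codebook of allowed labels. The label sets below are inferred only from the values visible on this page; the project's actual codebook may define more categories:

```python
# Allowed labels per coding dimension, inferred from the values that appear
# on this page (an assumption -- the real codebook may be larger).
CODEBOOK = {
    "responsibility": {"company", "developer", "user", "ai_itself", "none", "unclear"},
    "reasoning": {"deontological", "consequentialist", "mixed", "unclear"},
    "policy": {"regulate", "ban", "liability", "none", "unclear"},
    "emotion": {"outrage", "fear", "approval", "indifference", "mixed"},
}

def is_valid_code(dimension: str, value: str) -> bool:
    """Return True if `value` is an allowed label for `dimension`."""
    return value in CODEBOOK.get(dimension, set())
```

A check like this is useful as a guard before storing model output, since an LLM can emit a label outside the schema.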
Raw LLM Response
```json
[
  {"id":"ytc_Ugg6uOok2VP5QHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgglKdwIP2tvZ3gCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"approval"},
  {"id":"ytc_UgiBj8trrN2T_3gCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UghgtcFB4IEzDngCoAEC","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgiWpbxpfu9p6HgCoAEC","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugj2C_TxSi954HgCoAEC","responsibility":"user","reasoning":"deontological","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_UghHl86Xngak0XgCoAEC","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgiVMj0Ws70W2HgCoAEC","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgjRmuxIb5d8XHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UggGny5a5uCQDHgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"}
]
```
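A raw batch response like the one above can be parsed into a per-comment lookup table. This is a minimal sketch, not the project's actual pipeline code; the function name and the drop-incomplete-rows policy are illustrative assumptions:

```python
import json

# A shortened stand-in for a raw batch response (two rows copied from above).
RAW_RESPONSE = """[
  {"id": "ytc_Ugg6uOok2VP5QHgCoAEC", "responsibility": "none",
   "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgiBj8trrN2T_3gCoAEC", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"}
]"""

DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_batch(raw: str) -> dict:
    """Parse one raw LLM batch response and index the rows by comment id.

    Rows missing an id or any of the four dimensions are dropped rather
    than guessed at (a conservative choice, not necessarily the project's).
    """
    coded = {}
    for row in json.loads(raw):
        if "id" in row and all(dim in row for dim in DIMENSIONS):
            coded[row["id"]] = {dim: row[dim] for dim in DIMENSIONS}
    return coded

coded = index_batch(RAW_RESPONSE)
print(coded["ytc_UgiBj8trrN2T_3gCoAEC"]["emotion"])  # fear
```

Indexing by comment ID is what makes the "look up by comment ID" view above cheap: each coded comment resolves in one dictionary access.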