# Raw LLM Responses
Inspect the exact model output for any coded comment.
## Random samples

- "Honestly if i was a doctor i would be using chatgpt just to be sure I'm doing ev…" (`ytc_Ugz-qwhBv…`)
- "You should do a video on Anti AI Ais next. How people make a decentralized AI sy…" (`ytc_UgzCNgHp1…`)
- "It's not difficult to understand how crushing it must be to have the fruits of y…" (`ytc_UgwQrSYMx…`)
- "Did anyone catch AL say in 20 years robots will learn and know how to do every h…" (`ytc_Ugzth_vHV…`)
- "Pov this man is smarter then ia 💀 that how good is our achivments in robot memor…" (`ytc_UgzGuBOzG…`)
- "39:17 the context windows are not truly big enough for a whole book. Large Langu…" (`ytc_UgzFrXhXM…`)
- "My man taking notes so he can finally find a way with AI to make you hit that su…" (`ytc_UgwG3FVdQ…`)
- "15:30 after the suggestion of its consciousness Alex asked if its reaction was t…" (`ytc_UgztXu9z6…`)
## Comment

> This guy may be right on his prediction but the timeline is full of shit, maybe 30-40 year not 4-5 year. AI especially super intelligent need massive amount of energy and infrastructure to run these cost are enormous. Robot won't be able to replace human relax and dexterity over billion years of evolution. The software may be there but the challenge to build hardware to match or exceed human at a price point that is cheaper to hire a human is still a while off.

Source: youtube · AI Governance · 2026-03-18T08:2…
## Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
## Raw LLM Response
```json
[
  {"id":"ytc_Ugzk8WM6xxB5MNhuPBd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxSme7J1XVuYPGKSzt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyJeUpzaDi997Rb9YJ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugw8B_otFoJBkVh_COx4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgyKlovs6cD9-Z3lrW94AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgxTtVzeeAQngRwjNTx4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_UgyKDauE0224Q9u7JoZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzP4Hm3JzxAWekAk4x4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugwrg1_dLENOjkdDJyp4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxtF8G42jIt_VdHaLN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
```
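A response in this shape can be turned into a per-comment lookup (the table above is one such row) with a small parser. The sketch below is illustrative, not the tool's actual code: the field names come from the response itself, but the allowed-value sets are inferred only from the values visible on this page, so the real codebook may contain more categories.

```python
import json

# Allowed values per dimension, inferred from the responses shown here.
# ASSUMPTION: the actual codebook may define additional categories.
SCHEMA = {
    "responsibility": {"none", "distributed", "developer", "company", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "virtue", "unclear"},
    "policy": {"none", "liability", "regulate"},
    "emotion": {"outrage", "approval", "fear", "mixed", "resignation"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse a raw batch response into {comment_id: codes}.

    Skips entries without an ID and rejects out-of-schema values,
    so a malformed model output fails loudly instead of silently.
    """
    coded = {}
    for entry in json.loads(raw):
        cid = entry.get("id")
        if not cid:
            continue  # entry has no comment ID; nothing to index it by
        codes = {}
        for dim, allowed in SCHEMA.items():
            value = entry.get(dim)
            if value not in allowed:
                raise ValueError(f"{cid}: unexpected {dim}={value!r}")
            codes[dim] = value
        coded[cid] = codes
    return coded

# The last entry of the response above, round-tripped:
raw = ('[{"id":"ytc_UgxtF8G42jIt_VdHaLN4AaABAg","responsibility":"none",'
       '"reasoning":"consequentialist","policy":"none","emotion":"resignation"}]')
result = parse_coding_response(raw)
print(result["ytc_UgxtF8G42jIt_VdHaLN4AaABAg"]["emotion"])  # prints "resignation"
```

Validating against the schema at parse time is what makes the "Coded at" table trustworthy: any comment whose entry carries a value outside the codebook surfaces as an error rather than a silently miscoded row.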