Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "This guy is an absolute clown. AI cannot and never will be sentient. This interv…" (ytc_Ugww-GXge…)
- "AI is like a mobile home on cruise-control riding down the high-way while the dr…" (ytc_Ugy2ckFZE…)
- "This reminds me of people saying Moore's law was dead long before it died. A slo…" (rdc_n7hbk3c)
- "I know I'm not a robot I'm totally against this b**** I hate her I've seen her a…" (ytc_UgzBtWZ2f…)
- "AI going rogue is the least problem, AI working as intened will create 99% unemp…" (ytc_UgzCLafFK…)
- "I always say to treat AI like you treat your annoying coworker that you think is…" (ytr_Ugy1tdesN…)
- "Summary from ChatGPT 😄 The video transcript discusses the potential benefits an…" (ytc_Ugz65mbQ1…)
- "This man is correct it will not solve it and especially the big excuses if we do…" (ytc_UgxNbrknZ…)
Comment
"In 10-20 years we could have superintelligent AI" Maybe yes, but whether it can do people's job is a completely different question. For that, this 'superintelligence' would have to be available at a massive scale in an economically viable manner across societies. If we only have some supercomputers/datacenters which can run this superintelligent system, at a great cost, then it can do some people's jobs, but not everyone's jobs.
Source: youtube · Posted: 2026-01-30T15:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxQSxCIdqHlWGmb6qt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgyPgGdivaMhq9l1_394AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyWRiVk1wCIpwT-CrR4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgxFzkUbfiwISm_eTiJ4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwdBkHrK8A8TQdMAOZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwLoY1dYzW_UhbdQT14AaABAg","responsibility":"user","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgxCn_NH8NjpMvLl1bh4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgyKn5ig44PiswWrrZN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzUgr-V-y1FZ6eUU-14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwlCS3yVsqbGvW9xJh4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"}
]
```
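Since each raw response is a JSON array of per-comment codes keyed by `id`, looking a comment up by its ID reduces to parsing the array and indexing it. A minimal sketch (variable names are illustrative, and only two rows from the response above are reproduced):

```python
import json

# Raw LLM response: a JSON array of per-comment codes, as in the batch above.
raw_response = """
[
  {"id": "ytc_UgxQSxCIdqHlWGmb6qt4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgyWRiVk1wCIpwT-CrR4AaABAg", "responsibility": "distributed",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"}
]
"""

codes = json.loads(raw_response)

# Index rows by comment ID so lookup is a single dict access.
by_id = {row["id"]: row for row in codes}

row = by_id["ytc_UgyWRiVk1wCIpwT-CrR4AaABAg"]
print(row["policy"])   # regulate
print(row["emotion"])  # outrage
```

The same index also makes it easy to join these codes back onto the original comment records before tallying any dimension.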