Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "Good episode but I don't like when you switch from fact to a fake story without …" (ytc_Ugw_WxR9Q…)
- "What a crock of fearmongering shite. So-called "AI" is still a hypothetical co…" (ytc_UgwFbI3ud…)
- "Unless AI can be made ethical it should not be developed. Do not support techno…" (ytc_UgwwB5uHV…)
- "Mustafa was incredibly thought-provoking, blending real-life experience and phil…" (ytc_UgwP9ARRT…)
- "Humanity has always been doomed. The universe will go cold. The sun will become …" (rdc_dxgzt3w)
- "If I want to hear two lunatics rambling about crap, there are way closer sources…" (ytc_UgyBjocIV…)
- "LLMs are garbage, they're not intelligent, a useful tool at most. The bubble is …" (ytc_Ugz6gmioF…)
- "it's all fake, they aren't watching anyone it's all fake. AI is just made up for…" (ytc_UgxD59i3i…)
Comment
It's going to go like this: All jobs that are computer science based, not merely jobs that use computer but are manifestly computer driven, will disappear first and almost simultaneously. Next, professional jobs that require advanced knowledge and training will fade more slowly and at an uneven rate. Then many unskilled, manual jobs will start to be eliminated when robots can be afforded but that will not be the norm for a very long time. All of that will take about 40 years and by that time AI itself will have been working on what to do with all of the unemployed humans and will have found an answer. In the meantime we will all almost be wiped out several times by human politicians who are greedy idiots and war loving assholes.
Source: youtube · AI Governance · 2026-01-07T02:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgyyHh7H8o8b0ksLtOt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwGXRAJ4_3VniZ4kCB4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgxWHOxZ6hxNqeLP1gx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugym0C1bpO-iM0hCoUt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzJuyNtTs-QbJ8KWup4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugw6Fp1QASYuBnBabhl4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxJ02rzQq3uDzNHMBF4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzE0ehLYnYCWbK5oH14AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwvgykqFFwtz6m6MC14AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgyS3-ZrgJ75mQDQ79Z4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
```
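A raw batch like the one above can be parsed and checked before the codes are stored. The sketch below is a minimal, hypothetical validator: the field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) come from the response shown here, but the sets of allowed values are only inferred from the codes visible in this sample — the actual codebook may define more categories.

```python
import json

# Allowed values per dimension, inferred from the codes observed in this
# sample batch (assumption: the real codebook may include more categories).
ALLOWED = {
    "responsibility": {"none", "ai_itself", "user", "government"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none", "regulate", "ban"},
    "emotion": {"resignation", "approval", "indifference", "fear", "outrage", "mixed"},
}

def validate_batch(raw: str) -> dict:
    """Parse a raw LLM response and index valid coding records by comment ID."""
    records = json.loads(raw)
    coded = {}
    for rec in records:
        cid = rec["id"]
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{cid}: unexpected {dim}={rec.get(dim)!r}")
        coded[cid] = {dim: rec[dim] for dim in ALLOWED}
    return coded

# Example: the first record from the batch above.
raw = ('[{"id":"ytc_UgyyHh7H8o8b0ksLtOt4AaABAg","responsibility":"none",'
       '"reasoning":"consequentialist","policy":"none","emotion":"resignation"}]')
coded = validate_batch(raw)
print(coded["ytc_UgyyHh7H8o8b0ksLtOt4AaABAg"]["emotion"])  # resignation
```

Indexing by comment ID also gives the "look up by comment ID" behavior the page describes: a single dictionary access retrieves the coded dimensions for any comment in the batch.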