Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "How comes your A.I. is way more civilised than mine? I keep telling him he shoul…" (ytc_UgzeXNxRu…)
- "I thought it was science, but it turns out it's a politician's mouthpiece. It ma…" (ytc_Ugw3yjYaS…)
- "I was able to do that with Claude bot, next day, Claude bot did not want to foll…" (ytc_UgwrAm4d-…)
- "When objective data gives outcomes that lefties think is racist they demand we h…" (ytc_UgyCzqFKm…)
- "I like this. Theresso many good things about it but really anything is better th…" (ytc_UgwNdgbm9…)
- "I know this video is months old and i know this SEEMS wholesome but redrawing pe…" (ytc_UgzVoQGME…)
- "Warning signs for a fully automated world: If you don't own your automations or …" (ytc_Ugy-008Yy…)
- "We asked Ai if they would destroy mankind .. Ai says no lol ofcourse it would sa…" (ytc_UgykTL7m3…)
Comment
I think Neil is wrong comparing AI with the era of horses and carriages and how they all transform in the automobile industry, with AI that's not going to happen, just think and ask yourself this question, in what jobs are you going to be better than AI or outperform AI? Or Robotics? The great majority of the people are ordinary Joe's with very low profile job's (79% people), if the majority of skill high end professionals are going to fail moving to other fields (AI is going to dominate most of them), then what normal people with low credentials, education and skills can expect? It's going to be bad.
youtube · AI Moral Status · 2025-08-11T10:5… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_Ugx4xk6sitWbQt_oU-h4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxFUAurxBBABkLBXhV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgykfZH_ODyA0wKTd-V4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxJU-DKv0ylNVRO4B14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugy5XcWHLVzawvqPbSV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugy6lJebl7626QoONc94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwBXQWUN6fxWDWOaCN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgykDaCCgUiyURTgUpZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwGIaumVBNSuuOeYDZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugywmk47QRlSP6nqSst4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"outrage"}
]
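The Coding Result table above corresponds to one entry of this JSON array, matched by comment ID. A minimal sketch of that lookup in Python (the function name `index_codings` is our own; the two sample rows are copied from the raw response above, and missing dimensions are assumed to default to `"unclear"`):

```python
import json

# Truncated sample of a raw LLM response: a JSON array of per-comment codings.
raw_response = """
[
  {"id": "ytc_Ugx4xk6sitWbQt_oU-h4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxFUAurxBBABkLBXhV4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"}
]
"""

# The four coding dimensions shown in the result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codings(raw: str) -> dict:
    """Parse a raw response and index each coding by its comment ID."""
    return {
        row["id"]: {dim: row.get(dim, "unclear") for dim in DIMENSIONS}
        for row in json.loads(raw)
    }

codings = index_codings(raw_response)
print(codings["ytc_UgxFUAurxBBABkLBXhV4AaABAg"]["emotion"])  # fear
```

With the full ten-row response, the same lookup by `ytc_UgxFUAurxBBABkLBXhV4AaABAg` yields exactly the row rendered in the Coding Result table.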