Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "AI is a tool for stealing other people’s work. This is not a discussion. It’s li…" — `ytc_UgyJBnWCd…`
- "Customer service is cooked, Verizon wireless app has Ai that’s available when th…" — `ytc_UgwkVYFVB…`
- "those dudes should embrace AI. Movies in recent years suck big time, you only se…" — `rdc_jj3lg33`
- "Why chat GPT and MS Copilot are much more stupid than this doll to the point tha…" — `ytc_UgzZm_Kz7…`
- "This actually freaked me out a bit, (not a lot lol, just some) cause seeing a ro…" — `ytc_UgztVfnrM…`
- "Ai isnt taking jobs as we know tgey are being cut because tgey are cutting cost…" — `ytc_UgxIRSjst…`
- "Ok I think you buried the lead here, the fact that there are predictive crime AI…" — `ytc_UgxDW2-Lf…`
- "He probably didnt ask him those questions because he wanted to see if he could g…" — `ytr_UgwmDYflg…`
Comment
> Contrary to popular belief, AI will NOT take over our jobs. Not at least in the next hundred years. Although, when you think about it, that is just three generations away. AI will not be that reliable as long as it's not logic based. Apple already pointed this out (also maybe as an excuse to their own AI woes). Currently, AI are just script-based systems. Not nearly as advanced as most people think when it comes to real world technology applications. Even after a hundred years, AI will most likely take over clerical jobs, & recently automated processes like operating a vehicle. As for when will take over manual labor - same estimate as when our great great grandparents thought "flying cars" will be the mode of mass transport in the year 2000s.

youtube · AI Responsibility · 2025-11-24T05:1… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxBaFgxNGd9xVDx6jl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwjYiNcKwF3YL_npIV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_Ugzz45IVFqYabVsOfet4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgzIyDnOmzgPCiKgqod4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyrHe5s1BS12R97yqZ4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgwTm35G54OzZeyua3Z4AaABAg","responsibility":"user","reasoning":"deontological","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgyBIeo4W6Nc20GZdhd4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgyPhfLmHSJhgsl2mwp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_UgwJGtBQ5VpgiRKhJSB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgyEBi2Bs0YivanUP-h4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}
]
```
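Since the raw response is a JSON array of one coding object per comment ID, it is worth sanity-checking each row before it lands in the dataset. Below is a minimal sketch of such a validator in Python; the dimension names come from the response shown above, but the set of allowed values per dimension is inferred only from these samples and may be narrower than the full codebook (assumption).

```python
import json

# Allowed values per dimension, inferred from the sample responses above.
# The actual codebook may define additional categories (assumption).
ALLOWED = {
    "responsibility": {"company", "ai_itself", "distributed", "user", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"liability", "industry_self", "none", "unclear"},
    "emotion": {"outrage", "resignation", "indifference", "approval", "mixed", "unclear"},
}


def validate_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed coding rows.

    A row is kept if it is a dict with an "id" and every dimension
    holds one of the allowed values; anything else is silently dropped.
    """
    rows = json.loads(raw)
    valid = []
    for row in rows:
        if not isinstance(row, dict) or "id" not in row:
            continue
        if all(row.get(dim) in values for dim, values in ALLOWED.items()):
            valid.append(row)
    return valid
```

Dropping malformed rows rather than raising keeps a single bad coding from failing a whole batch; the dropped IDs can then be re-queued for a retry pass.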