Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Bullshit. We could exchange all inbound call centres with AI TODAY. All you need is a Speech to Text AI, an LLM and a Text to Speech AI. All 3 are already existing in nearly perfect quality. Such a call centre would be there 24/7, would always be professional and polite, would pick the phone up on the first ringtone, but nobody did it. There are still hundreds of thousands of people working in call centres. Which proves, not even what already is technically possible gets also done. Agentic AI isn't possible, not today, not tomorrow, and we do not really know when it will actually be able to take over. What we have to today hangs itself in loops or just does nothing. You can just hand over a project to an agentic AI and expect it to be completed, it simply doesn’t work. Maybe this will change in the next 2 years, I seriously doubt it, because AI isn't actually intelligent. AI is useful as a tool for humans in specialised, narrow environments. Everything else would need true AI, and we don't even know how to create that.
youtube 2025-09-09T13:2…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       unclear
Policy          none
Emotion         indifference
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_UgwX7SMFpYfklwRRAWR4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugz3yC2KPn0-2pkP40N4AaABAg", "responsibility": "government", "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgyeIJisHmxl0dkmDsB4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgyKGKCLoXfiwJaRs914AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwUIzU-ir2jV7uqHR14AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_Ugz771wicSv1ar7ezWB4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_UgyYN70qdCZvo-2s4JB4AaABAg", "responsibility": "company", "reasoning": "virtue", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgyDfE13YRSXwrJduvx4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgwNm2UA3-BWnEd0DON4AaABAg", "responsibility": "developer", "reasoning": "unclear", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugza7AN6M1UtOB6xsBt4AaABAg", "responsibility": "distributed", "reasoning": "mixed", "policy": "regulate", "emotion": "mixed"}
]
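The raw response above is a JSON array with one object per coded comment, keyed by a `ytc_…` id and carrying the four coding dimensions (`responsibility`, `reasoning`, `policy`, `emotion`). A minimal sketch of looking up one comment's codes from such a response (the `codes_for` helper is hypothetical, not part of the tool; only the field names come from the output above):

```python
import json

# Excerpt of a raw LLM response in the format shown above:
# a JSON array of per-comment coding objects.
raw_response = """[
  {"id": "ytc_UgyKGKCLoXfiwJaRs914AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugz3yC2KPn0-2pkP40N4AaABAg", "responsibility": "government",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"}
]"""

def codes_for(raw_json, comment_id):
    """Return the coding object for one comment id, or None if absent."""
    for entry in json.loads(raw_json):
        if entry.get("id") == comment_id:
            return entry
    return None

codes = codes_for(raw_response, "ytc_UgyKGKCLoXfiwJaRs914AaABAg")
print(codes["emotion"])  # indifference
```

This matches the coding table above: the entry for this comment carries `responsibility: none`, `reasoning: unclear`, `policy: none`, and `emotion: indifference`.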