Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
ytc_Ugykz_Kf2…: He is just afraid that AI will govern the banks and make all money equal this de…
ytc_Ugwah1O-o…: It's not that the creative jobs were "easier" to replace. It's that the white co…
ytc_Ugym8HtVa…: AI will fugure that Humans are a danger to the Planet. Mr Smith the Matrix. "You…
ytc_UgyBn5WPb…: why would you wanna bare knuckle with a real rock em' sock em' robot... 🤣🤣🤣🤣🤣…
ytc_UgwclNAKG…: All you need is to show her have the ability to make this facial expression 😮 an…
ytc_UgyfUc7mZ…: Thanks for making this video. As an automotive journalist who drives cars and te…
ytc_UgzXSm9_j…: The insanity of monkeys controlling a vehicle is astronomically higher than what…
ytc_Ugy4tvp8l…: This man seems to be happy about AI development, look at his face he is happy hi…
Comment
An unfortunate thing is that companies tend to sacrifice their customers' well-being for profit.
They say that AI agents can replace many customer-facing jobs, but it isn't really true. Most do a poor job of it, and even though the company saves on employee costs, the quality of service it offers is greatly diminished.
We're not at the point where software, even an AI agent, does a customer-service job as well as a human. I'm sure executives are thrilled, because when a machine answers a customer the issue rarely escalates up to them, since they have blocked all access to actually talking to someone. They make the change because they can, but it doesn't translate into a similar quality of service, and I think it will end up turning against them as people start to recognize that dealing with these companies no longer has much benefit.
youtube · AI Governance · 2025-06-16T19:3… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgwKx_icSnzatGnr0PV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxupzUjuMhcUMQe1Op4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugw2S8tgxorMhA5faSN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
{"id":"ytc_Ugw1m4tcRnnXtCcTUt14AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyBqEF3yuhH_Nz52bN4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzZw31lJyzKAxmt7ZZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugw8ImoefV-Q0ZXzGYx4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxrditofPN5ydzdM_x4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugx31rTPJPdijUU5hih4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgzbM5lBS6BejinStox4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}
]
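A downstream consumer has to parse this raw response, check each coding against the codebook, and index the results by comment ID before they can populate a result table like the one above. Below is a minimal sketch of that step; the `SCHEMA` sets contain only the category values observed in this sample batch (the full codebook may define more), and the function name `validate_batch` is hypothetical, not part of any tool shown here.

```python
import json

# Allowed values per dimension, as observed in this sample batch only.
SCHEMA = {
    "responsibility": {"company", "government", "ai_itself", "developer", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"regulate", "liability", "none"},
    "emotion": {"fear", "outrage", "resignation", "mixed", "indifference", "approval"},
}

def validate_batch(raw: str) -> dict:
    """Parse a raw LLM response and index the valid codings by comment ID.

    Rows with a missing ID or an out-of-vocabulary value are dropped.
    """
    coded = {}
    for row in json.loads(raw):
        cid = row.get("id")
        bad = [dim for dim, allowed in SCHEMA.items() if row.get(dim) not in allowed]
        if cid and not bad:
            coded[cid] = {dim: row[dim] for dim in SCHEMA}
    return coded

# One row from the response above, reused as a worked example.
raw = ('[{"id":"ytc_Ugw2S8tgxorMhA5faSN4AaABAg","responsibility":"company",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"resignation"}]')
batch = validate_batch(raw)
print(batch["ytc_Ugw2S8tgxorMhA5faSN4AaABAg"]["policy"])  # regulate
```

Keeping the lookup keyed by comment ID is what makes the "Look up by comment ID" style of inspection above cheap: the detail view is a single dictionary access rather than a scan of the raw response.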