Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- ytr_UgyYlJxCG…: "Well without AI the economy will eventually recover. With AI, you're just count…"
- rdc_ohtsab6: "You can't run a massive llm in a closed system, it would require a datacenter on…"
- ytc_UgzPVZag2…: "The problem is not AI, the problem is hidden in the nuclear weapons hidden in th…"
- ytc_UgxlAz64_…: "It’s not just ai that’s the problem- what we CURRENTLY have is also causing huge…"
- ytc_UgyKwejlb…: "So Overall Self Driving Cars are bad, better mass transit is the actual solution…"
- ytc_UgzfeyOhl…: "I suggest we stop calling LLMs \"AI\", because they aren't. Many non-technical peo…"
- ytc_UgyP2nTH3…: "This feels so much like „I Robot“ and I am turning into Spooner, buying CDs, abo…"
- ytc_Ugy0yn94m…: "I use AI all the time. It's insane how good it is. It's a force multiplier for…"
Comment
> This video should be made at least 1 year ago. Every topic (without specific releases like GPT-5), every problem, AI Slop, AI excuse to fire people, AI bubble, LLM as dead end...
> All of this was known last year and today it's just in mainstream media.
> All AI gurus lost their credentials between 2024 and 2025 especially with those insane number of billions that they will invest in to AI...
> AI that still can't bring revenue on acceptable level to compensate costs of use but market still invest in to unproven technology.
> In 2023 a lot of experts said that cost are much higher than revenue and in 2024 I think only media didn't knew it's a growing bubble.
> We should stop taking about "what AI will be able to do", we should start talking what can't, how much does it cost in total and triple check every AI guru word.
youtube · AI Responsibility · 2025-09-30T14:1… · ♥ 62
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | mixed |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgyzT_NQe0bhTQkvlvJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugz0tn6IyhPkq8H0wPt4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugy7Fp87szwgyRAkAT14AaABAg","responsibility":"company","reasoning":"deontological","policy":"industry_self","emotion":"resignation"},
{"id":"ytc_Ugx8ne7XJR5ienUB_Zd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugyv9NSVyokCGONd4Ll4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwWq1juerZS8PnNulp4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"approval"},
{"id":"ytc_Ugz4fM64RHADv8u-Rd94AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_Ugzy3WPcZp2oWgx95pp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgytCwYRlrKREc0DGgF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxwE2WEEY16fcdlCJt4AaABAg","responsibility":"company","reasoning":"mixed","policy":"regulate","emotion":"outrage"}
]
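The raw response above can be parsed and sanity-checked before the codes are stored. Below is a minimal sketch in Python; the field names come from the JSON above, but the allowed value sets are only inferred from the responses shown on this page (an assumption, not the tool's authoritative schema), and the truncated `raw` payload here is a two-record excerpt for illustration.

```python
import json

# Excerpt of the raw LLM response shown above (first and last records).
raw = """
[
  {"id": "ytc_UgyzT_NQe0bhTQkvlvJ4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgxwE2WEEY16fcdlCJt4AaABAg", "responsibility": "company",
   "reasoning": "mixed", "policy": "regulate", "emotion": "outrage"}
]
"""

# Allowed values per dimension, inferred from the outputs on this page --
# an assumption, not a definitive codebook.
ALLOWED = {
    "responsibility": {"none", "company", "developer", "ai_itself"},
    "reasoning": {"unclear", "consequentialist", "deontological", "mixed"},
    "policy": {"unclear", "none", "industry_self", "liability", "regulate"},
    "emotion": {"indifference", "fear", "resignation", "mixed",
                "approval", "outrage"},
}


def validate(records):
    """Split records into (valid, errors) by checking each coded dimension."""
    valid, errors = [], []
    for rec in records:
        bad = [dim for dim, ok in ALLOWED.items() if rec.get(dim) not in ok]
        if bad:
            errors.append((rec.get("id"), bad))
        else:
            valid.append(rec)
    return valid, errors


records = json.loads(raw)
by_id = {rec["id"]: rec for rec in records}  # supports lookup by comment ID
valid, errors = validate(records)
print(by_id["ytc_UgxwE2WEEY16fcdlCJt4AaABAg"]["emotion"])  # -> outrage
```

Keying the records by `id` mirrors the "Look up by comment ID" feature above: once the LLM output parses and validates, any coding can be retrieved directly from `by_id`.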