Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (click to inspect)
- "Yikes, can we just stop making more advanced AI's? Seriously do really need anyt…" (ytc_UgziR6FR8…)
- "What is the ultimate goal of AI at last ? If yu watch the movie terminator 3 th…" (ytc_UgwJ1EDau…)
- "Calling yourself an ai """"""""artist"""""""""" is like seeing someone walk and …" (ytc_UgysoChRp…)
- "Rick is old, a.i is the future. People are ran through. A.i mimics the most gene…" (ytc_UgwdvA1tu…)
- "The most terrifying thing about this interview is the realization of how many of…" (ytc_Ugy7L7hl5…)
- "With YouTube cracking down on AI stuff, I've been using AICarma to keep my conte…" (ytc_UgxG_NO0G…)
- ""God's plan" is lunatic speak for I don't have the will, intelligence, skill, un…" (rdc_d0gn446)
- "Let me explain this as someone that actually works in the industry. People don'…" (rdc_mz1j4lv)
Comment
the arguments are childish, full of omissions. The statement cannot be taken seriously. Who deals with the information that comes to AI? AI assistant, yes I suppose it will be 99%, especially in public service, service provision, accounting, secretarial, reception .. But without a human decision-making factor whether in business, real estate, public services, services, hotelier, (etc) AI could only function in the form of assistance. Stupid intelligent people and their statements. Such a transition, probably 50 years .. Assistant, yes, in this position you eliminate part of human errors. AI assistant at a police traffic stop, would be really necessary for example ..;)))
| Field | Value |
|---|---|
| Platform | youtube |
| Topic | AI Governance |
| Posted | 2025-09-04T12:5… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | liability |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyzprI5x8HKSK8HCrB4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgwwJw_1rF8VNtEQ0GN4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"indifference"},
  {"id":"ytc_UgyBO0dIRJntGZvmWXt4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugzx3KCCx0CNIBGxsfF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugx83URQ9JOQWPxDDN54AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"industry_self","emotion":"resignation"},
  {"id":"ytc_Ugz-f-JPHf-Ivpi-mg54AaABAg","responsibility":"creator","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgzrQ3ePkyv0MWeQsRZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgxZrlnxAh3c7mOxtKB4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ugz7GzHohk45ymOloRR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"resignation"},
  {"id":"ytc_UgxUH9j2eyzM2Lu2zAh4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"mixed"}
]
```
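A raw response like the one above can be parsed and indexed by comment ID for the lookup described at the top of this panel. The sketch below is a minimal example, assuming the four dimensions shown in the Coding Result table; the allowed category values are inferred from the records visible here, and the real codebook may include more.

```python
import json

# Allowed values per dimension, inferred from the responses shown above.
# Assumption: the full codebook may define additional categories.
ALLOWED = {
    "responsibility": {"government", "developer", "user", "ai_itself",
                       "distributed", "creator", "company"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed"},
    "policy": {"regulate", "liability", "none", "ban", "industry_self", "unclear"},
    "emotion": {"outrage", "indifference", "mixed", "fear", "resignation"},
}

def index_codings(raw: str) -> dict:
    """Parse a raw LLM response (a JSON array of coded records) and index
    the codings by comment ID, skipping records that lack an ID or carry
    an unknown value in any dimension."""
    indexed = {}
    for rec in json.loads(raw):
        cid = rec.get("id")
        if not cid:
            continue
        if all(rec.get(dim) in values for dim, values in ALLOWED.items()):
            indexed[cid] = {dim: rec[dim] for dim in ALLOWED}
    return indexed

# Usage: look up one coded comment by its ID (hypothetical ID "ytc_X").
raw = ('[{"id":"ytc_X","responsibility":"developer","reasoning":"deontological",'
       '"policy":"liability","emotion":"indifference"}]')
codings = index_codings(raw)
print(codings["ytc_X"]["policy"])  # liability
```

Dropping malformed records rather than raising keeps a single bad row in the model output from blocking the rest of the batch.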