Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
Most academic professions as well as factory work etc will be dissolved by AI...…
ytc_UgzF1ZVfu…
@mr.frandy7692 I think ignoring inflection points in the cost of fuel, which we…
ytr_UgwH5YM9u…
Ask the most advanced AI to solve the problem of its own carbon footprint: BOOM!…
ytc_UgxQKYaKb…
If the majority of the people in the know of developing a technology all agree t…
ytc_Ugwlx12ur…
More and more doctors are turning into these automatons that are just there to t…
ytr_Ugy0pivi3…
AI will change and control your perception. Now imagine its potential and dange…
ytc_UgxgVtKmN…
Respectfully, the premise of this video is invalid. AI is a tool, just like a ca…
ytc_UgxgZCKGD…
At first I was like"FUCK AI". But as soon as I dig deeper into her past tweet, I…
ytc_UgxEYc-nf…
Comment
Well, to say the least, she is throwing all these buzzwords, and I know these tech giants personally from uni stuff, leaving me with even more confusion about tech. In the end, leaving the billion-dollar question, how can people protect their work from being exploited for the sake of training these LLMs, the very fundamental of copyright laws of fair use? All I hear is propaganda, democracy, climate change, and business problems blah blah. So all I want to know is how can we leverage technology to better the economy and increase our livelihood instead of pushing cheap LLMs trained upon real creative work made by humans? How does the Law apply to that when talking about human exploitation in AI?
youtube
Cross-Cultural
2025-07-02T12:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[{"id":"ytc_UgzGR43tNxHCGY3gWPN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgzuEczQMZuoLUrHf054AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwlCqoK8rw7vmErmt14AaABAg","responsibility":"company","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzcTO-sL56MkYu0UHd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgwaJ1FVwjbLswN1t994AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugx11HRVXE9adS6cCKZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwzGqqWbFAmj6lFZj14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwZDyNeiNlhI4zYFpB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyEN1WEf2LO2dWB3Wl4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgysydmiYeXhuzL_zT54AaABAg","responsibility":"company","reasoning":"mixed","policy":"liability","emotion":"mixed"}]
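A minimal sketch of how a raw response like the one above might be parsed and validated before it is used to fill the coding-result table. The allowed label sets below are inferred only from the values visible in this sample (the full codebook may define more), and the record shown is a shortened stand-in, so treat this as an illustrative assumption rather than the tool's actual pipeline:

```python
import json

# Allowed labels per dimension, inferred from this page's sample output
# (an assumption -- the real codebook may contain additional labels).
ALLOWED = {
    "responsibility": {"none", "company", "unclear"},
    "reasoning": {"unclear", "consequentialist", "deontological", "mixed"},
    "policy": {"none", "liability", "unclear"},
    "emotion": {"approval", "outrage", "indifference", "mixed", "fear", "unclear"},
}

def parse_raw_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response and validate every record."""
    records = json.loads(raw)
    for rec in records:
        # Every record must carry an id plus all four coded dimensions.
        missing = {"id", *ALLOWED} - rec.keys()
        if missing:
            raise ValueError(f"record {rec.get('id')} missing {missing}")
        # Each dimension value must come from the known label set.
        for dim, allowed in ALLOWED.items():
            if rec[dim] not in allowed:
                raise ValueError(f"{rec['id']}: unexpected {dim}={rec[dim]!r}")
    return records

# Example with one record copied from the raw response above.
raw = ('[{"id":"ytc_UgzGR43tNxHCGY3gWPN4AaABAg","responsibility":"none",'
       '"reasoning":"unclear","policy":"none","emotion":"approval"}]')
coded = parse_raw_response(raw)
print(coded[0]["emotion"])  # approval
```

A check like this would also have caught the stray `)` that originally terminated the response where `]` was expected, since `json.loads` raises `json.JSONDecodeError` on malformed input.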