Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- `rdc_my1acsu` — "These types of AI automation are also super unreliable. It's only a matter of ti…"
- `ytc_UgzKTlVzh…` — "Does no one remember Skynet from Terminator or Cylons from Battlestar Galactica…"
- `ytc_UgxiS9uMX…` — "About half of all job creation is from companies with 50 or fewer employees. How…"
- `ytr_UgxopQboO…` — "But you're the one who selling fear without proof.......I don't even think you c…"
- `ytc_UgwS8NJwm…` — "YouTube had the audacity to make the AD for this video be an AI text generator t…"
- `ytc_UgxYSAUQ7…` — "I have a spiritual awakening. People could hear my thoughts outside of my skull.…"
- `ytc_UgxUZ0Eal…` — "The presentation is self contradictory. On one side it says 100% yearly turnov…"
- `ytc_UgwudWH4Z…` — "Ban generative AI already. It disgusts me on every level. It is all theft. It is…"
Comment
The fearmongering around chatbots extrapolated to "IS SKYNET HAPPENING OH NOOO" is just so grating; these things aren't AI, and while the damage these algorithms are doing to real people shouldn't be understated... it also shouldn't be overstated. The real harm, right now, is more about people believing they're more capable than they actually are, not some apocalypse.
Like the actual consequences are the eating disorder chat line replacing real people with a chatbot that gave actively harmful advice to a person with an eating disorder, it ain't HAL 9000 - mostly because we're absolutely nowhere near even approaching starting to maybe think of creating actual AI.
youtube · AI Responsibility · 2023-06-10T20:1… · ♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_Ugx4v0zmTP3r6pGL4mJ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyLzes2_F4h8lJNYCZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugzz0Vije2qg1K6A8_14AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"mixed"},
{"id":"ytc_UgwU3Bj_q0sALCnjxD14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzyScwHQV-pCUF4ZSd4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugy3fQVl8YY9iOuzs054AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwXtjTBxh9KZZfIgAR4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyFXedLNOCoKnoxjWZ4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyjBHyh4HVtNyZ3KcJ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz18aDeYyjGTIx4fx54AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
```
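Before a raw batch like the one above is written into coding results, it should be checked for malformed JSON and out-of-schema labels. The sketch below is a minimal validator, assuming the allowed values are exactly those observed in this batch and in the Coding Result table; the actual codebook may define additional labels, and the function name `validate_batch` is hypothetical, not part of the tool.

```python
import json

# Dimension values observed in this batch (assumption: the real codebook
# may allow more labels than appear here).
SCHEMA = {
    "responsibility": {"distributed", "none", "developer", "user", "company", "ai_itself"},
    "reasoning": {"consequentialist", "unclear", "deontological", "mixed"},
    "policy": {"none", "industry_self", "liability", "unclear"},
    "emotion": {"indifference", "approval", "mixed", "outrage", "fear"},
}

def validate_batch(raw: str) -> list[str]:
    """Return a list of problems found in a raw LLM response; empty means OK."""
    try:
        records = json.loads(raw)
    except json.JSONDecodeError as e:
        return [f"not valid JSON: {e}"]
    if not isinstance(records, list):
        return ["top-level value is not a JSON array"]
    problems = []
    for i, rec in enumerate(records):
        if "id" not in rec:
            problems.append(f"record {i}: missing 'id'")
            continue
        for dim, allowed in SCHEMA.items():
            value = rec.get(dim)
            if value not in allowed:
                problems.append(f"{rec['id']}: bad {dim!r} value {value!r}")
    return problems

# A well-formed record (the first one from the batch above) passes cleanly.
ok = ('[{"id":"ytc_Ugx4v0zmTP3r6pGL4mJ4AaABAg","responsibility":"distributed",'
      '"reasoning":"consequentialist","policy":"none","emotion":"indifference"}]')
print(validate_batch(ok))  # → []
```

A batch that fails validation can then be re-prompted rather than silently coded with whatever labels the model invented.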