Raw LLM Responses

Inspect the exact model output for any coded comment. Look up a comment by its ID, or pick one of the random samples below.
- "The only thing I'm afraid of is that tomorrow everyone would need an AI in their…" (ytc_UgzPQR9sX…)
- "My most best, wise words. (And a thank you note for my fav yt channel noting abo…" (ytc_UgyzCq3s5…)
- "Hi thanks for broadcasting this, what a blessing / I'm concerned that regulated A…" (ytc_UgxqUHXUC…)
- "What did you hear? That humans are useless and AI will take over everything or t…" (ytc_UgykFQZGf…)
- "Get this out of the way first: Matt Calkins talks about 'training AI' The word …" (ytc_Ugwq_q5Vv…)
- "Absolutely disingenuous. The new jobs created were from profitability opportunit…" (ytc_UgxskpTOH…)
- "Why are you afraid? Evolution is a part of life; when the calculator was invented, people too …" [translated from Hindi] (ytc_Ugyb83rzc…)
- "It's worse. At least the AI in the movie didn't encourage user to end his/her li…" (ytr_Ugx5EGDPJ…)
Comment

> The line 'safety is important to us' feels exactly like 'your privacy is important to us'. That means the opposite of what you think it means.
>
> Irobot highlighted how problematic ai can be even if safety protocols are the first priority and inventions are planned. Safety is at best pushed to the least important rung or just marketing speak to sell euphoria and non-regulation to government. There are no solid constraints, protocols, or mechanisms they can easily articulate that prevent catastrophic scenarios atm. It's just 'trust us, we are watching it close and care deeply'. Total bs.

(youtube · AI Jobs · 2025-11-18T22:5…)
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response

```json
[
  {"id":"ytc_Ugzs5fgr_9Uz-gftGeV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgxrS1dtr1Du_qSc2bR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugzs_8-ZLSt8eGbwebt4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgzOUEZvnFHUpR4nMnV4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgyUiqAMRGixFIAdZht4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgwZZYjAHxYiCmWjBAN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwE2d_UiAr1ixPoK654AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgyVHxhw0_j9h_cIQm94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgzlKuYUy39yJJA6oVV4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugx_1x38pV7-4ZghGmF4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"fear"}
]
```
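Each record in the batch carries the same four coding dimensions (responsibility, reasoning, policy, emotion). A minimal Python sketch of parsing such a response, validating every record, and indexing the batch by comment ID for lookup; the allowed category sets below are inferred from the values visible in this dump, not from an official codebook:

```python
import json

# Allowed categories per coding dimension. These sets are assumptions
# inferred from the values seen in this page, not an official codebook.
SCHEMA = {
    "responsibility": {"company", "developer", "ai_itself", "none"},
    "reasoning": {"deontological", "consequentialist", "virtue", "mixed"},
    "policy": {"regulate", "liability", "industry_self", "none"},
    "emotion": {"outrage", "fear", "resignation", "approval", "indifference"},
}

def validate_batch(raw: str) -> dict:
    """Parse a raw LLM response (a JSON array of coded comments) and
    return a mapping of comment ID -> record, rejecting any record
    with a missing field or an unknown category value."""
    coded = {}
    for rec in json.loads(raw):
        cid = rec["id"]
        for dim, allowed in SCHEMA.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{cid}: bad {dim!r} value {rec.get(dim)!r}")
        coded[cid] = rec
    return coded

# One record copied from the batch above, used as a smoke test.
raw = ('[{"id":"ytc_UgzlKuYUy39yJJA6oVV4AaABAg",'
       '"responsibility":"company","reasoning":"deontological",'
       '"policy":"regulate","emotion":"outrage"}]')
coded = validate_batch(raw)
print(coded["ytc_UgzlKuYUy39yJJA6oVV4AaABAg"]["emotion"])  # outrage
```

Indexing by ID is what makes the "look up by comment ID" view cheap: one parse, then dictionary access per lookup.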