Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
I love that you put the capitalism issues on the matter very clear. Self-driving cars can be dazzling at first, specially for people who live in cities where AVs aren't a reality yet. I'm studying a degree in AI engineering and the idea of autonomous cars was exciting to me, but now that I'm more critical about it, I know that what we actually need is better infrastructure for people and bikes, along with a robust public transportation system. You did a really good job picturing the hell that we'll be living in if we let tech companies do whatever they want. Great video, keep up the good work :)
Source: youtube · Posted: 2024-11-18T04:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_UgxPJF37ql1dphhOvsB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxwmFcRyMzN1LW6z_x4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxoWATWZ4N7TuKlKqx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxeFBAbExLJKCmkTz94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxBiG8ECo0sIjRxSIp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugzj-4UG8EmgGFRXw514AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugww3HIJVsyjIxNrW_t4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_UgwuSBbtcXOGktcLKnl4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwvyncvYpUDcmFMY5F4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxG2e905pr2Cf0Lk_h4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"}
]
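A minimal sketch of how this raw output could be parsed and indexed for lookup by comment ID. The field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) come from the response above; the allowed-value sets are only those observed in this sample, not necessarily the full codebook, so treat them as an assumption:

```python
import json

# Two records copied from the raw model response above.
raw = '''
[
 {"id":"ytc_UgxPJF37ql1dphhOvsB4AaABAg","responsibility":"company",
  "reasoning":"consequentialist","policy":"none","emotion":"approval"},
 {"id":"ytc_Ugzj-4UG8EmgGFRXw514AaABAg","responsibility":"company",
  "reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
'''

# Assumption: values seen in this sample; the real codebook may define more.
ALLOWED = {
    "responsibility": {"company", "user", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological"},
    "policy": {"none", "liability", "regulate"},
    "emotion": {"approval", "outrage", "indifference", "fear"},
}

def index_codes(text: str) -> dict:
    """Parse the raw JSON array and index valid records by comment ID."""
    by_id = {}
    for rec in json.loads(text):
        bad = [dim for dim, ok in ALLOWED.items() if rec.get(dim) not in ok]
        if bad:
            raise ValueError(f"{rec.get('id')}: unexpected value(s) for {bad}")
        by_id[rec["id"]] = rec
    return by_id

codes = index_codes(raw)
print(codes["ytc_UgxPJF37ql1dphhOvsB4AaABAg"]["emotion"])  # approval
```

Validating against a fixed vocabulary like this catches the most common failure mode of LLM-based coding, where the model invents a label outside the scheme.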