Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- ytc_UgycViPtQ…: "While I still support for the A.I, I understand your concern as an artist there.…"
- ytc_UgivtIdgV…: "This is ridiculous that something like this has been made. It is designed to tak…"
- ytc_UgwiH54Mv…: "The fundamentally wrong thought here is that it implies no one will cater to the…"
- ytc_UgyeG8DnG…: "The great thing about cybersecurity is that it is ultimately cat and mouse. You …"
- ytc_Ugz_5X5qZ…: "What the fuck is an AI artist? He just typed A bunch of words into a computer I …"
- ytc_UgznDPBYK…: "Ngl but it was a great idea for a select one of them to make it a bit easier for…"
- ytc_UgygEEW8V…: "The fact that they already know ai will surpass us and they keep pushing. Get r…"
- rdc_ohkuc9m: "I honestly enjoyed being the guinea pig. It was inevitable and I dipped myself i…"
Comment
Guy asks @0:55 - A question to the company. Why do you need everything automated?
2 of the BIGGEST reasons.
1 - It's CHEAPER. Like overall Stoooopidly cheaper. The upfront cost is a lot. But overall. Down the line. It's a HUGE cost savings.
2 - Goes back to being cheaper. Less likely to have to pay out for injuries, Benefits, WSIB in Canada (don't know what you guys have for injury in the states), if an employer doesn't have to pay for ANY of that. Why would he pay for a human? I work in I.T... As long as robots aren't building themselves. I'm safe. But until that time lol... No job is safe.
Source: youtube | Video: AI Jobs | Date: 2025-06-03T20:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | industry_self |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgxhC0ifU01RP8Jm9xF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugz0S_Dir_ZGVbqPBFN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgzYLrP91cSpGGlniEd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"},
{"id":"ytc_UgyNyb7E8MP_qTxPxMd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugwo33ZOT3njB-hljhV4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxkxLt6fO2P8xG3V8B4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgyBihI0sG4W0wStmn54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgyJbyJ4jp3ldybr44h4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_Ugzu4bDn9BW_nopBBEd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugz6lC_OnEo1fdeI8bp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"}
]
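The response above is a JSON array with one object per coded comment, carrying the same four dimensions shown in the result table (`responsibility`, `reasoning`, `policy`, `emotion`) plus the comment `id`. A minimal sketch of how such a response might be parsed and looked up by comment ID follows; `parse_codes` and the two-entry `raw_response` are illustrative assumptions, not the tool's actual code.

```python
import json

# Illustrative raw LLM response, shortened to two of the entries shown above.
raw_response = '''
[
  {"id": "ytc_UgxhC0ifU01RP8Jm9xF4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgyJbyJ4jp3ldybr44h4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "industry_self", "emotion": "indifference"}
]
'''

# The four coding dimensions visible in the Coding Result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def parse_codes(text):
    """Parse a raw LLM response into {comment_id: {dimension: value}}."""
    rows = json.loads(text)
    coded = {}
    for row in rows:
        # Skip malformed rows rather than failing the whole batch.
        if "id" not in row or any(d not in row for d in DIMENSIONS):
            continue
        coded[row["id"]] = {d: row[d] for d in DIMENSIONS}
    return coded

codes = parse_codes(raw_response)
print(codes["ytc_UgyJbyJ4jp3ldybr44h4AaABAg"]["policy"])  # industry_self
```

Keying the parsed codes by comment ID mirrors the "Look up by comment ID" workflow above: given any coded comment's ID, its four dimension values can be retrieved directly.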