Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
The canary in the coalmine is simply that nobody seems to adress is the simple truth, the person who termed the term AI got the terminology wrong, it simply is super advanded statistical prediction, not AI(no fault to the guy who coined it, he was not a Linguist). A recent study fed simple but completly unknown(unpublished) but simple(for a mathemetician) math problems to an AI, and error was in the 90% range. For a more visual example, you can take a well documented artist like Picasso, feed it all his inspiration and data, tons of data, but it would never invent cubism on its own - without 'post-invention' data, no matter how many times it iterates.P.S. people are flawed but much more creative than most people think!
youtube · AI Jobs · 2026-03-24T18:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_Ugz8LvPqZgWNMVsUE1R4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugz9J4E4DPwym1U7vB94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzaQENGYvba-Svl4Ap4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgytCtUGea5JW-tfs754AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugz4FrIPQ9JLMvsoLYB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgxwbabK3OsC7-_xyot4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxvMzh5GGnRmF5HN514AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugzodc73cYqyeUxNk0V4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwDQA7_zwCuTl0s5XJ4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwNU0FeXqqYftFZ5FJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
```
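The look-up-by-comment-ID flow can be sketched as follows. This is a minimal illustration, assuming the raw response is always a JSON array shaped like the one above; the function name `parse_codings` and the per-dimension value sets are hypothetical, inferred only from the codes that appear on this page (the full codebook may contain more).

```python
import json

# Allowed values per dimension — assumption: inferred from the codes
# visible in the raw response above, not from the actual codebook.
DIMENSIONS = {
    "responsibility": {"none", "developer", "company", "distributed", "ai_itself"},
    "reasoning": {"unclear", "deontological", "consequentialist"},
    "policy": {"none", "regulate", "liability"},
    "emotion": {"fear", "indifference", "mixed", "outrage",
                "approval", "resignation"},
}

def parse_codings(raw: str) -> dict:
    """Parse a raw LLM response (a JSON array of coding records)
    into a dict keyed by comment ID, validating each dimension."""
    codings = {}
    for rec in json.loads(raw):
        for dim, allowed in DIMENSIONS.items():
            if rec[dim] not in allowed:
                raise ValueError(f"{rec['id']}: unknown {dim} value {rec[dim]!r}")
        codings[rec["id"]] = {dim: rec[dim] for dim in DIMENSIONS}
    return codings

# Look up one coding by comment ID (single-record example).
raw = ('[{"id":"ytc_Ugz8LvPqZgWNMVsUE1R4AaABAg","responsibility":"developer",'
       '"reasoning":"deontological","policy":"regulate","emotion":"fear"}]')
codings = parse_codings(raw)
print(codings["ytc_Ugz8LvPqZgWNMVsUE1R4AaABAg"]["emotion"])  # → fear
```

Keying the parsed records by ID makes the inspection view a constant-time dictionary lookup, and validating against the known value sets surfaces any malformed model output at parse time rather than at display time.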