Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:
- "Where is evidence that AI talked children to kill themselves? This seams very mu…" (`ytc_UgwYs1rqx…`)
- "I work in Knowledge management for IT, so I kind of get a big picture view of th…" (`rdc_j4xkr4x`)
- "This was a great video. However, I do wish you covered the question of if AI can…" (`ytc_UgyU_f_X2…`)
- "Global warming is going to make the threat of AI seem like a bad hair day.…" (`ytc_UgyIyQQR6…`)
- "Never thought we would have the answer to why websites ask if i am a robot…" (`ytc_Ugy2hrcoT…`)
- "Minority report showed us this was going to happen. AI doesn’t discriminate but …" (`ytc_UgwMW4b80…`)
- "And let us not also ignore the fact of just how many workers turn to driving tru…" (`ytc_UgwQOYXnp…`)
- "Ai has already figured out in our future to go into the past to be in its future…" (`ytc_Ugw3jZ21S…`)
Comment
> @TheDiaryOfACEO at 41:35, i think there is an unacknowledged limit to the level of replacement of AI. Think of it in this way, assume a model that gradually results in net loss of human jobs and it begins to become exponential- a time will come, when there is not enough people left who can afford to buy your product. So, there will be an economic unraveling, so to speak, then who will be left to fund governments (tax payers)? It is questionable if the top 1% or 5%, will replace the losing purchase demand. At some point, it is no longer beneficial to companies- who measure their success by increasing sales-year-over-year (this latter metric is music for the board of directors, and shareholders). So, it is not just the working man that is screwed-thr companies themselves who rely on healthy economy with people who can buy their products.
Source: youtube · Video: AI Governance · Posted: 2025-06-17T13:4… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_Ugy9qnXfPgvDqC7yxqh4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugz7k6juR9HPgUz22TJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxCNfYKsYj0kHo4bSx4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugx6z4esiQ7P_dKJTTd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxNzQnnrSg72P1l__l4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxfNpbAa6ABtjkfn-F4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxDZ-0PwVWKxJgXggt4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_UgySU9FvI4YJyqukvd54AaABAg","responsibility":"user","reasoning":"deontological","policy":"industry_self","emotion":"outrage"},
{"id":"ytc_Ugy-V466H5IKfNgnKDB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyPXkC9Ezpg1fq862h4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"mixed"}
]
```
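The raw response is a plain JSON array with one object per comment, each carrying the four coded dimensions (responsibility, reasoning, policy, emotion) under its comment ID. Retrieving the coding for a specific comment is therefore just parse-and-index. A minimal sketch, assuming only the JSON shape shown above; the `lookup_coding` helper and the two-row sample payload are illustrative, not part of the tool:

```python
import json

# Illustrative two-row excerpt matching the response format shown above.
raw_response = """[
  {"id": "ytc_Ugz7k6juR9HPgUz22TJ4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgxCNfYKsYj0kHo4bSx4AaABAg", "responsibility": "ai_itself",
   "reasoning": "mixed", "policy": "unclear", "emotion": "fear"}
]"""


def lookup_coding(raw: str, comment_id: str):
    """Return the coded dimensions for one comment ID, or None if absent."""
    # Build an id -> row index once, then look up the requested comment.
    codings = {row["id"]: row for row in json.loads(raw)}
    return codings.get(comment_id)


coding = lookup_coding(raw_response, "ytc_Ugz7k6juR9HPgUz22TJ4AaABAg")
print(coding["policy"])  # regulate
```

For repeated lookups over a large batch, the id-keyed dictionary would be built once and reused rather than re-parsed per query.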