Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
I fear the generation of AI though we are proud of technology but what if humañ …
ytc_UgzczRue0…
I am so glad I don't have character ai some of you guys are WILD…
ytc_Ugyv3hXfu…
It took me longer than I'd like to admit for me to realize that this site isn't …
ytc_UgwvOowoN…
Can’t put the ai back in the bottle. We will be the same as ants to humans.…
ytc_UgzpGmlFA…
AI will not consume goods and services companies make! No jobs= no monies = no d…
ytc_Ugz2vdkW0…
The robot wouldn't be armed so nobody would die. That's the point of using robot…
rdc_f8tluf4
Immediately after the "AI isn't even that easy to use" I was hit by an AI art ad…
ytc_Ugz9XkGuC…
i dont agree with the beginning when he say so it all comes down to intelligence…
ytc_UgyovZvnJ…
Comment
I think we (society, people, employees) need to start asking for more regulations. I'm starting to feel like unions is going to be the only thing that keep us away from being jobless. But only if we start the discussion and start asking for some assurances.
Ask for a regulation that any AI usage in company must not lead to lays off.
Or the companies should be paying tax for usage of AI. So they lay off people, but there will be more money in system for e.g. some universal income.
And last but not least, ask for progressive tax! The uneven distribution of wealth is bad for society. The few people with most money has very different problems, visions and opinions than most people with less wealth.
We will overcome this together ! :)
youtube
AI Governance
2025-06-17T12:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[{"id":"ytc_UgwAiGC1TXKVxyNvxGZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"},
{"id":"ytc_UgxbcyXm0zYItadye_N4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyRI6Y6WkAi_70Q1nh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwHwjplJBxe6H_PSwN4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugxox7PzZIeaD6kmIRl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugy1Ha24gD6NZVGjrOt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwBWSDQMfUL7Ckyb9p4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxBWnjvOeu5pxzdg894AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_Ugx43vFV8_0-3AfjulF4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugwe9SEfMvZAxxSVDxx4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"}]
```
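The raw response is a JSON array with one coding object per comment, keyed by comment ID. A minimal sketch of how a lookup-by-ID over such a response could work (the variable names are illustrative, and the string below copies only two entries from the response above):

```python
import json

# Two entries copied from the raw LLM response above.
raw_response = '''[
  {"id": "ytc_UgwAiGC1TXKVxyNvxGZ4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgyRI6Y6WkAi_70Q1nh4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]'''

# Index the codings by comment ID for constant-time lookup.
codings = {row["id"]: row for row in json.loads(raw_response)}

# Look up the coding for one comment and read its dimensions.
coding = codings["ytc_UgyRI6Y6WkAi_70Q1nh4AaABAg"]
print(coding["policy"])   # regulate
print(coding["emotion"])  # fear
```

In practice the model output may not be valid JSON on every call, so a real pipeline would wrap `json.loads` in error handling before indexing.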