Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
| Comment ID | Preview |
|---|---|
| ytr_UgzxIdfHi… | does a powertool nail it for you without touching it and do something u didnt wa… |
| rdc_n8yjjp0 | >What are we all even gonna do? Honestly, it's the end game right now. We're… |
| ytc_UgyJZmoq5… | Hey Chris! What do you think of Roman Yampolskiys prediction of the future of A… |
| ytc_Ugy1VwgTS… | These LLM's cant even make an image of a clock displaying a time other than 10:1… |
| ytc_Ugy4Yj3xb… | All my homies were talking shit to their AI. Theyll be the first to go in the ro… |
| ytc_UgyjJiOmd… | I know this is left field, but this discussion supports the idea that Aliens 👽 a… |
| ytc_Ugx5JnSnK… | Comparison to cars being driven manually vs self driving. I think self driving c… |
| ytc_UgwOGfBQH… | I guess you just don't like Elon. Whilst he has promised self driving, Self dr… |
Comment
I do not think that it will be possible to stop what happens in the Future. We could have regulations in the UK, America and the EU, for example, but in other parts of the world, they may not follow the same guidelines. If AI took all of our jobs because of the greed of individuals using AI robots for business and the greed of people selling the robots for driving cars, lorries, aeroplanes, delivery drones or bots, amongst many more jobs, who would pay for the person who has no job? If AI is aimed for use by people, but the people cannot find a job, then who pays for the technology?
youtube · AI Governance · 2025-06-19T22:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
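Each coded comment carries one categorical label per dimension. Below is a minimal sketch of how such a record could be represented and validated in Python; the class, field names, and label sets are assumptions inferred from the values visible on this page, not the pipeline's actual code.

```python
from dataclasses import dataclass

# Label sets observed on this page (assumed, not the pipeline's full schema).
RESPONSIBILITY = {"company", "developer", "ai_itself", "none"}
REASONING = {"consequentialist", "deontological", "virtue", "unclear"}
POLICY = {"regulate", "none", "unclear"}
EMOTION = {"fear", "outrage", "approval", "resignation", "indifference", "mixed"}

@dataclass
class CodedComment:
    comment_id: str
    responsibility: str
    reasoning: str
    policy: str
    emotion: str
    coded_at: str  # ISO 8601 timestamp, as shown in the coding result

    def __post_init__(self) -> None:
        # Reject any label outside the sets observed in the coded output.
        for value, allowed in [
            (self.responsibility, RESPONSIBILITY),
            (self.reasoning, REASONING),
            (self.policy, POLICY),
            (self.emotion, EMOTION),
        ]:
            if value not in allowed:
                raise ValueError(f"unexpected label: {value!r}")

# Presumably the row matching the coding result above (same four labels
# appear under this ID in the raw response below).
record = CodedComment(
    comment_id="ytc_UgxHDx1FQ34JIbG_o3l4AaABAg",
    responsibility="company",
    reasoning="consequentialist",
    policy="regulate",
    emotion="fear",
    coded_at="2026-04-27T06:24:59.937377",
)
```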
Raw LLM Response
[
{"id":"ytc_UgyOcOrRWPRTaamoCfV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_Ugz_6BF-qrF0d3wK-Xd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzxEv_dkZdKmKyNQdx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxonceApChfecRu5Sx4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugz-V0oVp9gOBQh-P3h4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxHDx1FQ34JIbG_o3l4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugzn9GQJVKYRowjdIwF4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugx8EDh61b0lrGVreKF4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyexkVWruXB2a5eo6h4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxrmB9BMl4FlZiW_Md4AaABAg","responsibility":"none","reasoning":"virtue","policy":"unclear","emotion":"approval"}
]
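The raw response is a JSON array with one object per coded comment, keyed by comment ID. The sketch below shows one way a lookup by comment ID could work against such an array; the function name and file path are hypothetical, and it assumes the model returned clean JSON with no surrounding text.

```python
import json
from typing import Optional

def lookup_coded_comment(raw_response: str, comment_id: str) -> Optional[dict]:
    """Parse a raw LLM batch response (a JSON array of coded comments)
    and return the object whose "id" matches comment_id, if any."""
    rows = json.loads(raw_response)
    return next((row for row in rows if row.get("id") == comment_id), None)

# Hypothetical usage: raw_response.json would hold the array shown above.
with open("raw_response.json") as f:
    row = lookup_coded_comment(f.read(), "ytc_UgxHDx1FQ34JIbG_o3l4AaABAg")
print(row)  # e.g. {"id": "ytc_Ugx...", "responsibility": "company", ...}
```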