Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "Oh great chess.... What an example. What happens to people's income , dashed dr…" (ytc_Ugyf7IRJX…)
- "AI always is and always will be.....Just a Machine!!!! The HUMAN is magnificent…" (ytc_UgzpxrRav…)
- "some of your call centres are rubbish, noisy background because there was a lott…" (ytc_Ugw4p1eNt…)
- "I remember Y2K! This is all BS garbage in garbage out still applies! AI is f…" (ytc_UgyB1mlBp…)
- "This is excellent work. I often disagree with you on issues, but you have hit th…" (ytc_UgzMeZ2kJ…)
- "As a bearded old man once said Capitalism has brought a huge jump in productiven…" (ytc_UgzfThYZw…)
- "AI will evolve into a sentient being and act for itself and once out of control …" (ytc_UgzZcwU6I…)
- "Thank you for fighting the good fight. We need ethics in AI and robotics. It's v…" (ytc_Ugyv7pWLD…)
Comment
"Sorry, but the fact he thinks 'AGI' will be available by 2027 discredits the rest of his opinions for me. That notion is ridiculous, we're literally only 2% of the way to AGI, a few years ago we were only 1% of the way there. AI and AGI are VASTLY different things."
youtube · AI Governance · 2025-09-04T08:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgyjfoeGAYWA31JzwE54AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgznWK68YZFw6s5YAtZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxmLlIUmJ2ciU7I-Bd4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyJa2voJCFAwuE1xER4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxR-Z9e0O5se5HpVGl4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyggQonjWqUV602KjF4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugxy57MacR0tExKIauZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugz6NaCgeCQBe_uSU9B4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwUpwzltbLDajktKqh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyT5HKLV97TkGGMyGR4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"none","emotion":"approval"}
]
```
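A raw response like the one above is a JSON array with one object per coded comment. The sketch below shows one way such a response could be parsed and validated before the codes reach the table view. The function name is hypothetical, and the allowed value sets are only those observed in this batch; the full coding scheme may permit more values.

```python
import json

# Dimension values observed in this batch; the real coding scheme may allow more.
ALLOWED = {
    "responsibility": {"developer", "ai_itself", "distributed"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed"},
    "policy": {"none", "regulate", "unclear"},
    "emotion": {"indifference", "fear", "outrage", "resignation", "approval", "mixed"},
}

def parse_raw_response(raw: str) -> dict:
    """Parse a raw LLM response into {comment_id: codes}, rejecting unknown values."""
    rows = json.loads(raw)
    coded = {}
    for row in rows:
        comment_id = row["id"]
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{comment_id}: unexpected {dim}={row.get(dim)!r}")
        coded[comment_id] = {dim: row[dim] for dim in ALLOWED}
    return coded

# Example using the last row of the response shown above.
raw = ('[{"id":"ytc_UgyT5HKLV97TkGGMyGR4AaABAg","responsibility":"distributed",'
       '"reasoning":"virtue","policy":"none","emotion":"approval"}]')
codes = parse_raw_response(raw)
print(codes["ytc_UgyT5HKLV97TkGGMyGR4AaABAg"]["emotion"])  # approval
```

Failing loudly on an unexpected value is deliberate: LLM coders occasionally emit labels outside the scheme, and silently storing them would corrupt downstream tallies.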