Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "AI and robots may be good for very large quantities of whatever, but I want to s…" (`ytc_Ugx-xwdeZ…`)
- "Specifically for the policing algorithm the “racism” came from statistics it fla…" (`ytc_UgxaCFC7g…`)
- "We are not doomed…. They cant create anything without using the work of real art…" (`ytc_UgzvuMmPV…`)
- "> *AI art should be made illegal* No way in hell you could do that - and IMO thi…" (`ytr_UgwC67Y0-…`)
- "Are we in danger? Yes Can we be killed? Yes List the potential dangers: War S…" (`ytc_UgwYmvsvf…`)
- "This is not a good idea, giving a robot a gun. What are we in Westworld?…" (`ytc_Ugzn3ce7n…`)
- "Well thankfully in a job environment there is so much bs that needs to be done. …" (`ytc_UgzlhwNxh…`)
- "@TheGAMEKID98 Thanks for stopping by! I appreciate your input, but if I turn it …" (`ytr_UgwgxilSe…`)
Comment

> why don’t these companies understand that if you employ AI instead of people, there will be no one that can afford the very thing these companies use AI to create. The entire concept of society and economy doesn’t work. I saw Eric Schmidt do an interview and one of the positives of AI is that it can greatly improve your output, but easily double it. But no one asks WHY DO WE NEED THAT?? The economy performed well many times over without AI so why do we even need it.

youtube · AI Governance · 2026-01-04T23:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgwLxOYyrlE55Za7U6N4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwQYNtNhRl6BE8iYqx4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxnUtLYjHfV0yG3NoR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugx0Zf9bR-GNO0BNAFh4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugw3lg78Tt1X-QZy7Q94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwwLo7sthu1keu_1DZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugym-2u-BSekngylslN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugz2W7HKvlO0KzWHz0V4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzbDNGNmK0rpZNHCTB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyVZRWsiV3bebFLYbN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
```
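The lookup-by-comment-ID view above can be sketched in a few lines: parse the raw batch response (a JSON array of per-comment records, as shown) and index the records by their `id` field. This is a minimal illustration, not the tool's actual implementation; the function name `index_codes` is hypothetical, and the two records below are copied from the response above.

```python
import json

# A small excerpt of the raw LLM response shown above: a JSON array in which
# each record carries a comment ID plus the four coded dimensions.
raw_response = """
[
  {"id": "ytc_Ugw3lg78Tt1X-QZy7Q94AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgwLxOYyrlE55Za7U6N4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "approval"}
]
"""

def index_codes(response_text: str) -> dict:
    """Parse a batch coding response and key each record by its comment ID."""
    records = json.loads(response_text)
    return {record["id"]: record for record in records}

codes = index_codes(raw_response)
# Looking up the comment displayed above by its ID recovers its coded values.
print(codes["ytc_Ugw3lg78Tt1X-QZy7Q94AaABAg"]["emotion"])  # outrage
```

Indexing by ID is what lets the inspector match a displayed comment (and its Coding Result table) to the exact line of raw model output that produced it.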