Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
First, I want to say to anyone who would like to research for themselves. Read t…
ytc_Ugz45iiHY…
By 2027, AI is expected to consume between 4.2 billion and 6.6 billion cubic met…
ytc_UgzxKEQ5L…
I love doing studies of other artists. I'm not a great artist by any stretch, bu…
ytc_UgwhcRGwx…
> Nightshade works by adding artifacts
Hmmm then why not just do say a low and …
ytc_UgwfAFq8G…
Nobody can even figure out an actually safe self-driving bicycle and we're gonna …
ytc_Ugy4n_fnU…
These AI grifter CEOs are begging to get Luigi'd. We need to rise up and raid th…
ytc_Ugy-InKHL…
Venture capital has been deleting jobs despite record profits because the growth…
ytc_UgyAZyfoD…
I wonder how long it will be before there are AI consumers (you know when AI bec…
ytc_UgzL5K5Mi…
Comment
Mr Sanders, your proposals are adequate. They leave out, however, the main issue at stake, and that is a constitutional reform directed to the proper development and utilization of AI within a legislated framework of individual and communal rights. This is because AI affects directly the NATURE of human identity and intellectual capacities, thus a fundamental stepping stone on which our species interacts with Mother Nature. Workers' councils within factories are OK, but first pass constraining legislation firmly establishing what you can and cannot do with AI. Stand on human responsibility as the ultimate guideline. We will then be able to continue with our undeniable ability to make a mess of things, AI or not, but whilst being unable to point the finger at a machine that supposedly is there to think for us.
youtube
AI Jobs
2025-10-31T11:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgxxvPdio-sDtcpGqNZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwPx2TpiGE1ZbFwmbx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxVnqlt9oTRrgKBkEF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxkozUGE9-K0GMD7nZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyU1KZJxTtMitvMaXZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxqBa5OA9E-UahOPBp4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugx-pc7Wk2pKHLJFlRZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgxHfMYTOJuZA1OaduR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgyYqfAap1pQSbHvYbt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgweDkh7BwBHyWDzoN54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"}
]
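The lookup flow shown above (raw model response → coded record found by comment ID) can be sketched in Python. This is a minimal sketch, not the tool's actual implementation: the `index_by_comment_id` helper is hypothetical, and only two records from the response above are reproduced for brevity.

```python
import json

# Raw LLM response as displayed above: a JSON array of coded records,
# one object per comment ID. Only two records are reproduced here.
raw_response = '''
[
  {"id":"ytc_UgxqBa5OA9E-UahOPBp4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"approval"},
  {"id":"ytc_Ugx-pc7Wk2pKHLJFlRZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"}
]
'''

def index_by_comment_id(response_text: str) -> dict:
    """Parse the model output and index the coded records by comment ID."""
    records = json.loads(response_text)
    return {rec["id"]: rec for rec in records}

# Look up one comment's coding, as the "Look up by comment ID" box does.
codes = index_by_comment_id(raw_response)
record = codes["ytc_UgxqBa5OA9E-UahOPBp4AaABAg"]
print(record["policy"])   # regulate
print(record["emotion"])  # approval
```

Indexing by ID rather than scanning the array each time keeps repeated lookups cheap and makes it easy to check that every coded ID actually appears in the batch.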