Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- ytc_Ugw0hNrvY…: "I think that the only problem with an AI having personhood is the human interact…"
- rdc_mxhv163: "Oh my god this almost literally drove me insane. I'd somehow managed to trigger …"
- rdc_d1kwa3j: "Lanes? What the fuck are lanes? It's %90 dirt and gravel roads out where I live …"
- ytc_UgyKboeUa…: "Very good topic. I actually have a good understanding of AI and certain types of…"
- ytc_UgxmT_W18…: "I feel like AI could be good, but we as humans are using it wrong and making it …"
- ytc_UgznrPNkW…: "AI is the single greatest danger the world has ever seen is my thoughts on the m…"
- ytc_Ugzuft_sz…: "The AI isn't sentient. But it is capable of many of the same kinds of pattern re…"
- ytc_UgwF0j24b…: "Entangling with society...is the threat. It's one thing to have Ai, it's anothe…"
Comment
Okay, but how USEFUL were the answers. A human tasked with coming up with alternative uses will think out logically the process by which an object could be used.
The ai could just dump out "use the pencil to mimic a dog barking". And sure the ai is more creative, but not logical.
Source: youtube, posted 2024-06-10T01:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
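Each coded comment receives exactly one value per dimension. A minimal validation sketch for such a row, using only the category values that appear on this page (the actual codebook may define additional categories, so the sets below are illustrative, not exhaustive):

```python
# Category values observed in the codings shown on this page.
# Assumption: the real codebook may contain more values than these.
OBSERVED = {
    "responsibility": {"ai_itself", "distributed", "developer", "company", "none"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "ban", "liability"},
    "emotion": {"indifference", "resignation", "fear", "outrage"},
}

def validate(row: dict) -> list:
    """Return a list of problems with a coded row (empty if it passes)."""
    problems = []
    for dim, allowed in OBSERVED.items():
        if dim not in row:
            problems.append(f"missing dimension: {dim}")
        elif row[dim] not in allowed:
            problems.append(f"unexpected {dim} value: {row[dim]!r}")
    return problems

# The coding result shown above passes cleanly.
print(validate({"responsibility": "ai_itself", "reasoning": "consequentialist",
                "policy": "none", "emotion": "indifference"}))  # []
```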
Raw LLM Response
[
{"id":"ytc_UgxMTpB0nvgS2DgtGNR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugyx1GWreI_JcsPQZA94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxH2-4guFteJ7_amMl4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyWqTn4jcNCVQvGnsh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwHaY_kg1LJXq_06Bh4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_Ugwq2DzQ3D5u9GGVjYJ4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwFv6Rx3Scqcsh1Pr14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxtaVADTIvZum1ryBx4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgyCkxMg-YmJManS3XZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyP9Qa7JAQfNM0Bhkp4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
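The raw response is a JSON array keyed by comment ID, which is what makes the ID lookup on this page possible. A small sketch of that lookup, using two rows copied verbatim from the response above:

```python
import json

# Two rows copied verbatim from the raw LLM response above (a subset,
# for illustration only).
raw = '''[
  {"id":"ytc_UgxMTpB0nvgS2DgtGNR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxtaVADTIvZum1ryBx4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"outrage"}
]'''

# Index the batch by comment ID for O(1) lookup.
codings = {row["id"]: row for row in json.loads(raw)}

row = codings["ytc_UgxtaVADTIvZum1ryBx4AaABAg"]
print(row["responsibility"], row["policy"])  # developer ban
```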