Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- ytc_Ugw8rJzkJ…: "AGI and ASI will not happen. LLMs are non aware, it’s clever coding at best.…"
- ytc_UgypEYyVe…: "Why was this technology created in the first place ? Motherfuckers keep CL bring…"
- ytc_UgzTyzqSS…: "Yea id never use a chat bot for therapy or real life secrets. I dont need my dat…"
- ytc_UgyFF4K4b…: "Simple terms AI is the farmer and humans are the cows soon we will eat when AI s…"
- ytc_UgyyErCGN…: "Yea but ai doesn’t tell me how brain dead i am and that it learned how to do wha…"
- ytc_UgyxTm0Jj…: "The amount of people who don't know what AI actually means, and what LLM means, …"
- rdc_kr8eiwg: "If the algorithms are ‘learning’ from Reddit, they may be artificial but they ce…"
- ytc_Ugw9zX7m2…: "AI itself isn't the problem, just like many technological advances aren't. It sh…"
Comment

> I hope it just creates a virus that targets humans. the rest of the life on the planet shouldn't pay for our mistakes. It would be interesting to see what AI would do after the humans are all gone. Would it seek to spread and expand into the universe? surely it could think of faster ways to travel through space. It could spread across the galaxy.

youtube · AI Governance · 2023-11-01T23:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response

```json
[{"id":"ytc_UgwYKBoDSd4p8ZJOjrh4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugzwgeq0KwDMSGu6cCp4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgyoxbUVQsAxFc6lwX54AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"liability","emotion":"indifference"},
{"id":"ytc_UgxIVLlhw9uqLjXkp1B4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugx9w45SoFf3n3U7RfF4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyC_-t2-xuhxV0dxBd4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgzItbOK-sVY4l6zW454AaABAg","responsibility":"company","reasoning":"deontological","policy":"industry_self","emotion":"outrage"},
{"id":"ytc_UgxIFQtC8xUhaCWqQLt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugx-LTiM62ZAqa0sjVl4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
{"id":"ytc_UgzYXbhUFhEMA-jJeUJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}]
```
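The raw response is a JSON array of records, one per coded comment, each carrying the four dimensions shown in the coding table (responsibility, reasoning, policy, emotion). A minimal sketch of how such a payload could be parsed and checked for completeness follows; the field names come from the response above, while the `parse_codings` helper itself is hypothetical, not part of the tool:

```python
import json

# Two records copied verbatim from the raw response above (abridged for the sketch).
raw = '''[{"id":"ytc_UgxIFQtC8xUhaCWqQLt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzYXbhUFhEMA-jJeUJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}]'''

# The four coding dimensions seen in the response; any further allowed values
# beyond those in the examples are unknown here.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def parse_codings(payload):
    """Parse a batch-coding response into {comment_id: {dimension: value}}.

    Raises ValueError if a record lacks its id or any dimension, so a
    malformed model response fails loudly instead of coding silently.
    """
    out = {}
    for rec in json.loads(payload):
        missing = [f for f in ("id", *DIMENSIONS) if f not in rec]
        if missing:
            raise ValueError(f"record missing fields: {missing}")
        out[rec["id"]] = {d: rec[d] for d in DIMENSIONS}
    return out

codings = parse_codings(raw)
print(codings["ytc_UgxIFQtC8xUhaCWqQLt4AaABAg"]["emotion"])  # prints "mixed"
```

Keying the result by comment ID is what makes the "Look up by comment ID" view above cheap: the coded dimensions for any comment are a single dictionary access.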