Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- `ytc_Ugz1oCmyU…`: Ai stans that berates artists is like a costumer saying we don't need farmers be…
- `ytc_UgwJcIpna…`: I know that they’re robots because my stepdad showed us a video of people sittin…
- `ytc_Ugx9m4E8R…`: So since the update ChatGPT became more centrist and less liberal… only proving …
- `rdc_o56e40l`: Headline should read "Amazon's PR team working overtime, bypassing LLMs for deci…
- `ytc_UgzbLl9z8…`: 15 years ago when google search became powerful enough to show snippets of code …
- `ytc_UgyHFYV4e…`: If AGI and ASI are possible we have already lost, it's just a question of when.…
- `ytc_UgwdUpw8j…`: Dr. Joy is a cutie... anyway this shyt is crazy soft, bad tech but prolly if hil…
- `ytc_UgxBlBV6S…`: copilot is great for learning, but like the rest of the AIs, coding is miss more…
Comment
Our challenge is in how we treat each other. If we (all nations) do not stop our bad behavior, AI is not going to make anything better or worse. The fact that we will likely ditch workers given the chance (which we have seen time and again) is already wired into our thinking. We must change our thinking if want to change ourselves and uplift everyone. The premise that everything is about the money is a bad premise.
The meaning of life is each other. Life has spent 4 billion years in evolution to bring us to that. We must take hold of that and uplift it.
youtube · AI Jobs · 2025-11-06T00:4… · ♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | virtue |
| Policy | regulate |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwOADTurrzzXbedMGh4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgwfRonu4RgcPejtkfl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugwze-5WRblj2Vq86xJ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugxx-VX5kLVB0epWp0N4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgzFyTZYsCWoU9amDud4AaABAg","responsibility":"government","reasoning":"virtue","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_UgzxB7bQVuWeXT75Mbt4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgywGHamTeqeUGVY3TR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgztCstDZd8qJQL1WYN4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_UgzyPCQ-oQC2uvbhb0x4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"liability","emotion":"mixed"},
  {"id":"ytc_UgxzpefQ5ENrJVva6eN4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"outrage"}
]
```
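The raw response above is a JSON array of per-comment codings, one object per comment ID. Looking up a coding by ID then reduces to parsing the array and indexing it. Here is a minimal sketch, assuming the response is valid JSON with exactly the five fields shown above; `index_codings` and `REQUIRED_FIELDS` are hypothetical names, not part of the tool.

```python
import json

# Two rows copied from the raw response above, standing in for a full batch.
raw = '''
[
  {"id":"ytc_UgwOADTurrzzXbedMGh4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgztCstDZd8qJQL1WYN4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"regulate","emotion":"mixed"}
]
'''

# Field names taken from the response schema shown above.
REQUIRED_FIELDS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def index_codings(raw_json: str) -> dict:
    """Parse a raw LLM response and index codings by comment ID.

    Rows missing a required field are skipped rather than raising,
    since model output is not guaranteed to be well-formed.
    """
    by_id = {}
    for row in json.loads(raw_json):
        if REQUIRED_FIELDS <= row.keys():
            by_id[row["id"]] = row
    return by_id

codings = index_codings(raw)
print(codings["ytc_UgztCstDZd8qJQL1WYN4AaABAg"]["policy"])  # regulate
```

The second row is the coding of the comment shown above ("distributed" responsibility, "virtue" reasoning, "regulate" policy, "mixed" emotion), matching the Coding Result table.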