Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "RIGHT?? why is nobody talking about that part? a robot predicted that a random g…" (ytr_UgwfakyCT…)
- "Truckers voted 97% for trump. He was backed by elon, routing driverless tech, el…" (ytc_Ugw4JkrIP…)
- "That's an interesting perspective! The balance between human creativity and AI's…" (ytr_UgyEyl-pL…)
- "What a lots of utter nonsense, if that granpa is a father of AI,then we really h…" (ytc_Ugzpi_qpk…)
- "These discussions on AI often get me to thinking about ants. We are so many magn…" (ytc_UgyW4j6kZ…)
- "You are so right, i am worried about ai, i am not an artist, i got interested in…" (ytc_UgwFxJrx7…)
- "I think copilot in vs code or GitHub copilot does a fairly good job of the same …" (ytr_UgwohJrk_…)
- "I swear we live in the worst timeline. AND this is before we doom ourselves with…" (ytc_UgzLNZEIf…)
Comment
Working at tech companies you are forced to basically train AI by using it as much as possible in your own setting. Maybe if the Data inputs are paused by the outright refusal to use? Yes, you lose your job faster, but in the end your children have a better world or just have a chance. Don't aide in anyway their abilities? Such a large problem to get people to stop destroying themselves. You see it in the everyday choices people make. Good luck.
Source: youtube · Topic: AI Governance · Posted: 2025-09-04T14:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | virtue |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id": "ytc_Ugw2TTC0bU45v-y6K6F4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugz8-dk0k9v4RrdePHx4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugy7T1tHqMdt7PV6baV4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgwShwnhD1kJu-qA9zh4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgwRcy_haqMhxlYIyeJ4AaABAg", "responsibility": "user", "reasoning": "virtue", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgwpCxRGn9hKYw2EgbB4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_Ugz1JXKYbcbAkD5CcsZ4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgzSfB5cSFNiq9GKNZJ4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgzdGHhIxX0P109yhx54AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgyPXgclO0cMNpuGb7l4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"}
]
```
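The raw response is a JSON array of per-comment codes, so looking up the codes for a given comment ID is a simple parse-and-index step. A minimal sketch (field names and IDs are taken from the response above; the function name and the presence-check logic are illustrative assumptions, not the tool's actual implementation):

```python
import json

# A two-record subset of the raw batch response above (same schema, IDs verbatim).
raw_response = """[
  {"id": "ytc_UgwRcy_haqMhxlYIyeJ4AaABAg", "responsibility": "user",
   "reasoning": "virtue", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgwpCxRGn9hKYw2EgbB4AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"}
]"""

# The four coding dimensions shown in the Coding Result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_by_id(response_text: str) -> dict:
    """Parse a batch coding response and key each record by its comment ID,
    checking that every expected dimension is present in every record."""
    records = json.loads(response_text)
    for record in records:
        missing = [dim for dim in DIMENSIONS if dim not in record]
        if missing:
            raise ValueError(f"record {record.get('id')} is missing {missing}")
    return {record["id"]: record for record in records}

codes = index_by_id(raw_response)
print(codes["ytc_UgwRcy_haqMhxlYIyeJ4AaABAg"]["emotion"])  # prints: outrage
```

Indexing once up front keeps each "look up by comment ID" an O(1) dictionary access, and the dimension check surfaces malformed model output at parse time rather than during later analysis.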