# Raw LLM Responses

Inspect the exact model output for any coded comment.
## Random samples

- "Man Terminator really locked in the cultural idea that a Superintelligent AI wou…" (ytc_UgwTRvZUr…)
- "I wouldn't. Every other software engineer, including me, knows the AI isn't sent…" (ytr_Ugz6apAms…)
- "If you think the main problem with AI is its carbon footprint, you are an absolu…" (ytc_UgyGZ1YuA…)
- "All of these AI characters feel like they were created by the same joyless HR di…" (ytc_UgzHZHOsa…)
- "as an AI supporter and enjoyer, i do not approve the actions of those people.…" (ytc_UgyEeH6t6…)
- "@CollaterlieSisters , not at all, we make bets and educated guesses all the time…" (ytr_UgztqE8so…)
- "Yeah, our current system doesn't allow normal people to benefit from AI, only t…" (ytr_Ugz1CSNMZ…)
- "I'm reminded of this one story I told my buddy. How pessimism and worst case men…" (ytc_UgxNngpzv…)
## Comment

> When it said that IT didn't recommend bromide but ChatGPT did, it's correct. You were using Gemini. Almost all AI uses GPT as a base but then proprietary software checks, edits, updates, and tailors the GPT response.
> So Gemini, Copilot, etc are all unique "minds" using GPT to pull initial info.

Source: youtube | AI Harm Incident | 2026-01-21T17:1… | ♥ 1
## Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
## Raw LLM Response
[
{"id":"ytc_UgyX4QH3xYfIz8nCD8t4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgxUaTUXsgocc1q002J4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugyf48LsIF2YBM97Ku14AaABAg","responsibility":"society","reasoning":"virtue","policy":"unclear","emotion":"resignation"},
{"id":"ytc_Ugzk1sAwe7bs_uJRhcp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugwii6P3i1M25QkNu6h4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugz-ES6gUw_miIVx5ul4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzSHYN1wKC080rjUnR4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugy-_oEOLlMYpQzNgzF4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgzO5i1iZTdjDmALqj14AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"indifference"},
{"id":"ytc_Ugz9Ab5ty6J8s-C9_Kl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
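A raw response like the array above can be parsed and sanity-checked before the codes are stored. The sketch below is a minimal illustration, not the tool's actual pipeline: the allowed value sets are inferred only from the values visible in the responses on this page, and `parse_raw_response` is a hypothetical helper name.

```python
import json

# Allowed values per coding dimension, inferred from the responses shown
# above (assumption -- the tool's real schema may include more values).
ALLOWED = {
    "responsibility": {"ai_itself", "company", "developer", "user", "society", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"outrage", "fear", "resignation", "indifference", "unclear"},
}

def parse_raw_response(raw: str) -> dict:
    """Parse a raw LLM response (a JSON array of coded comments) into a
    dict keyed by comment ID, rejecting records with unknown code values."""
    coded = {}
    for rec in json.loads(raw):
        cid = rec["id"]
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{cid}: unexpected {dim} value {rec.get(dim)!r}")
        coded[cid] = {dim: rec[dim] for dim in ALLOWED}
    return coded

# Hypothetical single-record response for illustration.
raw = ('[{"id":"ytc_example","responsibility":"company",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]')
coded = parse_raw_response(raw)
print(coded["ytc_example"]["policy"])  # regulate
```

Keying by comment ID mirrors the lookup behavior described at the top of this page: once parsed, any coded comment can be fetched directly by its `ytc_`/`ytr_` identifier.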