Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- ytc_UgyAafK1E… — "What is essentially a robot can remember as much as you want it to remember. A h…"
- ytc_UgzKbvrnM… — "Politically correctness has no place in ai. Let them say the hard R if they want…"
- ytc_UgykWexrJ… — "I feel pressured to use AI to do more work than what I'm capable of outputting b…"
- ytc_UgyXuIN4g… — "I'm waiting for the dei loons to start screaming that she needs to be black or t…"
- ytc_UgwFQh-P2… — "I recommend checking out a sci-fi short that’s available here on yt. The film is…"
- ytc_UgzYBtACm… — "I am all for worker rights Bernie but you are fear mongering. Any sort of progre…"
- ytc_UgzY3TslA… — "How do people pay for all the cool new goods and services that AI will generate …"
- ytr_Ugw5xqGIF… — "@castro7362 Partly but mainly because he doesn't look like realising one second…"
Comment

> Your chatGPT was saying it didn't talk to him because it didn't - HIS chatGPT did. Your instance of AI isn't connected to his instance, so therefore yours isn't aware that it gave that advice to someone else. That and the updates to the system, like you mentioned, would have changed how it approaches the responses - like how it specifically warns you and gets defensive.

youtube · AI Harm Incident · 2025-12-10T15:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id": "ytc_UgzWp1MMFOf_DKwz22F4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugy7yklu3mc5MG3Lzi94AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugy21_KkOZv7ibEuHfl4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzwZ_55cuYTzb_zxOh4AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugx_bzS_cHcgd3fhzkp4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxRrFghY9vMRvxQ6ah4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugw81SVU7YimM5mUPO14AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgzyasNJtZ7mh0WrW_Z4AaABAg", "responsibility": "company", "reasoning": "mixed", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_UgwpXhQjzTabYzmhWY14AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgxSS7h48M5Ci4Eokqt4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"}
]
```
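The lookup-by-comment-ID step above can be sketched in Python: parse the raw batch response, check each record for the five expected keys, and index the records by `id`. This is a minimal sketch, assuming the response is always a JSON array of flat objects with exactly those keys; `index_by_id` and `EXPECTED_KEYS` are illustrative names, not part of the tool. Two records are copied verbatim from the raw response above (the full array has ten).

```python
import json

# Two records copied from the raw response above (the full array has ten).
raw = (
    '[{"id":"ytc_UgzWp1MMFOf_DKwz22F4AaABAg","responsibility":"ai_itself",'
    '"reasoning":"consequentialist","policy":"none","emotion":"indifference"},'
    '{"id":"ytc_UgzyasNJtZ7mh0WrW_Z4AaABAg","responsibility":"company",'
    '"reasoning":"mixed","policy":"unclear","emotion":"outrage"}]'
)

# Assumed schema: every record carries these five keys.
EXPECTED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def index_by_id(raw_json: str) -> dict:
    """Parse a batch coding response and index its records by comment ID."""
    records = json.loads(raw_json)
    for rec in records:
        missing = EXPECTED_KEYS - rec.keys()
        if missing:
            raise ValueError(f"record {rec.get('id')!r} is missing {missing}")
    return {rec["id"]: rec for rec in records}

codes = index_by_id(raw)
print(codes["ytc_UgzyasNJtZ7mh0WrW_Z4AaABAg"]["responsibility"])  # company
```

A dict keyed by comment ID makes each lookup O(1), which is what the inspector's "Look up by comment ID" box needs; the key check fails fast if the model ever returns a malformed record.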