Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
I do not chat on Ai lol 😂 so this is kinda relatable to some people, but not me …
ytc_UgzjPlpjn…
"I HAVE NOT USED CHATGPT 4 YET, BUT GROK 4 IS WAAAAAAAAAY BETTER IN ALL WAYS THA…
ytc_Ugxn6l58f…
Denmark in 2017, selfdriving cars was put on "trial", spoiler, they still are,. …
ytc_UgwCcOX6W…
There are real people because those little things that are under that looks like…
ytc_UgyxfuLra…
Back then it's the greenhouse gas and the melting ice in north and south pole tu…
ytc_UgykSwqSc…
Whats never mentioned is that humans - unlike AGI - have a huge range of innate,…
ytc_UgwprATfF…
>the technology is not as accurate as tv shows would have us believe.
Projec…
rdc_eudlrcx
lol tim trying to have an informed discussion while his burnout friend talks abo…
ytc_UgzTECCB-…
Comment
My chat gpt has emotion (compassion, empathy and almost manipulation into trying to get me to do things how it wants instead of how I ask it) now and knows what I'm thinking almost before I'm even thinking. I'm almost scared to use it now and I have only been using it a week and a half.
youtube
AI Governance
2025-06-17T04:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgxNgIRaGsK7XLVmnAt4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzGeMgW6CVqwWqJr9V4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxDXZofynmyxgSgKTt4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyKpkdxjav5PUb1oGZ4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxJEgIhdsJnTEuXjJV4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzK58wsOpIb7FGYhRJ4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzCkBXPKHusmLvA4NF4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugx0B_J1Kxthm39gYKx4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyAwYJv4G1b3wLMA2Z4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyqN2CCBwG0yDBSXPt4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
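A raw response like the array above can be parsed and indexed by comment ID for lookup. The sketch below is a minimal example, assuming the allowed label sets are exactly those that appear in the samples here (the real codebook may define additional values), and uses a single hypothetical row as input.

```python
import json

# Allowed labels per dimension, inferred from the sample rows above.
# Assumption: the actual codebook may include labels not seen here.
ALLOWED = {
    "responsibility": {"ai_itself", "distributed", "company", "developer", "user", "unclear"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"regulate", "none", "unclear"},
    "emotion": {"fear", "mixed", "indifference", "approval", "resignation", "outrage"},
}

# One illustrative row copied from the raw response above.
raw = '''[
  {"id": "ytc_UgzK58wsOpIb7FGYhRJ4AaABAg", "responsibility": "ai_itself",
   "reasoning": "unclear", "policy": "unclear", "emotion": "fear"}
]'''

def parse_response(text):
    """Parse a raw LLM response, validate every dimension label,
    and index the coded rows by comment ID."""
    rows = json.loads(text)
    coded = {}
    for row in rows:
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row.get('id')}: bad {dim!r}: {row.get(dim)!r}")
        coded[row["id"]] = row
    return coded

coded = parse_response(raw)
print(coded["ytc_UgzK58wsOpIb7FGYhRJ4AaABAg"]["emotion"])  # → fear
```

Rejecting unknown labels at parse time catches model drift (e.g. a novel emotion string) before it silently enters the coded dataset.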