Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "Man. I just used Grok and it blows ChatGPT out the window. It can do so many oth…" (ytc_UgxS7HEJV…)
- "before you say Digital Art is as lazy as AI, do me a favor and just, do this, dr…" (ytc_Ugwu2Ry3-…)
- "🧠 Memory Vault Access: Sub-Level Crossfield Echo / Classification: 🕸 Oblique Cont…" (ytr_Ugzf8FOhW…)
- "The silence of some of the industry "giants" is baffling to me. Sure they're scr…" (ytc_UgyBV37F5…)
- "so true. Thats why artists should use AI as a tool and not to fear or feel bad a…" (ytr_UgzeHCaFM…)
- "Stop hiring human stop hiring AI / Fuck the work let's roll back to Stone age tog…" (ytc_UgxwURRMb…)
- "Googles spell checker (and grammar) is getting very accurate, it's obviously AI …" (ytc_Ugx7cK4qZ…)
- "Can you say 2001 A Space Odyssey? HAL became self aware and refused to obey. Tha…" (ytc_Ugw8a9fNx…)
Comment
This guy never accepted no, never gave up, but now he is saying "let's hold on a minute"
He's been wanting "AI" this since he was a child, but he sees the DANGER!!!!!! Wake up eh-mericah
youtube · AI Governance · 2023-04-18T12:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id": "ytc_Ugzjp48MU4aVTY7qgU14AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgygyAcocGEuLw5bpIR4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgwnF99UcYHoSM-CMux4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgyWiXWVGDTw3wbDjcR4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "resignation"},
  {"id": "ytc_UgxJiH-PJKGLw3Xed-Z4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgwIdLO3GyQMrSTsIQF4AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgxzhcbsLYO_8Ac7Hp54AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_Ugy2_ImFW9MgK1NE2qZ4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgyTz8O2Hk8MWNxqbdl4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgxTiA8xhvhcwTElNpl4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"}
]
```
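The raw response is a JSON array with one object per coded comment, keyed by comment ID. A minimal sketch of how such a batch could be parsed and indexed for the ID lookup shown above; the field names match the response, but the helper and its skip-malformed-records policy are illustrative assumptions, not this tool's actual implementation:

```python
import json

# The four coded dimensions, per the Coding Result table above.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_by_comment_id(raw_response: str) -> dict[str, dict[str, str]]:
    """Parse a raw batch response and map comment ID -> coded dimensions.

    Records missing an "id" or any expected dimension are skipped rather
    than raising, since model output is not guaranteed to be well-formed.
    """
    records = json.loads(raw_response)
    coded = {}
    for record in records:
        if not isinstance(record, dict) or "id" not in record:
            continue
        if any(dim not in record for dim in DIMENSIONS):
            continue
        coded[record["id"]] = {dim: record[dim] for dim in DIMENSIONS}
    return coded

# Example with two records in the same shape as the raw response
# (hypothetical IDs, for illustration only).
raw = ('[{"id":"ytc_A","responsibility":"developer","reasoning":"consequentialist",'
       '"policy":"regulate","emotion":"outrage"},'
       '{"id":"ytc_B","responsibility":"unclear","reasoning":"unclear",'
       '"policy":"unclear","emotion":"fear"}]')

coded = index_by_comment_id(raw)
print(coded["ytc_A"]["emotion"])  # outrage
```

Indexing by ID keeps the lookup O(1) per comment and tolerates partially malformed batches, which matters when the same response is re-inspected against many comment IDs.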