Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- `ytc_UgwS1dFHt…`: Well, if they decide on a way to test if an AI is conscious, if the same test ge…
- `ytc_UgzZFfDuB…`: I found that Pneumatic Workflow handles workflow automation better than other AI…
- `ytc_UgwT2x9M2…`: "How much water do these AI's consume?" ... "Well, it's less than humans." As if…
- `ytr_UgzSCjtK2…`: Didn’t read a book on it, but watched it play out in real time when the new boss…
- `ytc_UgwBHCrG4…`: I DO NOT WANT WHAT AI BRINGS / I DO NOT WANT WHAT AI TAKES. / She’s right.…
- `ytc_UgxvGkiUT…`: If it is that bad, who will consume and pay taxes? Countries will bankrupt? Comp…
- `ytc_UgwpeX738…`: Zillow senior SDE here! I’ve been working with AI this las couple of months and …
- `ytr_Ugz2Rzcw3…`: But it’s also very much wrong when saying “not because ai is bad” it is because …
Comment
| Field | Value |
|---|---|
| Text | I use to write AI. Blackhat programming is required for what you say. AI is a Black Hat/CIX illusion...a type of computer soup. |
| Source | youtube |
| Topic | AI Governance |
| Posted | 2023-10-09T00:0… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytr_Ugwa_pEu-NGXMtxlHnV4AaABAg.9v5BalJMwKv9ww9P64yNnh","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytr_Ugx5rp2nNO24Wyhx7Ax4AaABAg.9v1xxXNYY7D9y0yIbtTW9C","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_UgzUKGoRDmhtLT1mobN4AaABAg.9urNVOYwvOP9urbnbkmUu1","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytr_UgxSbuKA4gtHjZ8f25x4AaABAg.9upGYb5_YkE9wRXvjqeEe2","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytr_Ugz6anPoXkBcAS2AAkZ4AaABAg.9uIicPT3p_M9uKuGS--yLQ","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytr_Ugx89fmJqE0WW-pQROF4AaABAg.9tVokbG-Yo69tVu-u44pw2","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytr_Ugx89fmJqE0WW-pQROF4AaABAg.9tVokbG-Yo69tVuh2JHdZ5","responsibility":"distributed","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytr_Ugxnh_0ObDi9fP3KkVx4AaABAg.9sqYTNFSDam9vcZfc0WONZ","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytr_Ugy1GC0FuB6IksRJiAx4AaABAg.9smmNqIQPG89smqX-LAUjk","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytr_Ugyof9gBmUYi_Sin8jh4AaABAg.9shyiTPHsh39shzOyjaSJt","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
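The raw response above is a JSON array of coding records, one per comment, keyed by `id`. A minimal sketch of the "look up by comment ID" step, assuming the array format shown (the field names match the response; the record values below are placeholders, not real coded comments):

```python
import json

# Raw LLM response: a JSON array of coding records. Field names
# (id, responsibility, reasoning, policy, emotion) follow the schema
# shown above; these two records are illustrative only.
raw_response = """[
  {"id": "ytr_example1", "responsibility": "developer",
   "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytr_example2", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"}
]"""

def index_by_id(response_text: str) -> dict:
    """Parse a raw coding response and index its records by comment ID."""
    records = json.loads(response_text)
    return {rec["id"]: rec for rec in records}

# Look up the coded dimensions for one comment ID.
index = index_by_id(raw_response)
print(index["ytr_example1"]["emotion"])  # outrage
```

In practice the indexed dict can be built once per stored response and reused for every lookup, so inspecting a single coded comment does not re-parse the whole array.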