Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
Holy shit
1st:i have killed a lot of ai
2nd:i cant listen to this music without:…
ytc_UgxhqBVxS…
So as an artist I'm fine with AI existing, and I'm fine with AI using my work fo…
ytc_Ugxub6rv9…
Already a disaster. I just looked up “ current status of troops deployed to Amer…
ytc_UgxzCi0Tx…
This made so much sense to me and seemed like it was going in an interesting dir…
ytc_UgypAC-N_…
10:15 Crazy that we says that but we kill billions of land animals and trillions…
ytc_Ugz6B9pyI…
Hey Charlie, sorry to have a nuanced opinion:
"it takes talent to use camera" i…
ytc_UgyNTl3SF…
Ah so these people were inspired to draw something similar to ai? Stinks of iron…
ytc_UgzkKrQUK…
2038 is the current time estimate of the AI takeover, why? The year 2038 problem…
ytc_Ugx1y7u04…
Comment
When you are interviewing him you thought order was, I have spoken to other people and they say some jobs are safe, some say focusing on something with AI will make it safe for some, if I am "some" I will capitalize it and make myself safe.... Chain of thought you still cling on the hope. WHILE the guy who are interviewing just told you the truth and he is not being fatalist, he just told you the inevitable, and yet you will go back to the "hope" one month later if you see my comment you will want to ignore it and go back to Hope. He is right, this is why humanity will fail, there is no common goal, it's always the benefit of some and the loss of most, instead of quality of life it's benefit of it, he is correct and instead of all looking for this we will all go to the "let hope someone makes it OK" meanwhile I will see how I can survive and hope...
youtube
AI Governance
2025-10-25T18:3…
♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[{"id":"ytc_Ugyg6U2aA6M7auQodnR4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
 {"id":"ytc_UgxKQUiPErPeb-aNtVh4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
 {"id":"ytc_Ugyi1GRimpd9fW7aX_B4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
 {"id":"ytc_Ugw0q8J-LjqfP3vWEat4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
 {"id":"ytc_Ugz9TMACLxJaV25b5Hd4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
 {"id":"ytc_UgzTaQ3KyBMDLOO2eaF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
 {"id":"ytc_UgyJnueEQ8KzVEWoXKF4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
 {"id":"ytc_UgxbW2k8OlXDWmGoXP54AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"fear"},
 {"id":"ytc_UgwElWTxLgiPk9MhxyF4AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"outrage"},
 {"id":"ytc_UgwIz96HpTIk1y1cIA54AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"approval"]}
```
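Note that the raw response above ends in `"]}"` rather than `"}]"`, so it is not valid JSON. That plausibly explains why every dimension in the Coding Result table reads "unclear": a parser that cannot decode the batch would fall back to "unclear" instead of dropping the comment. A minimal sketch of that fallback behavior (the function and constant names here are hypothetical, not the pipeline's actual code):

```python
import json

# Coding dimensions shown in the Coding Result table above.
DIMENSIONS = ["responsibility", "reasoning", "policy", "emotion"]


def parse_coding_response(raw: str) -> list[dict]:
    """Parse a batch coding response from the model.

    If the JSON is malformed (as in the raw response above, which
    ends in ']}' instead of '}]'), mark every dimension 'unclear'
    rather than discarding the batch.
    """
    try:
        return json.loads(raw)
    except json.JSONDecodeError:
        return [{dim: "unclear" for dim in DIMENSIONS}]


# A shortened stand-in for the malformed response above.
malformed = '[{"id":"ytc_x","emotion":"approval"]}'
print(parse_coding_response(malformed))
# -> [{'responsibility': 'unclear', 'reasoning': 'unclear',
#      'policy': 'unclear', 'emotion': 'unclear'}]
```

A well-formed response (ending `"}]"`) would parse normally and each comment would receive its coded values instead of "unclear".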