# Raw LLM Responses

Inspect the exact model output for any coded comment, or look up a coding by its comment ID.
## Random samples

- Shitty stupid no meaning song, song might be sounds good but ai is fucking up wi… (ytc_UgyalKQ0b…)
- @A1Authority So? If you're on the better weed then you tell me. How is it bad to… (ytr_UgyD9liWX…)
- Thank you for sharing your perspective on wisdom and the relationship between hu… (ytr_Ugx20wk6R…)
- The question is "Will humanity put more value on AI than on humans regarding cri… (ytc_UgznPoK-W…)
- this is dumb asf y would you talk to a robot if this is the lame ass conversatio… (ytc_UgwMjMg2L…)
- II. Of course, AI will revolutionize so many things. However, I really don't thi… (ytc_Ugz9sU_WF…)
- I’m just gonna point out what seems glaringly obvious, we have threat after thre… (ytc_UgyW497hy…)
- Robotaxi is one big gamble form musk, and it will not be a succes just wait 2-3 … (ytc_UgwujaZwb…)
## Comment

> 23 minutes in. Have you all heard of the BBB. Big beautiful bill nightmare? One of the most damning pirates within that is no regulation of AI for at least 10 years. Do we really think that Elon was not happy about the bill that was a destructive narrative for them to have their little bromance break up so we wouldn't be looking where they didn't want us to? Many more have seen it all from day 1 or prior even. I don't think they expected that.

youtube · AI Governance · 2025-06-16T19:2… · ♥ 1
## Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
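Each coded comment carries four closed-vocabulary dimensions (responsibility, reasoning, policy, emotion). A minimal validator for such a coding is sketched below; the allowed value sets are inferred only from the values observed in this page's sample output, not from an authoritative codebook, so treat them as an assumption.

```python
# Closed vocabularies for each coding dimension. NOTE: these sets are an
# assumption, reconstructed from values observed in this sample response.
SCHEMA = {
    "responsibility": {"government", "company", "developer", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"regulate", "liability", "none"},
    "emotion": {"outrage", "fear", "resignation", "mixed", "indifference", "approval"},
}

def validate_coding(coding: dict) -> list:
    """Return a list of error messages; an empty list means the coding is valid."""
    errors = []
    for dim, allowed in SCHEMA.items():
        value = coding.get(dim)
        if value not in allowed:
            errors.append(f"{dim}: unexpected value {value!r}")
    return errors

# The coding shown in the table above passes validation.
example = {"responsibility": "government", "reasoning": "consequentialist",
           "policy": "regulate", "emotion": "outrage"}
print(validate_coding(example))  # -> []
```

A validator like this is useful as a guard when parsing raw model output, since an LLM can occasionally emit a label outside the requested vocabulary.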
## Raw LLM Response

```json
[
  {"id":"ytc_UgwKx_icSnzatGnr0PV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxupzUjuMhcUMQe1Op4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugw2S8tgxorMhA5faSN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
  {"id":"ytc_Ugw1m4tcRnnXtCcTUt14AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgyBqEF3yuhH_Nz52bN4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgzZw31lJyzKAxmt7ZZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugw8ImoefV-Q0ZXzGYx4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxrditofPN5ydzdM_x4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugx31rTPJPdijUU5hih4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzbM5lBS6BejinStox4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}
]
```
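A raw response like the one above is a JSON array of coding objects, one per comment, which makes lookup by comment ID a simple parse-and-index step. A minimal sketch (the variable names are illustrative, and only two entries from the response above are reproduced as sample data):

```python
import json

# Two coding objects copied from the raw LLM response shown above.
raw_response = """
[
  {"id": "ytc_UgwKx_icSnzatGnr0PV4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgxupzUjuMhcUMQe1Op4AaABAg", "responsibility": "government",
   "reasoning": "deontological", "policy": "none", "emotion": "outrage"}
]
"""

# Index the codings by comment ID for constant-time lookup.
codings = {row["id"]: row for row in json.loads(raw_response)}

coding = codings["ytc_UgwKx_icSnzatGnr0PV4AaABAg"]
print(coding["responsibility"])  # -> ai_itself
print(coding["emotion"])         # -> fear
```

Keying the dictionary on the `id` field is what allows the "look up by comment ID" search above to resolve directly to a single coding record.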