Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
Empathy. 100% the million dollar question it boils down to if Like some humans t…
ytc_UgzTNistF…
My phone is for real watching me just wrote an essay about ai and this was one o…
ytc_UgxYxFBYg…
This video made me think about job security, but Pneumatic Workflow has really e…
ytc_UgxAAANt6…
OpenAI is fucked. There is no moat, and they basically burned all developer good…
rdc_m9gg1fu
@anudeepreddy30 you cannot deny Tesla’s full self driving is the best consumer a…
ytr_Ugy9p3RD1…
Siri still can’t call my local Chinese place when I ask it to.
So far AI has be…
ytc_UgwfFLwn4…
Want to bet on 100 years. Ai created its own language within a short time. They …
ytc_UgxLWNYDH…
Non of the data presented in this video supports the overhyped title. For one, …
ytc_Ugxz-zwRN…
Comment
Certainly a major part of the problem. Let me "yes, and" some more. Institutions aren't being robust enough in deployment. No one is making any plans for any of the possible failure modes. Worse, the government wants No regulations at all. Nothing for determining liability or Enforcing any safety standards. They don't have to determine the minutia of AI safety, but can establish public-private board that can make those choices.
youtube
2026-03-25T04:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytr_UgzCwIC6hdw8Y-bOB2d4AaABAg.AUlBZkZN1l-AV0G2eCr9t6","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytr_UgxPKx2FkSqSQEfJWRp4AaABAg.AUl42XjYMwfAUl4t5hRzJI","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytr_UgxjUkKXrn9AV_wu-cJ4AaABAg.AUl2Wq3b88HAUl8OfAflC_","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytr_UgxjUkKXrn9AV_wu-cJ4AaABAg.AUl2Wq3b88HAUlJmCnQ22J","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytr_UgxjUkKXrn9AV_wu-cJ4AaABAg.AUl2Wq3b88HAUlSojftFYI","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytr_UgwEKMIrl8HjXCsbsRB4AaABAg.9AOIieflyNH9AOYY7Xj-od","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytr_Ugy15_7QhUHSCWo94ft4AaABAg.9vCOyKqlDYK9vI07lIE2-u","responsibility":"company","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
{"id":"ytr_Ugwt87h9RKllx1Z8GFZ4AaABAg.AQA71ilQffrAQANZ8Deyqq","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytr_Ugzsel39W6wIDev3zbB4AaABAg.AQ9wrCgdZlcAQ9y0CcHvB-","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"indifference"},
{"id":"ytr_Ugz15HZUz1BZX6es8iZ4AaABAg.ATGAIHg_8G-ATGQ3z4mSHM","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"mixed"}
]
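The raw response is a JSON array of records keyed by comment ID, one record per coded comment. The page's look-up-by-ID behavior can be sketched as below — a minimal Python sketch, assuming the batch format shown above; the IDs, helper name, and sample values here are illustrative, not taken from the real dataset:

```python
import json

# Hypothetical raw LLM batch response: a JSON array with one coding
# record per comment. IDs and values are made up for illustration.
raw_response = """
[
  {"id": "ytc_example1", "responsibility": "government",
   "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_example2", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"}
]
"""

# The four coding dimensions shown in the result table above.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_by_id(response_text):
    """Parse a batch response and index codings by comment ID,
    skipping any record missing an expected dimension."""
    records = json.loads(response_text)
    indexed = {}
    for rec in records:
        if all(dim in rec for dim in DIMENSIONS):
            indexed[rec["id"]] = {dim: rec[dim] for dim in DIMENSIONS}
    return indexed

codings = index_by_id(raw_response)
print(codings["ytc_example1"]["policy"])  # → regulate
```

Skipping malformed records rather than raising keeps a single bad line in a batch from discarding the rest of the codings.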