Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- ytc_UgyON2xGz…: "Google is scared! They don't want to bring AI into the world for liability reaso…"
- ytc_UgxC4IbPn…: "I'm sorry, there are actual humans who believe ai art is real art? And that it i…"
- ytc_Ugy8eTu0O…: "When are you going to go back to the education system and at least finish PRE S…"
- ytc_UgwItXq86…: "Heard someone else say it so I'm gonna repeat it. We need AI to help us not repl…"
- ytc_UgzgUcSsz…: "I’d love to hear a conversation between famous neuroscientist Antonio Damasio an…"
- ytc_Ugxe8auXE…: "I love the part where the Girlfriend AI threatens to blackmail him. Yeah, I need…"
- ytr_UgweR0ZAE…: "hmm.. I wish these AI Developers and Programmers get replaced by AI itself, tab …"
- ytc_Ugz5JMaKR…: "I don’t think they deserve rights... If you want to destroy that $10,000 robot,…"
Comment
I think if AI gets good enough to write all code, then it will be AGI. Because I don't think LLMs in their current form can do it (maybe they could sort of get there with a massive context window, but I don't think they'll be 100% there even then.)
And the thing is if we get AGI then basically everyone will be replaceable. IDK I just think it's funny that people think AI (in its current form) will replace programmers, but somehow most other jobs will be fine.
youtube
2025-06-17T12:1…
♥ 8
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[{"id":"ytc_Ugxtqn3Nksh6Z4DPgD14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgzACVhNzhyBm_BBenl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyXENLUxO6RWhU2dxJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugyyl6GWTvHTM8hqhLp4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzmbIBmU6mNTacFeRl4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyB6f5npgvoSNvNMYd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxUOdXGSLpAN_jayEx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzxFykblqUd7hEgCFV4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_UgzRm8iZ-F1v0PDMmK54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwAg5rgwTaVariRGXh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"}]
```