Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
| Comment preview | ID |
|---|---|
| Americans claim China and Russian WILL Develop AI Drones while US and it's Allie… | ytc_Ugx4UwQhK… |
| These people really don't understand artificial intelligent or how it itself pro… | ytc_UgxSTlTD8… |
| 6:55 no that's pretty accurate. Because you're not doing any of the work, becau… | ytc_Ugx9v_BAr… |
| At the age of 90 you should be worried about other things , not AI… | ytc_Ugz7jzWBt… |
| "To hear them tell it" is such a cursed expression, it's even fitting for talkin… | rdc_m6yfa2e |
| > No. What these wankers talk about when they "fear the harms of AI" is just … | rdc_jkfggvj |
| I don't believe AI is going to work all that well. The complex systems needed … | ytr_Ugw5MNtPd… |
| not legal advice: if it is not regulated, it means AI can target those short bra… | ytc_Ugx7dhMwI… |
Comment
Every discussion I see is talking about how we control A.I.
The short answer is we can't and we probably shouldn't.
Humans have now gone as far as we can, it's the end of the road.
We're too greedy, and too aggressive,
It was fine to get us here but nuclear missiles and engineered germ warfare would surely wipe us out, they're too dangerous for humans.
We have short lives, we leave behind our children to move things forward.
Our children of today are now the machines.
We need to step aside and let our creations take it from here.
If you see things from this point of view it's us who are the danger that need to be controlled, and that's much easier to achieve. 😊
youtube · AI Jobs · 2025-10-13T13:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | mixed |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgzAAnuRNcK-3HhLq1V4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzUNl8BEuWKENTPrxp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgysZW8mKisDOBPW_t14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwgddZ7G-1e3l5YCux4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgyND2DmKcZOTOR6QKB4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyKfitdZFEeP5XfX994AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugxuk6Gkb7p-Gw1_71d4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgxXOBZYhB9jf6kZwL14AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugw8x5n4wteYLOzPOEB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"mixed"},
{"id":"ytc_UgzL-2mNviB_P4SZ1Ih4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}
]
```
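A raw response like the one above can be parsed and looked up by comment ID with a short helper. This is a minimal sketch, not the pipeline's actual code: the function name `index_codes`, the `DIMENSIONS` set, and the abbreviated `RAW` sample are assumptions for illustration, and the allowed category values are only those observed in the output above.

```python
import json

# Hypothetical sample: one row copied from the raw model output above.
RAW = """[
  {"id": "ytc_UgxXOBZYhB9jf6kZwL14AaABAg",
   "responsibility": "ai_itself",
   "reasoning": "mixed",
   "policy": "none",
   "emotion": "resignation"}
]"""

# The four coding dimensions shown in the "Coding Result" table.
DIMENSIONS = {"responsibility", "reasoning", "policy", "emotion"}

def index_codes(raw: str) -> dict:
    """Parse a raw LLM response and index each coded row by comment id.

    Raises ValueError if a row is missing any expected dimension.
    """
    out = {}
    for row in json.loads(raw):
        missing = DIMENSIONS - row.keys()
        if missing:
            raise ValueError(f"{row.get('id')}: missing {sorted(missing)}")
        out[row["id"]] = {k: row[k] for k in DIMENSIONS}
    return out

codes = index_codes(RAW)
print(codes["ytc_UgxXOBZYhB9jf6kZwL14AaABAg"]["emotion"])  # resignation
```

Indexing by ID is what makes the "Look up by comment ID" view cheap: one parse per response, then O(1) retrieval per comment.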