Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "Instead of AI as a substitute for human intelligence, we should be thinking of A…" (ytc_Ugyp6ZXk7…)
- "Why fix your product and/or service when you could spend millions on AI voice-al…" (ytc_UgyrTpYE5…)
- "If you record a video but you don't post it on youtube till 5 years later,can yo…" (ytc_UgzxNGnv3…)
- "The obvious thing to do is just build AI and Robotics to provide all needs and r…" (ytc_UgzrkJ22X…)
- "Terminator is coming near us. This is what scientist are doing. Thank you for m…" (ytc_UgzfZzoA4…)
- "Every time I see a video like this whether it's for or against AI my curiosity i…" (ytc_UgykA1fDE…)
- "1 billion robots, 10 billion humans in poverty with 100,000 jobs. The robots wil…" (ytc_UgwyJYVMC…)
- "Artists are afraid of ai taking over the art industry. People won’t want to comm…" (ytr_UgxXDQ-P6…)
Comment
5-10 years is crazy optimistic. All that needs to happen is for an AI to figure out how to improve on itself and given that we see progress in AI just about every week now, that tipping point is closer than anyone realises.
It might take humans 5-10 years to evolve AI that far, but it won't take AI 5-10 years to evolve AI that far.
youtube · AI Governance · 2023-07-07T08:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
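
The coding result is one flat record per comment, with a single value per dimension. As a rough sketch only (the class and function names below are illustrative, not taken from the project's code), such a record can be represented and rendered back into the Dimension/Value table shown above:

```python
from dataclasses import dataclass


@dataclass
class CodingResult:
    """One coded comment; field names assumed from the table above."""
    responsibility: str
    reasoning: str
    policy: str
    emotion: str
    coded_at: str


def render_table(result: CodingResult) -> str:
    """Render a coding result as the Dimension/Value markdown table shown above."""
    rows = [
        ("Responsibility", result.responsibility),
        ("Reasoning", result.reasoning),
        ("Policy", result.policy),
        ("Emotion", result.emotion),
        ("Coded at", result.coded_at),
    ]
    lines = ["| Dimension | Value |", "|---|---|"]
    lines += [f"| {name} | {value} |" for name, value in rows]
    return "\n".join(lines)


print(render_table(CodingResult("ai_itself", "consequentialist", "unclear", "fear",
                                "2026-04-26T23:09:12.988011")))
```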
Raw LLM Response
[
{"id":"ytc_Ugzc7T4RT_8OCO09tWx4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgwK9XO1PSNwetNlDFB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxlVWH1gcaogca3_fZ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
{"id":"ytc_Ugw95esoRd073p7efEN4AaABAg","responsibility":"user","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgwS_mqH4q9LFFyFj7B4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_Ugzc1fSwjKasZ5deA6J4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzpyZWlR3lp8ncsRDN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzUOy9Aem-7RWDCO6h4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwlBPehX4uvl8KTz7Z4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgxP5F09kxaSqy9HZJt4AaABAg","responsibility":"ai_itself","reasoning":"contractualist","policy":"unclear","emotion":"mixed"}
]
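
The raw response is a JSON array with one object per comment in the batch, keyed by the comment ID and carrying the four coded dimensions. A minimal sketch of the "Look up by comment ID" step, assuming the response text is already available as a string (variable names are illustrative; batching, validation, and any repair of malformed output are omitted):

```python
import json

# A trimmed copy of the raw LLM response shown above (two records for brevity).
raw_response = """
[
  {"id": "ytc_UgwK9XO1PSNwetNlDFB4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_Ugzc7T4RT_8OCO09tWx4AaABAg", "responsibility": "government",
   "reasoning": "deontological", "policy": "none", "emotion": "fear"}
]
"""

# Index the batch by comment ID so a single coded comment can be inspected.
codings = {item["id"]: item for item in json.loads(raw_response)}

comment_id = "ytc_UgwK9XO1PSNwetNlDFB4AaABAg"
coding = codings.get(comment_id)
if coding is None:
    print(f"{comment_id} not found in this batch")
else:
    for dimension in ("responsibility", "reasoning", "policy", "emotion"):
        print(f"{dimension}: {coding[dimension]}")
```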