Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
The only use case for AI should to work as an assistant only and not as a decision marker. The only way to control General Intelligence AI in future would be to have manual switch control in power grids! No electricity, No AI but that is of course possible only if the robotic AI agents doesn’t take over the major power generation sector and start controlling the power grids themselves. So yes AI mass invasion is possible and the hard truth is they can reproduce themselves faster than humans.
youtube · AI Governance · 2025-05-30T06:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_Ugz_E52oHfeofjIpmAx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxzRBcfQs-XTUFG7_l4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxwkSi4qWYy3OC1mFl4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgxETt_kq4JjXRLl0Sp4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugz4efrA-LY8BpErvUR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwWb8uw0bObhpDm3rd4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugx8gEg7BcR-P2V36hp4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxYsAEEkG6kGUInAFB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugw6wsbliaTWLgWL0sR4AaABAg","responsibility":"developer","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgzyRP2s3fEzSEGLYg54AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"}
]
```
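The raw response is a JSON array of records keyed by comment ID, one record per coded comment. A minimal sketch of how a downstream pipeline might parse and sanity-check such a batch — `validate_batch` is a hypothetical helper, and the allowed value sets are inferred from the codes visible on this page (the full codebook may define more):

```python
import json

# Allowed code values per dimension, inferred from codes seen on this
# page (assumption -- the actual codebook may include other values).
ALLOWED = {
    "responsibility": {"none", "ai_itself", "developer", "user", "distributed", "unclear"},
    "reasoning": {"unclear", "consequentialist", "deontological", "contractualist"},
    "policy": {"unclear", "regulate", "liability", "none"},
    "emotion": {"indifference", "mixed", "approval", "fear"},
}

def validate_batch(raw: str) -> dict:
    """Parse a raw LLM response and index valid records by comment ID."""
    records = json.loads(raw)
    coded = {}
    for rec in records:
        # Reject records with unexpected keys.
        extra = set(rec) - ({"id"} | set(ALLOWED))
        if extra:
            raise ValueError(f"unexpected keys {extra} in {rec.get('id')}")
        # Reject values outside the codebook.
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim}={rec.get(dim)!r}")
        coded[rec["id"]] = {dim: rec[dim] for dim in ALLOWED}
    return coded

raw = (
    '[{"id":"ytc_UgxETt_kq4JjXRLl0Sp4AaABAg","responsibility":"distributed",'
    '"reasoning":"consequentialist","policy":"liability","emotion":"fear"}]'
)
batch = validate_batch(raw)
print(batch["ytc_UgxETt_kq4JjXRLl0Sp4AaABAg"]["emotion"])  # fear
```

Indexing by ID makes the lookup shown on this page a single dictionary access, and failing loudly on unknown keys or values catches the most common LLM coding errors (invented labels, schema drift) before they reach analysis.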