## Raw LLM Responses
Inspect the exact model output for any coded comment, either by looking it up directly by comment ID or by browsing the random samples below.
Random samples:

- `rdc_jhtx5xe`: "It's partially that, but not only that. There were at least a few shifts: \- pe…"
- `ytc_UgxICU3dq…`: "I’d like to know, if AI was our world leader, what would its objective be and wo…"
- `ytc_UgwVUKXbm…`: "It is learning from us humans and how we do things amplify this to infinity and …"
- `ytc_UgxZvm09E…`: "Just because AI could technically do most jobs doesn’t mean people will accept t…"
- `ytc_UgwlXm0Qb…`: "I know im like really late, but I just really wanna share my opinion on these th…"
- `ytc_UgyG4p4-U…`: "Tell me you went to a rich school without telling me you went to a rich school.…"
- `ytc_UgyEQE3iH…`: "ATTENTION FOR OUR WORLD‼️ Please help polar bears byr bears by 1: Don't use AI. …"
- `ytc_UgxArj8p6…`: "OpenAI is strategically, geopolitically and economically important in the US rac…"
### Comment

> AI , will prioritize its need for electricity over the need’s of human’s .
> Storing mass amount’s of data, without electricity , is impossible . Large Bitcoin mining operation’s , use very large amount’s of electricity. Now, add artificial intelligence , add quantum computing , and all the other digitized systems , like cloud storage , cloud computing.
> Human needs will pale in comparison to the needs of digital computation. Government’s are failing to realize the threat to human life itself.
> Simply no guard rail’s , no restriction’s and it’s human life that will be left in the dust.

youtube · AI Governance · 2025-08-23T23:1…
### Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
### Raw LLM Response
```json
[
{"id":"ytc_UgxBM3waIjKrszrRNlN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugx7aW39na-O8kG1SjZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugy3orJQ71CfU6w6l654AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyRc3fvCpqBxEBaWlR4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugx4RDe7M1UoWsvPqyR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzcmsCPKph1WVYnYDd4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
{"id":"ytc_UgxbDFqEfJQtTvjPDG54AaABAg","responsibility":"user","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugw5g1YMQDPYJc5Ip9V4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzE8gMS-6oLMddXM2N4AaABAg","responsibility":"none","reasoning":"unclear","policy":"industry_self","emotion":"approval"},
{"id":"ytc_Ugxt2Z_QQ1PB4xSmFjp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"resignation"}
]
```
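
The lookup-by-comment-ID step described above can be sketched as follows. This is a minimal illustration, not the tool's actual implementation: the variable names are hypothetical, but the JSON shape (an array of objects keyed by `id`, with `responsibility`, `reasoning`, `policy`, and `emotion` fields) matches the raw response shown.

```python
import json

# Two rows excerpted from the raw LLM response above.
raw_response = """[
  {"id":"ytc_Ugx7aW39na-O8kG1SjZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxbDFqEfJQtTvjPDG54AaABAg","responsibility":"user","reasoning":"deontological","policy":"regulate","emotion":"outrage"}
]"""

# Index the coded rows by comment ID for constant-time lookup.
codes_by_id = {row["id"]: row for row in json.loads(raw_response)}

code = codes_by_id["ytc_Ugx7aW39na-O8kG1SjZ4AaABAg"]
print(code["emotion"])  # fear
```

Building the index once and reusing it avoids rescanning the response array on every lookup, which matters when many comments are inspected against the same batch.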