Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "The auto-mobile technologies are the least of people should worried about, if th…" (ytc_UgxzB3iAj…)
- "How about we simply don't program AI to be able of programming better AI? We als…" (ytc_UgjAIMevK…)
- "We appreciate your feedback. It's essential to consider different perspectives o…" (ytr_UgwEZGpmQ…)
- "I think AI has nothing to do with intelligence. It's just another interesting gr…" (ytc_Ugyow-Kho…)
- "For the AI users who want to be seen as true artists. I spent 6 (and now 7) year…" (ytc_UgxXTNaZe…)
- "This discussion regarding AI safety makes me laugh in its redundancies. An ant d…" (ytc_Ugyqs0oNE…)
- "The kids should be more concerned about their future. Remember the robot toys we…" (ytc_UgyJ0HaFH…)
- "Ironic, the guy who engineered AI to be sharp and mimic the human brain, is now …" (ytc_Ugyu10YOd…)
Comment
If highly developed AI is within the realm of possibility, wouldn't some alien civilization have created it by now? Considering the vastness of time, why haven't we encountered evidence of their existence?
Given the potential for advanced societies to influence younger ones, is the lack of intervention surprising? Does it suggest limitations in their capabilities, or perhaps a deliberate stance of non-interference?
In the vast cosmic orchestra, is humanity still tuning its instruments, or are we missing a conductor entirely? What factors might explain the silence of galactic seniors?
Platform: youtube | Topic: AI Governance | Posted: 2024-01-12T11:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_UgzoqyyxPSWhDemlH4p4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz-16mkWDBYIGvA6tZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugz0cVFYfUjYA3XjVw54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwZh6DP5ZBJV18IknN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxRbiNlhoJ4RduU_UF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxemDE3h_d17no2vRh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyW0CKZpyLqh6wvIZx4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugxt0BrSsxVx0zFq_ed4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugx9UlDs_u_u4_weh754AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzmcXbqHdexp8nj0r14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}
]
```
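A raw response like the one above is a JSON array with one record per comment, each carrying the four coding dimensions. As a minimal sketch of how such a response might be parsed into a per-comment lookup, the allowed vocabularies below are inferred only from the samples shown on this page (the real codebook may define more categories), and the function name is hypothetical:

```python
import json

# Allowed values per coding dimension. These sets are an assumption,
# reconstructed from the sample records above, not an official codebook.
ALLOWED = {
    "responsibility": {"none", "developer", "ai_itself", "unclear"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "unclear"},
    "emotion": {"indifference", "approval", "fear", "mixed", "resignation"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM coding response into {comment_id: codes},
    rejecting records with a missing id or an out-of-vocabulary value."""
    records = json.loads(raw)
    coded = {}
    for rec in records:
        cid = rec.get("id")
        if not cid:
            raise ValueError("record is missing a comment id")
        for dim, allowed in ALLOWED.items():
            value = rec.get(dim)
            if value not in allowed:
                raise ValueError(f"{cid}: unexpected {dim!r} value {value!r}")
        coded[cid] = {dim: rec[dim] for dim in ALLOWED}
    return coded
```

With the coded dict in hand, looking up a single comment by ID (as the "Look up by comment ID" view does) is a plain dictionary access, e.g. `coded["ytc_UgyW0CKZpyLqh6wvIZx4AaABAg"]["emotion"]`.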