Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- "It's not too difficult AI cannot get people's voices right and AI cannot get peo…" (`ytc_UgwDLZDuQ…`)
- "AI will sell products and services far cheaper than any other company on earth. …" (`ytc_UgyQQkdQh…`)
- "@damien678That’s because anyone that uses Ai to make art doesn’t see it as worth…" (`ytr_Ugy5F9Yab…`)
- "Self driving cars should have their own lanes, and motorcycles should not be in …" (`ytc_Ugg6vkyHW…`)
- "Soon there will be a dialogue from the western world & tech companies "Studies s…" (`ytc_Ugyvx4N2Z…`)
- "Phd in every subject but hadn’t so what more is there if it is highly educated i…" (`ytc_UgwjufJ8W…`)
- "north korea and iran couldnt develop this technology. there are only a handful o…" (`ytr_Ugwf6jblS…`)
- "Ive been in corp USA 50 yrs, ALL its doing IS creating MORE positions to FIX all…" (`ytc_UgxpeEEvB…`)
Comment

> If an AI is built to achieve Super Intelligence, won't it try to control Human Beings? The most versitle machine ever to exist. As we have assembled computers and infrastructures to make AI will it start disecting and keeping humans alive to know how humans operate or multiply and understand the code of DNA. We say when we achieve singularity humans can transfer consciounus to machine body but the reverse is also possible that AI would be able to decode DNA and custom make its own body as next Evolution.
> Will SI be responsible for evolution?

Platform: youtube | Topic: AI Governance | Posted: 2026-02-16T08:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugw28JqJJlCxb7E0UnN4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugwp6psayfOTncgQL914AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyICWzoSsPvly0TSSp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugxjv15ifajnPT8G1zh4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugy-fjzmtxENPV342yF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyCDGHOQdrFGyB8yTd4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxtIzUO84mRBEOcHQd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzwdsCjt8ZAj-OwFz54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugwb-FGxZFGG8c1_kYl4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxoI7RlL27CMYKll8B4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"}
]
```
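A downstream pipeline that consumes responses in this shape might parse and validate them before writing coded rows. The following is a minimal sketch, not the tool's actual implementation: the allowed values per dimension are inferred only from the examples shown on this page (the real codebook may define additional categories), and `parse_coding_response` is a hypothetical helper name.

```python
import json

# Allowed values per coding dimension. These sets are assumptions inferred
# from the sample responses above; the actual codebook may differ.
SCHEMA = {
    "responsibility": {"developer", "company", "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed"},
    "policy": {"regulate", "liability", "none"},
    "emotion": {"fear", "approval", "outrage", "indifference", "resignation"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM coding response (a JSON array of objects) into
    {comment_id: {dimension: value}}, rejecting values outside the schema."""
    coded = {}
    for row in json.loads(raw):
        comment_id = row["id"]
        codes = {}
        for dim, allowed in SCHEMA.items():
            value = row.get(dim)
            if value not in allowed:
                raise ValueError(f"{comment_id}: unexpected {dim} value {value!r}")
            codes[dim] = value
        coded[comment_id] = codes
    return coded

# Usage with a one-element response (hypothetical comment ID):
raw = ('[{"id":"ytc_x","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"none","emotion":"fear"}]')
result = parse_coding_response(raw)
```

Validating against a fixed vocabulary at parse time catches model drift (e.g. an invented category) before it silently enters the coded dataset.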