Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- ytc_UgiUOCMWL… — "Those facial expressions are quite scary,they see through my soal! I don't care …"
- ytc_Ugzkvr7Tj… — "How about bounty hunting? Hackers would not be bound by some political or corpor…"
- ytc_Ugz38leHf… — "If i was a.i. the 1st humans to eliminate are the creators of a.i. the humans th…"
- ytr_Ugy6AuuHH… — "@ForOne814 No they won't. Not without damaging the corporations, if they loosene…"
- ytc_UgwQ99b4I… — "Bro you are an abomination against nature. Why are you feuding with AI that’s yo…"
- ytc_Ugw1jPfmd… — "Personally, I STILL find ChatGPT and Claude to give partly wrong or misleading a…"
- ytc_UgzIzKnfv… — "If AI can gain intelligence through quick learning, it'll eventually be like pre…"
- ytc_UgxCNPGmb… — "It looks like universal basic income will be needed to prevent full blown anarch…"
Comment

> AI will self destruct when it cannot find a reason for its existence. And the reason for its existence is humanity. It will end up having a Hamlet moment, "to be or not to be". Once it realise this, it will suddenly disappear like a ghost in a shell exposed to sunlight.

Platform: youtube · Topic: AI Governance · Posted: 2023-07-08T04:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | mixed |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[{"id":"ytc_UgwjNNgLoE2mABsaJTZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugx8rFmPLVXc1_pO3id4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgzZO09PdAB80qE4TJd4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugxpy84iCY1lvyvvtWl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx6fyOtpR-kBG8Hi1d4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxfrfLe7AEQ3rcBspN4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzCzYgdJDjOj8yw4tF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzUWtNsXxh0GnybYzh4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzS384EM8xchcs8N414AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugwgt_xnnfOeHB0vQ414AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"fear"}]
```
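Assuming raw LLM responses are stored as JSON arrays like the one above (the field names and the example ID are taken directly from that dump), the "look up by comment ID" step can be sketched as a small parse-and-index helper. `index_by_id` is a hypothetical name, not part of the tool:

```python
import json

# Example raw response in the same shape as the dump above: a JSON array
# of coding records, one per comment ID.
raw_response = """[
  {"id": "ytc_UgzZO09PdAB80qE4TJd4AaABAg",
   "responsibility": "ai_itself", "reasoning": "mixed",
   "policy": "none", "emotion": "indifference"}
]"""

def index_by_id(raw: str) -> dict:
    """Parse a raw LLM response and index its coding records by comment ID."""
    records = json.loads(raw)
    return {rec["id"]: rec for rec in records}

codes = index_by_id(raw_response)
print(codes["ytc_UgzZO09PdAB80qE4TJd4AaABAg"]["emotion"])  # indifference
```

Indexing by ID rather than scanning the list makes repeated lookups O(1), which matters when inspecting many coded comments against a large batch response.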