Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "Discovered Grok recently, someone was using it on X to give me statistics I knew…" (`ytc_Ugww67eRr…`)
- "Once Super Quantum artificial intelligence reaches peak performance, there are o…" (`ytc_Ugxaga64s…`)
- "I'd say a surgical subspecialty or psych. Psych can be replaced by AI, but peopl…" (`ytr_UgxB5IR5v…`)
- "Bro that’s just embarrassing wtf do u mean we were born with it???? There’s a re…" (`ytc_Ugwo-7-es…`)
- "China just made a nuclear battery lasting 50 years, and if *AI* robots gets weap…" (`ytc_UgzibYibr…`)
- "I’m pretty sure that anything you write in metaverse including private messages …" (`ytc_UgxFLnsrd…`)
- "The very few are going to hide while there AI and robots wipe out billions. One …" (`ytc_UgymvXTH5…`)
- "A.I. isn't your friend... The elite will always be in control of it... Their pla…" (`ytc_Ugz-SJT0z…`)
Comment
There was a BBC TV tech show called 'Click' a few years ago. They did a report on AI back in its inception. They interviewed one of the leading AI innovators and he was asked the question 'If you knew that your creation would end up destroying humanity, would you still create it?' - And it was his enthusiastic response that TRULY SHOCKED me. He simply shouted "HELL, YEAH!!" - These people need to be arrested.
youtube · AI Governance · 2023-07-07T13:4… · ♥ 49
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzqgxZ7HiP7x38wdZx4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxoTf6Hcato7N4VAo54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgynZ14iUsjUEpetFQp4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwvdoFnj-XBd7WctJR4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugx0yiZGEn9oVy-ODTt4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyGQmDx56efDm0_BuB4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwcdExgNRzRgwM75dd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxC6vEu4EoflxRd3Ep4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxLPQndLN1-yghPScl4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugy_OedzcuD_IUhcngF4AaABAg","responsibility":"ai_itself","reasoning":"contractualist","policy":"none","emotion":"mixed"}
]
```
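A batch response like the one above can be parsed and indexed by comment ID for lookup. The sketch below is a minimal illustration, not the pipeline's actual code: the `SCHEMA` vocabularies are inferred only from the values visible on this page (the full codebook may define more categories), and the function name `parse_batch` is hypothetical.

```python
import json

# Allowed values per coding dimension, inferred from the responses shown
# above -- an assumption; the project's codebook may list more categories.
SCHEMA = {
    "responsibility": {"developer", "ai_itself", "distributed", "none", "unclear"},
    "reasoning": {"deontological", "consequentialist", "contractualist", "virtue", "unclear"},
    "policy": {"liability", "none"},
    "emotion": {"outrage", "fear", "indifference", "mixed", "approval"},
}

def parse_batch(raw: str) -> dict:
    """Parse one raw LLM batch response and index the codes by comment ID.

    Raises ValueError if a record carries a value outside the (assumed)
    category vocabulary, so malformed model output fails loudly.
    """
    coded = {}
    for rec in json.loads(raw):
        cid = rec["id"]
        for dim, allowed in SCHEMA.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{cid}: unexpected value {rec.get(dim)!r} for {dim!r}")
        coded[cid] = {dim: rec[dim] for dim in SCHEMA}
    return coded

# Hypothetical one-record batch, mirroring the coded comment shown above.
raw = ('[{"id":"ytc_example","responsibility":"developer",'
       '"reasoning":"deontological","policy":"liability","emotion":"outrage"}]')
codes = parse_batch(raw)
print(codes["ytc_example"]["emotion"])  # outrage
```

Indexing by ID is what makes the "Look up by comment ID" view cheap: one dictionary access per inspected comment rather than a scan over every batch.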