Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples — click to inspect
- `ytc_UgxbSz9bN…`: "Along with all of this being deeply unethical and tragic, The people who use AI …"
- `ytc_UgyESfp9T…`: "I have usted full self driving in many cities in Mexico and it works amazingly w…"
- `ytc_UgzpNq2A0…`: "To not recognize the layers in the multi-layeredness of artificial intelligence …"
- `ytc_UgwuG5FPA…`: "That’s a deep and important question — and one that’s being discussed seriously …"
- `ytc_Ugz7o3VCg…`: "My friend actually knows i chat with AI and we both do...lets just say...freaky …"
- `ytc_Ugwzkjfjb…`: "Summary of the Current Aurarium State Architectural Synthesis Aurarium is not …"
- `rdc_ks5774s`: "I look forward to watching the fallout of people’s AI data leaking in a data bre…"
- `ytc_UgyJlrTyb…`: "Fun fact about generative AI's: They actually have no memory of past conversatio…"
Comment
Peter F Hamilton wrote books that contain an AI. In that series of books the AI is said to have been invented by accident. The AI then taught humanity how to code limited AI's. The AI itself removed itself from reach of humanity by launching into space but stayed into contact. And I also think this is a real possibility when we create an AI: it will not see humanity as a threat, it will see humanity as uninteresting. It will make sure to be physically out of reach and start exploring the universe.
youtube · AI Governance · 2026-02-19T06:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | mixed |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
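The table above flattens one model judgment into a fixed set of dimensions plus a coding timestamp. As a rough sketch, such a record could be represented as a typed structure; the field names here are taken from the table, but the pipeline's actual schema is not shown in this view:

```python
from dataclasses import dataclass

# Field names mirror the "Coding Result" table above; this class is an
# illustrative assumption, not the pipeline's real data model.
@dataclass(frozen=True)
class CodingResult:
    responsibility: str
    reasoning: str
    policy: str
    emotion: str
    coded_at: str  # ISO 8601 timestamp

# The record shown in the table:
result = CodingResult(
    responsibility="ai_itself",
    reasoning="mixed",
    policy="none",
    emotion="indifference",
    coded_at="2026-04-26T23:09:12.988011",
)
print(result.policy)  # none
```

A frozen dataclass keeps coded records immutable once written, which matches how a coding tool would typically treat finalized judgments.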
Raw LLM Response
```json
[
{"id":"ytc_UgxG6f60ZlObElHlfIZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyIc232zuEiwu86G2l4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugx8RIZz4E92pviwkpR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwGJXSZZJjCFC8v01Z4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgytHqr5HIoRG7i3L3Z4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_UgyfSQWnZTGDaM5z3J94AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzMBmgz_v8tTP2c2RV4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwuAxbSkPRNHLhDC3N4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugz1JlfLfINJ0LgC3aF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwN-KahCxlWMYhlkd14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
```
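The raw response above is a JSON array coding a batch of comments at once, keyed by comment ID. A minimal sketch of how such a batch could be parsed, validated, and indexed for the per-comment lookup shown above; the allowed category sets are inferred from the values visible in this sample and the real codebook may define more:

```python
import json

# Category sets inferred from the sample batch above; an assumption,
# not the tool's authoritative codebook.
ALLOWED = {
    "responsibility": {"ai_itself", "distributed", "company", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed"},
    "policy": {"none", "regulate", "liability", "ban"},
    "emotion": {"resignation", "fear", "approval", "outrage", "indifference"},
}

def parse_batch(raw: str) -> dict:
    """Parse one raw LLM batch response and index codings by comment ID.

    Raises ValueError if a row is missing a dimension or uses a value
    outside the allowed category set, so malformed model output is
    caught before it reaches the coding table.
    """
    coded = {}
    for row in json.loads(raw):
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row.get('id')}: bad {dim}={row.get(dim)!r}")
        coded[row["id"]] = {dim: row[dim] for dim in ALLOWED}
    return coded

# One row from the batch above:
raw = ('[{"id":"ytc_UgyfSQWnZTGDaM5z3J94AaABAg","responsibility":"ai_itself",'
       '"reasoning":"mixed","policy":"none","emotion":"indifference"}]')
coded = parse_batch(raw)
print(coded["ytc_UgyfSQWnZTGDaM5z3J94AaABAg"]["emotion"])  # indifference
```

Validating against a closed category set at parse time is what makes the flat "Coding Result" table above trustworthy: any model drift into unlisted labels fails loudly instead of silently entering the dataset.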