Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "I hope he sued the hell out of the department and the Facial recognition company…" (ytc_UgxHLWnzW…)
- Hivolt Arc: "More likely— 'Hold my soy latte'" / "Facebook is headquartered in Silico…" (ytr_Ugw8aDhDR…)
- "@firstfractal515 I know how it works on a technical level. It steals nothing.…" (ytr_Ugz4ZLGJ_…)
- "the ones who get mad over ai are losers so the ones who relay on it, ai is just …" (ytc_UgzJxH57W…)
- "Well i know where my Ai investment is going towards. Since they have a backbone…" (ytc_UgyyJrwrh…)
- "What about the AI's that are able to take Uber other AI's anyone help me with th…" (ytc_UgxxDZ5gL…)
- "Ai is based. However the industrial society and its consequences have been a di…" (ytc_UgweDKnrQ…)
- "Everybody worries when it becomes their job on the chopping block. Remember Bide…" (ytc_Ugzk_2fzu…)
Comment
It's called the singularity because we're not intelligent enough to predict what will come past that point. AI has the potential to be so vastly more intelligent than humans that there's no way to tell what they might do with that intelligence. It's simply beyond even our most intelligent humans. What we need to do is augment ourselves with the intelligence capabilities of AI without giving it sentience.
Source: youtube · AI Governance · 2023-04-18T12:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[{"id":"ytc_UgxbKAHOwWMHYie6h7B4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyzHbjn68cChDqNQ8h4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyzedawhG8_lOgkb1l4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxSXFXSlFlKPxGXCrx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugw-nfbPQCIXek8wjnV4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugw36Ia2Z-e2wtdNNkN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxX1zwFnL4jW1ZBrZh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugy1BlHzLf5RT3xhort4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugx6qsnc_WWI9ec3CRZ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgznJOuiw3qT3sm5fMJ4AaABAg","responsibility":"developer","reasoning":"contractualist","policy":"none","emotion":"approval"}]
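The lookup-by-comment-ID view above can be reproduced directly from a raw response like this one: the response is a JSON array with one object per comment, carrying the `id` plus the four coded dimensions. A minimal sketch, assuming only the field names visible in the response (`id`, `responsibility`, `reasoning`, `policy`, `emotion`); the variable names are illustrative, and the embedded sample reuses two entries from the response above:

```python
import json

# A raw LLM response is a JSON array of per-comment codes.
# Two entries copied from the response shown above:
raw_response = '''[
  {"id": "ytc_UgxbKAHOwWMHYie6h7B4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgyzHbjn68cChDqNQ8h4AaABAg", "responsibility": "none",
   "reasoning": "mixed", "policy": "unclear", "emotion": "indifference"}
]'''

# Index the codes by comment ID so any single comment can be looked up
# without scanning the whole array.
codes_by_id = {entry["id"]: entry for entry in json.loads(raw_response)}

code = codes_by_id["ytc_UgxbKAHOwWMHYie6h7B4AaABAg"]
print(code["responsibility"])  # ai_itself
print(code["emotion"])         # fear
```

Note that malformed model output (e.g. a stray closing bracket) will raise `json.JSONDecodeError` here, which is one plausible reason a coding result can fall back to "unclear" across all dimensions.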