Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
The problem is how the AI learned to make it. It took from art on the Internet w…
ytr_Ugyktr7w6…
AI can't even answer a question with more than one layer of irony, waiting for i…
ytc_UgzC4xWaK…
Or we just don't use AI for or day to day and live free recognizing we are all b…
ytc_UgxMRBgUC…
I find it’s pretty common for people to place names on things. Natural forces we…
rdc_mlgse4j
it doesn’t seem very silly to me. a god is just a higher power: a force or being…
ytr_Ugxq0yS22…
The Democrats are the richest political party in history, and they were pushing …
rdc_npqmls3
Dictator Afwerki probably did not agree out of spite knowing they have the upper…
rdc_et7ereg
Not going to lie, as a network admin I deal with ATT a lot and every time I have…
ytc_UgzNq09-0…
Comment
I have always thought and believed that with the evolution of computers, that it will inevitable that computers self learning will result quickly in humans not being the most intelligent thing on this planet.
I thought this before Terminator, just think what evolution did for our intelligence from a single cell to now and then think what computers has done, especially over the last 5 years.
Once self learning truly takes hold in the design of computers, manufacturing and self learning software/algorithms. We will very soon then not be the most intelligent thing and in my understanding, this ultimately will not end well for us, as they say, ask a chicken.
youtube
AI Governance
2025-06-24T12:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgxoBAoJWAKU0QRBJ6F4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyE3HAWs7d1gKmTR5B4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxZrtygKxBtO9gVqop4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzPZ7kWdP5Ky2p392V4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugy1etlWA5UhzaJ2ka54AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugwu-Fi4vy1SihjkZ-h4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugy5sm1kT0gKiEiI6fx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxWEsGOOgkGRj2aQ5B4AaABAg","responsibility":"ai_itself","reasoning":"contractualist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgzroK0Whm02mSRnrjh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyGy8M1_3KFYFBX3xh4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"}
]
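A response in this shape can be loaded and indexed for the ID lookup described above. The following is a minimal sketch, assuming the model reliably returns a JSON array with the five fields shown; the function and variable names are illustrative, not part of the tool, and the sample payload is truncated to two entries for brevity.

```python
import json

# Two entries copied from the raw LLM response above.
raw_response = '''
[
  {"id": "ytc_UgxoBAoJWAKU0QRBJ6F4AaABAg", "responsibility": "ai_itself",
   "reasoning": "mixed", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgyE3HAWs7d1gKmTR5B4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"}
]
'''

# The five fields every coding is expected to carry.
EXPECTED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def index_codings(raw: str) -> dict:
    """Parse the JSON array and index codings by comment ID,
    skipping any entry that is missing an expected field."""
    entries = json.loads(raw)
    return {
        e["id"]: {k: e[k] for k in EXPECTED_KEYS if k != "id"}
        for e in entries
        if EXPECTED_KEYS <= e.keys()
    }

codings = index_codings(raw_response)
print(codings["ytc_UgxoBAoJWAKU0QRBJ6F4AaABAg"]["emotion"])  # fear
```

Validating the key set before indexing matters here because LLM output is not guaranteed to be schema-conformant; dropping malformed entries keeps the lookup table clean rather than raising on the first bad row.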