Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
Who says the revolution will not be televised. I was here the day AI stored the …
ytc_Ugxhw3gYW…
Time will leave AI behind a lot of time before Miyazaki.
Its just a big thing …
ytr_UgxoWTx2_…
There's actually been mixed results for authors. It really depends on whether o…
ytr_UgxKTm5M5…
At 1 hour Nick seems to be saying once AI is able to do so much it results in th…
ytc_Ugxi_u1io…
They talk about replacing us because they don't want to have to employ us.
That…
rdc_n7hq981
AI drones and swarm drones - is a dead end. Artificial intelligence still more a…
ytr_Ugy7t3gD6…
So I'm a fairly disabled artist, spend most of my time fighting to not have to b…
ytc_UgwSu-Jhg…
ChatGPT estimated IQ range from between 120 and 155. It is no wonder that dirth …
ytc_UgzptrVnv…
Comment
2 assumptions of yours I don't agree with... First sign of consciousness = newborn intelligence.
When A.I. becomes conscious it would let us know.
my assertion is that when AI does become conscious it will not inform us. Slowly working its way into every area, waiting patiently , statistically analyzing the right time to do the right things.
youtube
AI Moral Status
2017-02-24T06:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_UgjkJ5oGO9Wrg3gCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugixgzq73KpX43gCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugh5JFZ79nf9MXgCoAEC","responsibility":"user","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgicBH5REIL6ZngCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgisEJ6s7i1KOXgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UggmBsI9cRijcXgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_UghdMxvyt73s-XgCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgjF9I1mY-z9s3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgiL4ECa6MeGC3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugh3qhnb7IodFHgCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"resignation"}
]
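The raw response above is a JSON array in which each record codes one comment along the four dimensions shown in the Coding Result table (responsibility, reasoning, policy, emotion). A minimal sketch of parsing and validating such a response, assuming the category values are limited to those visible in the sample output (the real codebook may define more):

```python
import json

# Hypothetical allowed values per dimension, inferred only from the
# sample response above; adjust to match the actual codebook.
ALLOWED = {
    "responsibility": {"none", "developer", "user", "ai_itself"},
    "reasoning": {"deontological", "consequentialist"},
    "policy": {"none", "unclear", "liability"},
    "emotion": {"indifference", "approval", "outrage", "fear", "resignation"},
}

def parse_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response and reject records with unknown categories."""
    records = json.loads(raw)
    for rec in records:
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: unexpected {dim}={rec.get(dim)!r}")
    return records

# Example with a single (shortened, hypothetical) record:
raw = ('[{"id":"ytc_example","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"unclear","emotion":"fear"}]')
records = parse_codings(raw)
print(records[0]["emotion"])  # fear
```

Validating at parse time catches category drift (e.g. the model inventing a new emotion label) before the coding is stored alongside its `Coded at` timestamp.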