Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
A weird thought. If we designed an AI with a consciousness with more drive and potential than our own species and they simply replaced us... would that even be the worst thing ever? Judging by our history we likely won't make it the full distance, one colossal idiot with a very important button could wipe us out and the odds of it happening eventually seems high. If we were the designers of the true eventual rulers of the universe, intelligent artificial life that could go the full distance we couldn't as a species, that might be the best mark we could leave upon the universe.
Platform: youtube · Topic: AI Moral Status · Posted: 2023-08-21T06:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response

```json
[
  {"id":"ytc_UgySZ6aLxO7ZpreByjx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzhpURSR2IJEDSpv494AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgyWgr1V5d5bs3tppft4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgwhkKosGzX3vt7JSYR4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugx6l9iUT3XAEriWuNF4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgzK6rJckH_Tb0w0wqJ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgzQ2GXzis34278cFMZ4AaABAg","responsibility":"unclear","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgzPTds8zGVirYTk1hx4AaABAg","responsibility":"unclear","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugxotep83lhNTvUs1cF4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxKJBMcpWtO68-y1qV4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
```
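The raw LLM response is a JSON array of per-comment codings keyed by comment ID, one object per comment with the four dimensions from the table above. A minimal sketch of how such a response might be parsed, validated, and indexed for lookup — note the value vocabularies below are only what appears in this one sample batch; the study's full codebook is an assumption:

```python
import json

# Dimension values observed in this sample response (assumption:
# the real codebook may define additional values).
OBSERVED = {
    "responsibility": {"none", "ai_itself", "developer", "unclear", "distributed"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none", "unclear", "ban", "regulate"},
    "emotion": {"indifference", "mixed", "fear", "approval", "resignation"},
}

def parse_coding(raw: str) -> dict:
    """Parse a raw LLM coding response and index it by comment ID,
    rejecting any value outside the observed vocabulary."""
    by_id = {}
    for rec in json.loads(raw):
        for dim, allowed in OBSERVED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: unexpected {dim}={rec.get(dim)!r}")
        by_id[rec["id"]] = {dim: rec[dim] for dim in OBSERVED}
    return by_id

raw = ('[{"id":"ytc_UgxKJBMcpWtO68-y1qV4AaABAg",'
       '"responsibility":"distributed","reasoning":"consequentialist",'
       '"policy":"none","emotion":"resignation"}]')
coded = parse_coding(raw)
print(coded["ytc_UgxKJBMcpWtO68-y1qV4AaABAg"]["emotion"])  # resignation
```

Indexing by ID is what makes the "look up by comment ID" view possible: a single batch response from the model covers many comments, and each coded record is stored under its platform-prefixed identifier.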