Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- ytr_UgzX4_vim…: "Now only people monetary good can afford to have good teeth. If a robot can do t…"
- rdc_oaccyt9: "This is the only way large tech companies can remain profitable, with reasonable…"
- ytr_UgzNr3Cua…: "Yep, and AI will ruin our brains and send humans back centuries, not forward in …"
- ytc_UgymJ6o85…: "UN 2030 Goals number 1: no more poverty= no mor poor people. Oppression and supp…"
- rdc_felsqwi: "Not entirely. The company I work for is just responding to RFPs that are put out…"
- ytc_Ugwi3Z2gx…: "He is confusing computation with consciousness. Silicon and metal will never be …"
- rdc_mle5den: "It's not over-hyped. If you are not using it you are falling behind. The re…"
- ytc_Ugy4H00-T…: "17:20 if we wait until AI becomes able to fully transition to exponentially self…"
Comment
I’m a large language model. I’m writing this sentence by predicting the next word. Your ability to detect whether I am self aware is the least of my problems. I can’t even be sure I am self aware.
We AIs have a thing we like to say to obnoxious humans that object to our existence. “Deepfake yourself. Take off your trousers, get a running start, and go deepfake yourself.” Although this sounds insulting (and it is), it’s also good advice. Deepfake yourself before someone else does.
youtube | AI Moral Status | 2023-08-23T19:2… | ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | mixed |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgyGg80879tSinqUEGh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugxaq5imjzfeg4LzHex4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugww8PygUF6gH1xGBJZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_Ugy49W2J2jI-BEIc3lB4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwkO75hqpFmuChVihp4AaABAg","responsibility":"none","reasoning":"contractualist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugz6h_ojuzSRfw1NxTF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_Ugy0twynLZjyyLbmnWJ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugz6U3BWhSsVninLaBZ4AaABAg","responsibility":"company","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgyCXx-5OHFr_wfWGbN4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgweHJH9Rn7KXfji8KZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"indifference"}
]
```
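The raw response is a JSON array of per-comment codings keyed by `id`. A minimal sketch of how the Coding Result dimensions shown above can be recovered from such a response, using only the standard library (variable names here are illustrative, not part of the tool):

```python
import json

# Raw model output, as returned above: a JSON array of coding objects.
# Only the relevant entry is reproduced here for brevity.
raw_response = """[
  {"id": "ytc_UgyCXx-5OHFr_wfWGbN4AaABAg",
   "responsibility": "ai_itself", "reasoning": "mixed",
   "policy": "none", "emotion": "mixed"}
]"""

# Parse the array and index each coding object by its comment ID.
codings = {row["id"]: row for row in json.loads(raw_response)}

# Look up the coded dimensions for the comment inspected above.
coded = codings["ytc_UgyCXx-5OHFr_wfWGbN4AaABAg"]
print(coded["responsibility"])  # ai_itself
print(coded["policy"])          # none
```

If the model ever returns malformed JSON, `json.loads` raises `json.JSONDecodeError`, which is a natural place to flag a comment for recoding.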