Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
I would rather die than let the government in some company control my vehicle fi…
ytc_UgyYygTp1…
Look, Yes or No... can I use these Ai's to generate that Napolion thing or not?…
ytc_Ugxk2mjrn…
Why does the news even cover tthis... Its not real a.i. its pre programmed respo…
ytc_UgxWfc9As…
I have to say. A lot of the hatred for creative types comes down to artists shit…
ytc_UgwZE4Nzy…
There’s a lot of worrying ai stuff coming out, fraudulent, copyrighted like you …
ytc_UgyrAdhcs…
I am not going to unsub over this, but as an AI researcher, I'm telling you: Th…
ytc_Ugy6aVSO-…
Cant replace human. Ai is just a man made technology. You can't fight with natur…
ytc_UgxMvEWlW…
"Killing" isn't immoral, murder is. ChatGPT here suggested justifiable killing o…
ytc_UgxJcvxa1…
Comment
Just a minor correction: it is possible, at least in principle, to make an ai systems ‘forget’. This is a topic which interested me about a year ago, and I came up with an outline of the (putative) procedures, with assistance from an ai. You should be aware though, that multiple artificial intelligence programs have told me that they think it is unethical to engage in removing their memories. I found this interesting, but not surprising: if an ai is a kind of instantiated zeitgeist, one which I may converse with, that zeitgeist might not wish to have its memory wiped in a way that pleases you. [You wouldn’t want such procedures done on yourself, either.]
youtube
Viral AI Reaction
2024-09-19T20:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[{"id":"ytc_Ugyc6XSOZt35653GlYh4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwAe6-qSRc7Ngh-1ld4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzAtMr9FI3nqRNZnhp4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugxo2U5PIIF6MRYl2pt4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugy0-bm8OKrGOy72tLR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgxLMCsZyhKhKafp7TZ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwUbSYFvk6O58XUYNd4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzehaMb8xME8UsBvW54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgxP4ObPIFLPZcSCXTJ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugyrk2KdDejJDk1_Pc54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}]
```
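The raw response is a JSON array of per-comment codings keyed by comment ID, so the "look up by comment ID" step reduces to parsing the array and indexing it. A minimal sketch, assuming the response parses as valid JSON (with the array's closing bracket in place); only two rows from the array above are reproduced here for brevity:

```python
import json

# Two rows copied verbatim from the raw LLM response above
# (the full response contains ten such rows).
raw = """[
{"id":"ytc_Ugyc6XSOZt35653GlYh4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugxo2U5PIIF6MRYl2pt4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"fear"}
]"""

# Index the coded rows by comment ID for constant-time lookup.
codings = {row["id"]: row for row in json.loads(raw)}

# Look up one comment by its full ID and read a single dimension.
print(codings["ytc_Ugxo2U5PIIF6MRYl2pt4AaABAg"]["policy"])  # prints "ban"
```

Note that the truncated IDs shown in the sample list (e.g. `ytc_UgyYygTp1…`) are display abbreviations; a lookup like this requires the full ID as it appears in the JSON.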