Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- Good news, asked an ai to draw a cute witch girl in your style and it ended up l… (ytc_UgwHoB8gN…)
- To be an artist could be someone’s fate that can’t be escaped. Some ppl just don… (ytc_UgwuA6lfm…)
- I went to this school. Still in really early development, but the vision is ther… (ytc_UgxN0lU0I…)
- Honestly, and speaking as an ally of Artists, I have doubts about artificial int… (ytc_Ugx3lPkXw…)
- This will work first in Germany. The AI might seem more human-like. AI: “Yes hi … (ytc_UgxDsbO9D…)
- AI should be outlawed. It should not be allowed, it definitely should not be all… (ytc_UgwID0ncO…)
- Yeah, but the insane amount of programming code AI would have to generate for ma… (ytr_UgzcchLDa…)
- I want to develop technology to destroy AI and humanoids. Anyone wants to join m… (ytc_Ugy-48385…)
Comment
I am a physicist and I kind of feel 20% bad for these lawyers because I can see exactly how this sort of thing could happen. In my own experiments with CGPT, I asked it for references about a topic and it gave me 4. These looked legit with names of people I recognized working on the sort of topics they usually work on. I went to those references and couldn't find them, but they were convincing and if I hadn't been paying attention when writing a paper (and if the way we put citations in papers wasn't automated in such a way to make this impossible) I could have put the citation in as a placeholder with the intention of checking it later. With 40-60 citations per paper, it would have been easy to miss that one in the checks.
The reason I only feel 20% bad for the lawyers is because citations in science are a very different thing than in law. In law, you don't cite a case without describing in detail why it is pertinent. In science, you might cite 60 papers with most of them just being "here are examples of other work on this topic". You generally only write in detail about the findings in your citations a few times in the paper when you have a very specific point you need to make.
youtube
AI Responsibility
2023-06-10T18:2…
♥ 234
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_UgzRDPTO1gVHJ2wgoJx4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgwFGa7la3pXMm5JZ2l4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwnuakR89i9JyjgN_B4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_Ugz1AQY15vIHmDjQg_B4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwMDKWeHCePqkQ7Pz14AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwIDKUXqRwhSb5lpO94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyGl4Iycu8ghTRsj8x4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugx35JiGi9Cn1EdlP8d4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwrAi_A7oA9zeqakiN4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxZjZCDoHytdP9ejO94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
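A response like the one above is a JSON array with one object per coded comment, which makes the comment-ID lookup described at the top of this page straightforward to implement. A minimal sketch, assuming only the field names visible in the response (the variable names and the two-record sample are illustrative, not part of the actual pipeline):

```python
import json

# Raw LLM response: a JSON array of coded comments.
# Field names match the response shown above; the records here are a
# two-item excerpt used only for illustration.
raw_response = """[
  {"id": "ytc_UgzRDPTO1gVHJ2wgoJx4AaABAg", "responsibility": "user",
   "reasoning": "deontological", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgwFGa7la3pXMm5JZ2l4AaABAg", "responsibility": "user",
   "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"}
]"""

codes = json.loads(raw_response)

# Index by comment ID so any single coding can be fetched directly.
by_id = {record["id"]: record for record in codes}

record = by_id["ytc_UgwFGa7la3pXMm5JZ2l4AaABAg"]
print(record["emotion"])  # resignation
```

The same dictionary supports the "look up by comment ID" feature: a missing ID simply raises `KeyError` (or returns `None` with `by_id.get(...)`).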