# Raw LLM Responses

Inspect the exact model output for any coded comment. Look up a comment by ID, or pick one of the random samples below to inspect.

## Random samples
- "It so awful that when looking for references, alot of galleries are filled with …" (`ytc_UgxGYPWgC…`)
- "AI won't end anything because there needs to be a server / datacenter to handle …" (`ytc_Ugxl8J9uD…`)
- "Good. Transparency and accountability in technology that impacts citizens can on…" (`rdc_drham07`)
- "Ok, there is also the risk that it will usher in a golden age from self-driving …" (`ytc_UgwywdPFX…`)
- "Man continues to create his own downfall. He created the nuclear bomb and see wh…" (`ytc_Ugzg4dJDq…`)
- "Too funny, says this is all because of AI then tries to blame it all on Trump wh…" (`ytc_UgzpWrGQR…`)
- "one who is an ai engineer and cyber security expert at the same time will domina…" (`ytc_UgyW1bUF0…`)
- "Saying AI art is needed for disabled people while not being disabled is like say…" (`ytc_UgxUOUu9p…`)
## Comment

> I asked ChatGpt for some help with some software today at work. It listed various steps but one of them was totally wrong. I then told it that it was wrong - the software in question simply did not have the option it said it had. And then chatgpt said that I was right and then went on to tell me the correct steps which were proved right. Basically chatgpt made up the answer. I was very disappointed in the system. I also learned that chatgpt uses 30 times as much electricity to solve a problem as google does to answer a query. Although not like for like things usually, I think I’ll stick to googling for my answers now and find static answers rather than generative.

Source: youtube · AI Moral Status · 2024-09-11T19:1…
## Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
## Raw LLM Response

```json
[
  {"id":"ytc_UgxJZ8D-IS44IsLThOZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgxzIz3Wyn9MMxV0uX14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_Ugx2b5RIumTltb-cuht4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_Ugy_4geapv2xRIIOuup4AaABAg","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugy5v5rAJWXqvHx6xbF4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgzuYUYa_gowSoNtA-Z4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgzhFhFqVU_YgUtdvxB4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_Ugxsn7LwLwk1QgJVE9Z4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgwXvMbE_IgcwKhQt9p4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugw3MVWmJym89wL_xSh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"outrage"}
]
```
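The lookup-by-comment-ID view above can be sketched as follows: parse a raw batch response in this JSON format and build an index from comment ID to its coded dimensions. This is a minimal sketch, not the tool's actual implementation; the field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) are taken from the response shown, and the `index_by_comment_id` helper is illustrative.

```python
import json

# A raw LLM batch response: a JSON array with one object per coded comment.
# The two records here are copied from the response shown above.
raw_response = '''[
  {"id": "ytc_UgxJZ8D-IS44IsLThOZ4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgxzIz3Wyn9MMxV0uX14AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "outrage"}
]'''

def index_by_comment_id(response_text: str) -> dict:
    """Map each comment ID to its coded dimensions (ID stripped from the value)."""
    records = json.loads(response_text)
    return {rec["id"]: {k: v for k, v in rec.items() if k != "id"}
            for rec in records}

codes = index_by_comment_id(raw_response)
print(codes["ytc_UgxzIz3Wyn9MMxV0uX14AaABAg"]["emotion"])  # → outrage
```

With the index in hand, inspecting any coded comment is a single dictionary lookup, which is all the "Look up by comment ID" box needs to do once the batch responses are parsed.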