Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
Hi everyone, it's settled. I asked Bing Chat what the outcome of the debate was, and it turns out that LeCun and Mitchell won. "That’s an interesting question. According to the Munk Debates website1, the debate was held on June 22, 2023 and the resolution was “Be it resolved, AI research and development poses an existential threat.” The pro side was represented by Yoshua Bengio and Max Tegmark, while the con side was represented by Yann LeCun and Melanie Mitchell. The result was that the con side won by a 4% gain." Nuff said.
youtube · AI Governance · 2023-06-27T22:3… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugx3xw4e8ocKyU_ZQVB4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwvFU5BKq0WWt53Omp4AaABAg","responsibility":"company","reasoning":"mixed","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgyKKv9bE8upTa5Sbyl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_UgyVKsTelwk9yglYzrV4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgxPb-0iY4pi7iqejet4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzHCTk_4zxgU8wyWEp4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugzzypf_v-asdoe7_Nh4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgxIX-TJyadxzvqSmI94AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugxfe8sR2whVY9uI4Jd4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgwETgxNS3mPOdvrjtV4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
```
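A raw response like the one above can be turned into the per-comment coding shown in the table by parsing the JSON array and checking each record against the coding scheme. The sketch below is a minimal, hypothetical validator: the allowed values for each dimension are inferred from the sample output here, not taken from a published codebook, so the `ALLOWED` sets are an assumption.

```python
import json

# Allowed values per coding dimension, inferred from the sample
# response above (an assumption, not an official codebook).
ALLOWED = {
    "responsibility": {"developer", "company", "government", "ai_itself",
                       "distributed", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"regulate", "industry_self", "liability", "none", "unclear"},
    "emotion": {"fear", "approval", "resignation", "outrage", "indifference"},
}

def validate_batch(raw: str) -> dict:
    """Parse a raw LLM response and index valid codings by comment ID.

    Records with a missing ID or an out-of-scheme value on any
    dimension are silently dropped.
    """
    coded = {}
    for rec in json.loads(raw):
        cid = rec.get("id")
        if not cid:
            continue
        if all(rec.get(dim) in values for dim, values in ALLOWED.items()):
            coded[cid] = {dim: rec[dim] for dim in ALLOWED}
    return coded
```

Dropping malformed records rather than raising keeps a single bad line from discarding the whole batch; a stricter pipeline might instead log and re-prompt for the rejected IDs.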