Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a comment by its ID.
Random samples
The idea of Ai thinking he don’t need us.. is not far. I was using ChatGPT recen…
ytc_UgyNAdpAF…
You can invent whatever you want, but believe me, you will never have and …
ytc_Ugyggg8VR…
God is supposed to be able to create life from nothing and eliminate species fro…
ytr_Ugxq0yS22…
@frank254100 Whatever you say Transhumanist Zealot. Keep worshipping your Techno…
ytr_UgxnN_dF_…
I'm sure he would have loved to have ChatGPT to help him build even better games…
ytr_UgyUo_NEs…
Mhmhm Art community thats:
1. Extra toxic to anybody doin well in art an 'not st…
ytc_Ugx7I4umI…
The only danger to Elon with hyper intelligent AI is that someone else makes one…
ytc_Ugxb5zMuw…
6:42 that threw me off so much how that was straight to the point and not full o…
ytc_UgzdJGrTE…
Comment
I think doctors will probably become obsolete at some point in the future. I'd imagine that a health pod, like in the movie Elysium, will become the standard, and that just means what is left of the medical field would be researchers, specialty surgeons, and maybe aftercare. I question the surgeons, though, because I'd imagine they would become rarer and rarer until the profession simply ceases to exist, because there is nobody left to teach it and patients are too rare.
I don't even have the confidence to say researchers will remain, because I don't have the confidence to say that humans will be able to keep up with an AI, as the problems AI will solve will just be too complex.
youtube
AI Jobs
2024-04-14T23:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:53.388235 |
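The dimension values in the table come from a fixed set of codes; a minimal validation sketch in Python, where the codebook is assumed (inferred only from the values visible on this page, not an exhaustive list):

```python
# Allowed values per coding dimension. This codebook is an assumption,
# reconstructed from the values that appear on this page.
CODEBOOK = {
    "responsibility": {"none", "ai_itself", "distributed"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "liability", "unclear"},
    "emotion": {"approval", "resignation", "outrage", "mixed", "fear"},
}

def validate_record(record: dict) -> list[str]:
    """Return a list of problems with one coded record (dimension -> value)."""
    problems = []
    for dim, allowed in CODEBOOK.items():
        value = record.get(dim)
        if value is None:
            problems.append(f"missing dimension: {dim}")
        elif value not in allowed:
            problems.append(f"unknown {dim} value: {value!r}")
    return problems

# The record shown in the table above validates cleanly:
coded = {"responsibility": "none", "reasoning": "consequentialist",
         "policy": "none", "emotion": "resignation"}
assert validate_record(coded) == []
```

Checking every record against a codebook like this catches the most common failure mode of LLM coding runs: the model inventing a label outside the allowed set.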
Raw LLM Response
[
{"id":"ytc_UgydkV_WNFKG--X-eKB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzR2wPnpNKn8DwmSrJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxXwnSnSurgtjkKO7t4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugx5cDVfYO_zg35njYh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzHvOOudhI_4aavCI54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugw_gwutsnkPhdNAQRd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxoJt4r09-GckvuG894AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwCMKdrfQLoMFF-E3V4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwZMxAhMgGUIF_wumJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzuEAs9BdFWeECWX_J4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"liability","emotion":"fear"}
]
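A raw response like the one above is a JSON array of per-comment codes. A minimal sketch of parsing such a batch and indexing it by comment ID (the lookup this page offers), using two entries copied from the array above:

```python
import json

# Two entries copied verbatim from the raw LLM response above.
raw = '''[
  {"id":"ytc_UgydkV_WNFKG--X-eKB4AaABAg","responsibility":"none",
   "reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzuEAs9BdFWeECWX_J4AaABAg","responsibility":"distributed",
   "reasoning":"deontological","policy":"liability","emotion":"fear"}
]'''

codes = json.loads(raw)
by_id = {entry["id"]: entry for entry in codes}  # comment ID -> coded dimensions

print(by_id["ytc_UgzuEAs9BdFWeECWX_J4AaABAg"]["emotion"])  # -> fear
```

In practice the model's output is not guaranteed to be valid JSON, so a real pipeline would wrap `json.loads` in a `try`/`except json.JSONDecodeError` and route failures to a retry or manual-review queue.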