Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
Yes, I don’t think AI makes as much errors with Product work.
I meant purely fr…
rdc_oadlz0b
Anyone involved in AI is ultimately morally stupid. Just because you could sho…
ytc_UgyrjiPRi…
yes... "overwhelmed" is 100% an accurate word here. TBH I was extremely resistan…
ytc_Ugwtl2j_9…
18 Jsingh
If humans decided to give a robot sentience and the ability to thin…
ytr_UgjntJCtm…
as I see it and have experience it, I am a vast collection of semi-autonomous sy…
rdc_df53crj
PLEASE IF YOURE READING THIS DONT IGNORE IT!! Ai it's killing the polar bears an…
ytc_UgzYZ8jBV…
What we going to see is just how many bullshit jobs were really out there. Yeah…
ytc_Ugw0P-71X…
Youtube keep deleting my posts,
But AI is not the problem, psychometric testing…
ytr_UgzdHsWrC…
Comment
You have a point, and some "jelly" competitors may have other motives behind signing this open letter and may have taken the opportunity to do so. However, I don't believe that to be the case. Check this out to understand more about how AI can surpass human intelligence:
https://youtu.be/MnT1xgZgkpk
Indeed, there's more to this than meets the eye, and of course, the AI experts could be wrong, but better be safe than sorry. It that must not be named can become a probable reality. Also, Elon has warned about the dangers of AI way before ChatGPT and OpenAI.
youtube
AI Governance
2023-04-01T06:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytr_Ugwl0Cw5u30SGjdmgbt4AaABAg.9nw_owR59WO9nxPH0ESPxv","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytr_UgxHZYblUOI4PGGgVep4AaABAg.9nwPaSuFSra9nxPfJ1m40Y","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytr_Ugzn5bYxrvM3gUlhH4B4AaABAg.AEtLihlIMhyAEwu5GdxuPV","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytr_UgxkKn7E9fSTPCWN-Dh4AaABAg.A5vJc_nsraEA5xKi94HF8o","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytr_UgzsTPmYOvD8nouDVr54AaABAg.A5qWnVS8bJDA5xLDytZJb0","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytr_UgzfpoyJbCSLwvwFXp54AaABAg.A5n4y_6eA0AA5xL_BSaYxf","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"indifference"},
{"id":"ytr_Ugy9vWCDpVRUkKpXMj14AaABAg.A5moONOr6kiA5xM3oHs51C","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytr_Ugy9vWCDpVRUkKpXMj14AaABAg.A5moONOr6kiA5ylOV8_25o","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytr_UgyzJxh0IO5-4SF-7Lp4AaABAg.9xNbgOYc0zG9xPLJO5CtTH","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"unclear"},
{"id":"ytr_Ugz2dCqS66hPkJq_NQ94AaABAg.9wz_R4j_VGVASra1c7uAjH","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
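The raw response above is a JSON array of coding records, one per comment ID, with one value for each dimension in the result table. A minimal sketch of how such a response could be parsed and sanity-checked is below; the allowed value sets are inferred only from the codes visible in this sample (the full codebook may define more), so treat them as assumptions:

```python
import json

# Allowed codes per dimension, inferred from the sample response above.
# ASSUMPTION: the real codebook may contain additional values.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "user", "distributed"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"none", "regulate"},
    "emotion": {"fear", "indifference", "approval", "unclear"},
}

def validate_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response and reject records with missing IDs
    or code values outside the allowed sets."""
    records = json.loads(raw)
    for rec in records:
        if "id" not in rec:
            raise ValueError(f"record missing 'id': {rec!r}")
        for dim, allowed in ALLOWED.items():
            value = rec.get(dim)
            if value not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim} value {value!r}")
    return records

# Hypothetical one-record response, for illustration only.
raw = ('[{"id":"ytr_example","responsibility":"none",'
       '"reasoning":"consequentialist","policy":"none","emotion":"fear"}]')
records = validate_codings(raw)
print(len(records))
```

A record that the model codes with an unknown value (say, a misspelled emotion) raises a `ValueError` naming the offending comment ID, which makes it easy to route that batch back for re-coding.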