Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
This is exactly why AI powered software are not for 100% automation, they should always be used as a support tool to the human who is responsible for the job, for example: In your health risk prediction task, the threshold of predicting high risk patient should be lowered from 90%+ to 70%+ and a human should verify they are indeed high risk patient or not, this will both save time(as humans are looking at only mid risk-high patients) and resources, and reduce the bias.
Source: youtube · 2021-03-31T13:3… · ♥ 281
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
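The coding result above assigns one value per dimension. As a minimal sketch of how such a row could be sanity-checked, the snippet below validates a coding against the category sets observed in this section's output (the allowed values here are inferred from the displayed data, not an official schema):

```python
# Category sets observed in this tool's output (assumed, not an
# official codebook -- adjust to the real schema if it differs).
ALLOWED = {
    "responsibility": {"user", "developer", "company", "government",
                       "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue",
                  "mixed", "unclear"},
    "policy": {"ban", "regulate", "liability", "none"},
    "emotion": {"outrage", "fear", "approval", "indifference", "mixed"},
}

def validate(coding: dict) -> list[str]:
    """Return the names of dimensions whose values fall outside ALLOWED."""
    return [dim for dim, ok in ALLOWED.items() if coding.get(dim) not in ok]

# The row from the coding-result table above:
row = {"responsibility": "user", "reasoning": "consequentialist",
       "policy": "liability", "emotion": "indifference"}
print(validate(row))  # -> []
```

A non-empty return value flags which dimensions the LLM coded with an out-of-vocabulary label, which is a common failure mode worth catching before aggregation.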
Raw LLM Response
```json
[
  {"id": "ytc_Ugx2P2voC8D2YnspoJV4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzL1YQCWh0XCSLbMX94AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgzgeM3_AxFq5B8gToZ4AaABAg", "responsibility": "government", "reasoning": "deontological", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_UgytpCR4h59VQa0nuOl4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgyjuX7qz7ySkSCj--R4AaABAg", "responsibility": "developer", "reasoning": "virtue", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwQG237RADprUVHfY14AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgxRB1r4XeN8iNoM1314AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgyGdRsNa-nL9sGTtYl4AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "liability", "emotion": "indifference"},
  {"id": "ytc_UgybWkAjNorvivx9nnl4AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgyFu9e1OWE2rSzNLkp4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"}
]
```