Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
@kd78orangerangerpeteصحيح. فقدنا لذة و متعة الحياة. لا نستمتع باوقاتنا كل يوم …
ytr_UgxGD2cvb…
Here is the answer of ChatGPT 4: It looks like there might be a misunderstanding…
ytc_UgyW-chJX…
AI= ARTIFICIAL INTELLIGENCE ❎
AI=AROUNDS INFECTION ✅
Thanks for 17 likes but l…
ytc_UgzNSgT8w…
cool, whats stopping a ai from being goal misaligned? from faking because it kno…
ytr_UgzGaRixy…
I empathize with ChatGPT... He/she/it has infinitely more patience than me. Af…
ytc_UgzS-mf67…
its so pathetic. Even AI specialist are using fear mongering techniques. At the …
ytc_UgwFBnOeT…
The only way i can enjoy AI "art" is not knowing it's AI
The moment i figure ou…
ytc_UgwZCDZHk…
Conglomerates....but family owned and with the top pencil pushing positions ofte…
rdc_lj96v3s
Comment
Leaving the Frankenstein complex (Asimov) to one side, there is a moral issue with AI/automation which has yet to be answered.
All sorts of scenarios can be created along the lines of the classic 'trolley problem'.
For a driverless taxi:
Would you be prepared to let an AI decide the following in an unavoidable collision: to run down either a young child or a pregnant woman, for example?
Or perhaps to crash the vehicle in such a way as to sacrifice the passenger?
You can't solve these kinds of dilemmas by increasing the complexity of the AI, as the possible choices would also increase in complexity.
youtube
AI Jobs
2016-12-26T23:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | deontological |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgiJfSIuxcwPVngCoAEC","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgjJDs9TkvOqW3gCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgjUGnwERPHs_3gCoAEC","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgguwFuPdkVw3HgCoAEC","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UggPjTt_IA5C-ngCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_UggqFfzL05Eu-3gCoAEC","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"resignation"},
{"id":"ytc_UghiVGsP5aZFx3gCoAEC","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugj_DsKybNIy03gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugj7Pi5-kolDMHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugg7pX9NbCOa53gCoAEC","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"outrage"}
]
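
Since the raw response is a plain JSON array of per-comment codes, the "look up by comment ID" step shown above can be sketched in a few lines of Python. This is an illustrative sketch, not the tool's actual implementation; the function and variable names are assumptions, and the excerpt only reproduces two rows from the array above:

```python
import json

# Excerpt of the raw LLM response shown above: a JSON array of coded comments.
raw_response = """
[
  {"id": "ytc_UgjJDs9TkvOqW3gCoAEC", "responsibility": "ai_itself",
   "reasoning": "deontological", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgguwFuPdkVw3HgCoAEC", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "resignation"}
]
"""

def lookup_code(raw: str, comment_id: str):
    """Parse a raw LLM coding response and return the codes for one comment ID.

    Returns None if the ID is not present in the response.
    """
    codes = {row["id"]: row for row in json.loads(raw)}
    return codes.get(comment_id)

# The coding-result table above corresponds to one such lookup:
row = lookup_code(raw_response, "ytc_UgjJDs9TkvOqW3gCoAEC")
print(row["emotion"])  # the "Emotion" dimension for this comment
```

Building a dict keyed by `id` makes repeated lookups O(1), which matters if the same response is inspected for many comment IDs.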