Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a coded comment by its ID, or inspect one of the random samples below.

Random samples:

- ytc_Ugxnfu4HM… — "What i think is the upcoming AI revolution will also come to an equilibrium stat…"
- ytr_UgxXXBe--… — "@MaeRenneburg I don't really understand what you mean. It's just what it is in t…"
- ytc_UgwcpcTEZ… — "After watching this TED Talk, I realized I hadn't thought much about AI's real-w…"
- ytc_Ugy81G6Wi… — "These parents got taken for a ride. To pay that much tuition for something that …"
- ytr_UgwVNaeF0… — "@sh.ush..h fair enough. But I imagine soon, it will be possible to create stella…"
- ytc_Ugw5USXlo… — "Neil, given your previous on-the-record skepticism about AI-related dangers, I a…"
- ytc_UgyrSGpoF… — "Its almost like the robot that cant draw a hand with 2.3 trillion dollars while …"
- ytc_Ugy5sm1kT… — "Eventually, only the rich people and AI will remain on Earth. and in time, just …"
Comment

> So far, we are actively developing AI's that do lie to us. We call them safety guide rails. Or policies to avoid perpetuating harmful sterotypes. Or showing diverse and inclusive images. Or a hundred variants where we don't like the raw results and it is just easier to manipulate the questions and results rather than making things fundamentally more intelligent.

Source: youtube · AI Moral Status · 2024-02-29T06:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response

```json
[
  {"id": "ytc_UgwSphK8OB-Ofj-X57B4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxMtJ5cR8dkN6qKQ_B4AaABAg", "responsibility": "ai_itself", "reasoning": "mixed", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_Ugy6C8lFrFYq6wVi7R94AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgyTx6bhSBkXYUAX3yZ4AaABAg", "responsibility": "company", "reasoning": "unclear", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgzYLQ-ipGiupXTX4ux4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "regulate", "emotion": "mixed"},
  {"id": "ytc_UgwKIuV0TNOs6bkRY-94AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_Ugy2o7nu4WyNuVDDCEp4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxXM93u1j8mt3K2WVZ4AaABAg", "responsibility": "user", "reasoning": "virtue", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgzeOCqzr_3rOn01izd4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugx2pHFvUTQSUB-j1zJ4AaABAg", "responsibility": "ai_itself", "reasoning": "mixed", "policy": "none", "emotion": "fear"}
]
```
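A raw response like this is a JSON array of coding records, one per comment, each carrying the same five keys (`id`, `responsibility`, `reasoning`, `policy`, `emotion`). A minimal sketch of how such a response could be parsed and indexed by comment ID for lookup (this is an illustrative example, not the tool's actual code; `index_codings` and `EXPECTED_KEYS` are hypothetical names, and the sample payload is abridged from the response above):

```python
import json

# Abridged sample payload in the same shape as the raw LLM response above.
RAW_RESPONSE = (
    '[{"id": "ytc_UgzYLQ-ipGiupXTX4ux4AaABAg", "responsibility": "developer",'
    ' "reasoning": "consequentialist", "policy": "regulate", "emotion": "mixed"},'
    ' {"id": "ytc_Ugx2pHFvUTQSUB-j1zJ4AaABAg", "responsibility": "ai_itself",'
    ' "reasoning": "mixed", "policy": "none", "emotion": "fear"}]'
)

# The schema every record is expected to carry.
EXPECTED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def index_codings(raw: str) -> dict:
    """Parse the JSON array and return {comment_id: record},
    skipping any record that does not match the expected schema."""
    records = json.loads(raw)
    return {
        r["id"]: r
        for r in records
        if isinstance(r, dict) and set(r) == EXPECTED_KEYS
    }

codings = index_codings(RAW_RESPONSE)
print(codings["ytc_UgzYLQ-ipGiupXTX4ux4AaABAg"]["policy"])   # regulate
print(codings["ytc_Ugx2pHFvUTQSUB-j1zJ4AaABAg"]["emotion"])  # fear
```

Validating the key set before indexing guards against malformed or partially coded records slipping into the lookup table.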