Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "True, but AI will replace paying tenants, but not need a place to rent. Supply …" (ytr_UgztsmVzC…)
- "I even sense some sass in the ChatGPT voice, it is unnerving when I remember wha…" (ytr_UgzeQ2-sI…)
- "I feel that the main difference between the music industry and art community is …" (ytc_UgyuE2a9B…)
- "Let's be real... she's not a robot either... she's a sexbot y'all... C'MONNNNN. …" (ytc_UgzbpKv3n…)
- "Problem isnt the future of all automated driving vehicles but how it meshes with…" (rdc_nszt3yp)
- "So, Ultimately the developers are working on something where we have to loose ou…" (ytc_UgyMKLb_x…)
- "Thats the thing with AI its forced down our throats to get normalized to use AI …" (ytc_UgyJCO31R…)
- "Even AI director design AI for some company also got fired by that company 😢. So…" (ytc_UgxniB1gU…)
Comment
You should talk to the AI model as you would a human, giving it instructions like a human, you don't have to ask it to role-play, just treat it as your best collaboration buddy ever. The reason people don't get consistent results is they simply don't know how to ask for what they want, let alone what they actually want.
Knowing the steps of how to get what you want and being able to ask for things in stages as you would any project you manage is the best way to get what you want out of AI. That's why although jobs may go in the execution phase, those of us who have built a wealth of experience on knowing the how to do something will in fact reap more rewards than ever.
youtube
AI Moral Status
2025-03-28T15:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | virtue |
| Policy | industry_self |
| Emotion | approval |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_Ugx0p3nL4gVeJ_duZod4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugz_CosxLY8aD7wVu894AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugx7NpPeMtzEJhlJd1x4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy48HKq9fahe1RsFdx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgxfGSmGWOjM8sWm0Pp4AaABAg","responsibility":"user","reasoning":"virtue","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgwtmLvi65Z2W_vgPkh4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugwkg8-SAcV8xKvFvZN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugwl8zbcg-t1mtq1QrV4AaABAg","responsibility":"user","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_UgwmQvcZNEtd6OXNt7x4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugwphqor23a-rj-sA5B4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
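A coded comment can be located in the raw response by parsing the JSON array and keying each record on its `id` field. A minimal sketch in Python, where the `raw` string is an abbreviated stand-in for the full response above (only two of the ten records are reproduced):

```python
import json

# Abbreviated stand-in for the raw LLM response shown above.
raw = """[
  {"id":"ytc_UgxfGSmGWOjM8sWm0Pp4AaABAg","responsibility":"user","reasoning":"virtue","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_Ugwl8zbcg-t1mtq1QrV4AaABAg","responsibility":"user","reasoning":"deontological","policy":"ban","emotion":"fear"}
]"""

def index_by_id(response_text: str) -> dict:
    """Parse the model's JSON array and key each coding record by its comment ID."""
    records = json.loads(response_text)
    return {rec["id"]: rec for rec in records}

codings = index_by_id(raw)
coding = codings["ytc_UgxfGSmGWOjM8sWm0Pp4AaABAg"]
print(coding["responsibility"], coding["reasoning"])  # prints "user virtue"
```

The same lookup yields the four dimensions shown in the Coding Result table (responsibility, reasoning, policy, emotion) for any comment ID present in the response.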