Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "@georgelionon9050 I get the and let's take theory of consciousness being a spect…" (ytr_Ugx5vNvQD…)
- "Platooning is worse that an automated truck. Trucks weigh 80,000lbs, 3 trucks we…" (ytc_Ugxk0f5WN…)
- "Becaus3 the data and ai companies donate a lot of money. Thats why us does this…" (ytc_UgzVoKzvk…)
- "Ai art is basically Art without reason, There is no actual sense of purpose of w…" (ytc_UgwQ0UwsT…)
- "Americans have something seriously wrong in their heads. The police escalate sit…" (ytc_Ugx6OWKUK…)
- "It's incredible that you've spent almost no time on how companies don't use AI b…" (ytc_UgxpkZQ_g…)
- "But AI can load up the truck at the warehouse while talking to the inventory man…" (ytr_Ugw2T0PHR…)
- "No, no, the point is that *everything anybody* gives to the AI reveals who you a…" (rdc_jkstatq)
Comment

> I was thinking of using AI chatbot as a poditive reinforcment for me but you are explaining my problem. If i tell something stupid, AI will remember that, if OpenAI decides to put advertisments then when i say for example that my foot is itchy i will get reccomendations for foot cream elsewhere. I would like to run a chatbot locally but it might be computationally too expensive

Source: youtube · Video: AI Moral Status · Posted: 2024-09-01T01:0… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
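One coded comment, as shown in the table above, could be modeled as a small record type. This is a sketch under assumptions: the class and field names are hypothetical (not the tool's schema), and the comment ID is a placeholder.

```python
from dataclasses import dataclass

@dataclass
class CodingResult:
    # Dimensions from the coding table; names are assumptions.
    comment_id: str      # placeholder ID, not the real one
    responsibility: str  # e.g. "company"
    reasoning: str       # e.g. "consequentialist"
    policy: str          # e.g. "liability"
    emotion: str         # e.g. "fear"
    coded_at: str        # ISO 8601 timestamp of the coding run

r = CodingResult("ytc_example", "company", "consequentialist",
                 "liability", "fear", "2026-04-27T06:26:44.938723")
print(r.emotion)  # → fear
```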
Raw LLM Response
```json
[
{"id":"ytc_UgwPE0Fq5ysAdVKqA2h4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxLYYZhlhDt-P261l54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgxcB-o_DVAGxEtp3sZ4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"resignation"},
{"id":"ytc_UgyNzMF9XuRVCKMzlCt4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgycZhp_skDAtJuMPp14AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgxY7V_ViSbzsueC1M14AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugw68jFR6CzNVGTV_Qd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwTiLVeuSGDVXyACM94AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugy8BFk4n-02rgOn6O94AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgzsDRKJV4dxE-ZUr9J4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"}
]
```
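A raw batch response like the one above can be parsed and sanity-checked before use. A minimal sketch, assuming the allowed category values are those visible in the sample response (the full codebook may contain more); the function name and the rejection behavior are illustrative, not the tool's actual pipeline.

```python
import json

# Allowed values per dimension, inferred from the sample response above.
ALLOWED = {
    "responsibility": {"none", "company", "distributed", "user",
                       "ai_itself", "developer"},
    "reasoning": {"unclear", "consequentialist", "mixed",
                  "deontological", "virtue"},
    "policy": {"none", "liability", "regulate", "industry_self"},
    "emotion": {"indifference", "fear", "resignation", "outrage",
                "approval", "mixed"},
}

def parse_response(raw: str) -> list[dict]:
    """Parse a raw LLM batch response; reject out-of-vocabulary codes."""
    records = json.loads(raw)
    for rec in records:
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: bad {dim}={rec.get(dim)!r}")
    return records

raw = ('[{"id":"ytc_x","responsibility":"company",'
       '"reasoning":"consequentialist","policy":"liability",'
       '"emotion":"fear"}]')
print(len(parse_response(raw)))  # → 1
```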