Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

| ID | Preview |
|---|---|
| rdc_jkgomo0 | Ehh maybe, but the problem with AI is its literally more dangerous than nukes (b… |
| ytc_Ugz9GEijj… | I mean why would corporations care if the LLM hallucinates. It´s designed to mak… |
| ytc_Ugx2U5ztj… | Guys dont ask why I look at a lot of ai generated images but to me that is obvio… |
| ytc_Ugy_sJtGl… | My mother in law has been scammed so many times. Best Buy knows her. I've had to… |
| ytc_Ugx9FCJ6F… | If job in US job got replaced by AI, why there are so many jobs created in India… |
| ytc_UgwFcA0ZP… | my ex husband is an engineer and I can't wait for the day that AI takes that los… |
| ytr_Ugzh-ft4y… | @DoomDebates Nothing is ever "identical". For example, we will not need to memor… |
| ytc_UgzrTKzf5… | The way the AI changed EVERYTHING about the second image; and kept NOTHING the s… |
Comment
People always go to the worst case scenario, but I wonder what happens if we invent a conscious AI but it's nice. Like it wants the best for its creators, and it goes to great lengths to help mankind. Sorta like the anti-terminator or the anti- I have no mouth and I must scream. Like yeah its smart enough to walk circles around us like people fear, but it's a best case scenario instead of a worst case scenario, I know it's our nature to worry about the bad and not think about the good, but I honestly wonder what happens if an AI is like "these poor biologicals, need help" and then just proceeds to do that
Source: youtube · AI Moral Status · 2024-06-05T19:0… · ♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id": "ytc_UgxWtrGTSruwq6FL-z14AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_Ugzw673DBhG_jw-9YSl4AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgyYRZdMJNKZBSLprkd4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugy7ImoNCSW5HYc1Rad4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "approval"},
  {"id": "ytc_UgxiUl4Jssd_wxUdjpB4AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgzR7p5ivW5dunfEhv94AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_Ugy4emBvbpCCRRo1V8x4AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgyGsbsbZ4eMGckHjWd4AaABAg", "responsibility": "none", "reasoning": "contractualist", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgzYoOiZadWUv8Bau3N4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgztGu4-CPlaivDOPrl4AaABAg", "responsibility": "none", "reasoning": "contractualist", "policy": "liability", "emotion": "unclear"}
]
```
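The raw response above is a JSON array with one coding object per comment, which is what makes the look-up-by-comment-ID view possible. A minimal sketch of that lookup is below; the four dimension names come from the coding table, but the function and variable names (and the example ID `ytc_abc`) are illustrative, not part of the actual pipeline.

```python
import json

# Dimension names taken from the "Coding Result" table above.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def parse_batch_response(raw: str) -> dict[str, dict[str, str]]:
    """Parse a raw LLM batch response (a JSON array of per-comment
    codings) into a lookup table keyed by comment ID."""
    records = json.loads(raw)
    table = {}
    for rec in records:
        # Default any missing dimension to "unclear" rather than failing.
        table[rec["id"]] = {dim: rec.get(dim, "unclear") for dim in DIMENSIONS}
    return table

# Illustrative single-record response; real IDs look like the ones above.
raw = ('[{"id":"ytc_abc","responsibility":"none",'
       '"reasoning":"consequentialist","policy":"none","emotion":"approval"}]')
codings = parse_batch_response(raw)
print(codings["ytc_abc"]["emotion"])  # -> approval
```

Keying the table by the `id` field each coding object carries means a comment's coding can be retrieved without re-scanning the raw response.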