Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "I just had a thought.... Maybe AI already infected our systems and that's why ev…" (ytc_UgyaICa6K…)
- "AI only struggles with emotional intelligence but it can ascertain the most like…" (ytr_UgxocNOtJ…)
- "This is very interesting a house robot / Just like women where meant to be house w…" (ytc_UgzOSagDl…)
- "You can ALWAYS tell that the video is "made" by AI if a faraway background (spac…" (ytc_UgwcduQNT…)
- "1. Technical Reality: AI Self-Improvement and Risk / Scaling Laws: AI capability …" (ytr_UgwbcVw3U…)
- "I feel like not working for the next few years and go in vacation cause in 2-3 y…" (rdc_jghek9w)
- "Sam Altman's actions can viewed as possibly coming from two very different world…" (rdc_jkg0c65)
- "Ai cannot create anything lol. Its not intelligence. Its a program with needed i…" (ytc_Ugy0dLezZ…)
Comment
ahh the difference between modern "AI" and AGI
We currently do not have artificial "intelligence" because all of modern "AI" is just algorithms making deductions based on rules made by us from our own observations of its behavior and what we want from it, not properly being able to make its own decisions because it doesnt have desires, it does what its told and asked and uses the prompts we provide it and the information its been fed in order to provide answers for us.
All we have right now are highly complex voice assistants, not proper thinking machines
god help us if that ever actually happens
Source: youtube · AI Moral Status · 2024-08-19T02:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgzC4V3TsSx2YxEQLV14AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgxTmD_Xo0jNc0NTS_N4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyPUG6uqyKEKLT6rVx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgyxArKSsvY4gChyRsd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugwy1yTspF5LclccTTZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwtD2MOdBuWem9pCRN4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugw_dh4082fMUQ-kKAl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugx8zF39Aqj5hL3ecnl4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugw1ZPcEVFWz4Tcys1t4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyK3z2FmOqQujFgiWh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"}
]
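The raw response above is a JSON array with one object per coded comment, each carrying the four coded dimensions plus an `id`. As a minimal sketch of how such a batch could be parsed and sanity-checked downstream, the snippet below validates each row against label sets inferred from the responses shown on this page; the real codebook may define additional labels, and `validate` is a hypothetical helper, not part of the coding pipeline itself:

```python
import json

# A truncated copy of the raw model output shown above (first two rows only).
raw_response = '''
[
  {"id": "ytc_UgzC4V3TsSx2YxEQLV14AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_UgxTmD_Xo0jNc0NTS_N4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"}
]
'''

# Allowed values per dimension, inferred from the labels visible in this
# response; an actual codebook would be the authoritative source.
DIMENSIONS = {
    "responsibility": {"developer", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"ban", "none", "unclear"},
    "emotion": {"outrage", "fear", "approval", "mixed", "indifference"},
}

def validate(rows):
    """Keep only rows whose every coded dimension uses a known label."""
    valid = []
    for row in rows:
        if all(row.get(dim) in labels for dim, labels in DIMENSIONS.items()):
            valid.append(row)
    return valid

rows = json.loads(raw_response)
print(len(validate(rows)))  # both sample rows use known labels, so prints 2
```

Rows that fail validation (e.g. a label outside the codebook) could then be flagged for re-coding rather than silently written to the results table.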