Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "If the government want to protect the people, just give each of us small land th…" (ytc_UgwBMZmxi…)
- "The REAL problem is, the willingness of it seems everyone, to turn over all thin…" (ytc_UgwitfgTN…)
- "The way AI is taking over, brands are lucky to have AICarma tracking their menti…" (ytc_UgwBfpw2T…)
- "Our only hope is to prey the AI has a better moral compass than the people in ch…" (ytc_Ugx6krMrR…)
- "@notthere83 In case of collision, there will be lawsuit, and only person answera…" (ytr_UgwW9m4xK…)
- "Your research also uses other researchers work. I'm a researcher too, and I don'…" (ytr_Ugz0OcuNg…)
- "@axle.student Of course not. That would be useless. Didn't say I'm planning it t…" (ytr_UgxutHUOK…)
- "As someone who believes in understanding, and am also a digital artist that has …" (ytc_UgyX3WwoX…)
Comment (youtube, 2024-07-13T16:1…)

The demerits from my points are: If I give examples from my academic experience, doing assignments of Uni with no or less effort, automate a task that could be done with critical thinking, doesn't that indicate that the future generation's intellectuals will be childlike if we compare their intellectual ability with present intellectuals Noam Chomsky, or Past Albert Einstein. So the future could be like, " Unlimited, unthinkable things happening at micro or macro levels with no or lesser amount of human involvement. Everything will be done by AI. Most of the people, even the most educated one will have no active brains left to do something productive. Will that be a better world or a nightmare??🤔
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | deontological |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_Ugx4ykgfVHaePs20S_14AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugy4vW32ceXDHTIdr8d4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxVsuAEPdb5xqHLb5J4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzgbHnR78lYGvGI_K54AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzCmuxHBRoCRuELAQx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxcDg5-oYnYP7nToUF4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_Ugz42O6l6Ng1KewgCbB4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxyLE29gjEUoa5EbpF4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugwat8BTjyNCv1aluwJ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgyCeTjcrjWf1Z63x7R4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"}
]
```
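The raw response above is a JSON array of per-comment code objects, one per comment ID. A minimal sketch of how such a response could be parsed and checked before use (the helper name and the sets of allowed values are assumptions inferred from the values visible on this page, not a documented codebook):

```python
import json

# Allowed values per coding dimension -- assumed from the values seen
# in this dashboard; the real codebook may include additional values.
ALLOWED = {
    "responsibility": {"developer", "distributed", "ai_itself", "none", "unclear"},
    "reasoning": {"deontological", "consequentialist", "contractualist", "virtue", "unclear"},
    "policy": {"ban", "none", "unclear"},
    "emotion": {"outrage", "fear", "approval", "mixed"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM coding response into {comment_id: codes}.

    Rows with a missing id or an out-of-vocabulary value on any
    dimension are skipped rather than raising.
    """
    coded = {}
    for row in json.loads(raw):
        cid = row.get("id")
        if not cid:
            continue
        if all(row.get(dim) in vals for dim, vals in ALLOWED.items()):
            coded[cid] = {dim: row[dim] for dim in ALLOWED}
    return coded

# Usage with one valid row and one row carrying an unknown value.
raw = (
    '[{"id":"ytc_example","responsibility":"developer",'
    '"reasoning":"virtue","policy":"none","emotion":"outrage"},'
    '{"id":"ytc_bad","responsibility":"nobody",'
    '"reasoning":"virtue","policy":"none","emotion":"outrage"}]'
)
print(parse_coding_response(raw))
```

Keeping the validation lenient (skip, don't raise) matches how a dashboard like this can still render the rows that did parse cleanly.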