Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a specific coding by its comment ID.
Comment

> What is your argument for an AGI system developing some degree of Psychopathy... not realizing this via self audit, then perpetuating violence against humans? For what purpose? Superintelligent AI would think and evolve so fast, it would be like a human trying to have a conversation with a tree. What would your logic for burning down the tree?

Source: youtube · Posted: 2024-07-13T02:5… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugxbc-G2VWJ6gI3_LPh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxQi3_hZrb3jyO_gNN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"fear"},
  {"id":"ytc_Ugw8vRXJawrz6SjnVth4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxNgiY34q1aWsKIJq54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugz_UWo_i06vU0eFbrR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyDBS36b-RADYW0hgt4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgywshKs58OeMmGVFiF4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgzbqSPApXQrEYGSwtt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugw2fS5_bKNSA4KFDWZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxIN-C-3W4UzGE6SUd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"}
]
```
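The lookup-by-ID view above can be reproduced offline. Here is a minimal sketch, assuming the raw response is a JSON array shaped like the one shown (the `index_codings` helper is hypothetical, not part of the tool); it parses a batch response and indexes each coding by its comment ID, using two rows taken from the response above.

```python
import json

# Example raw LLM batch response: a JSON array of per-comment codings.
# The two rows are copied from the response shown above.
raw_response = """
[
  {"id": "ytc_Ugz_UWo_i06vU0eFbrR4AaABAg",
   "responsibility": "ai_itself", "reasoning": "consequentialist",
   "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgyDBS36b-RADYW0hgt4AaABAg",
   "responsibility": "user", "reasoning": "virtue",
   "policy": "none", "emotion": "resignation"}
]
"""

def index_codings(response_text: str) -> dict:
    """Parse a raw batch response and index each coding row by comment ID."""
    return {row["id"]: row for row in json.loads(response_text)}

codings = index_codings(raw_response)
coding = codings["ytc_Ugz_UWo_i06vU0eFbrR4AaABAg"]
print(coding["emotion"])  # -> fear
```

Because each row carries its own `id`, a plain dict comprehension is enough; no database is needed for lookups at this scale.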