Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "This is what creeped out Elon Musk too when he tested his AI. So far everyone is…" (ytc_UgyMX7pND…)
- "Dawg I’m hot ass at drawing but using AI thats low I prefer to hire or commissio…" (ytc_UgwSLONhg…)
- "Saw ai initialize an integer to zero today by taking the size of a newly declare…" (rdc_n7hu5cg)
- "Data centers are becoming a real environmental threat, especially when they’re p…" (ytc_Ugyb3TiSG…)
- "you forgot the one that.. Becomes a sibling with an ai then has them kill you th…" (ytc_UgywFHllr…)
- "Pretty certain just 2-3 months someone on here was trying to convince me that AI…" (rdc_mtgexap)
- "So does Alex tell the LLM beforehand to find a way to plug the sponsor a certain…" (ytc_Ugz1zTm_B…)
- "@heatfromsapphire AI is not trying to replace anyone. It has no consciousness of…" (ytr_UgycDfkEi…)
Comment
In this context, neither Stephen Hawking nor Elon Musk qualify as "experts". Hawking was a brilliant theoretical physicist, but had even less experience with AI than anyone whose ever worked with a Raspberry Pi. Musk actually has a little experience with AI thanks to his work with Tesla, but calling him an expert in the field is like calling someone who can put together a water filter an organic chemist because it contains carbon. In fact, I've never heard this kind of doom mongering from anyone who could legitimately be called an expert in the field.
Source: youtube
Timestamp: 2018-04-03T21:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | deontological |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
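For anyone consuming exported coding results outside this viewer, a minimal sketch of one record is below. It assumes the four dimensions shown in the table plus the comment ID carried in the raw response; the class name, field names, and value sets are illustrative, drawn only from the values observed on this page rather than from the project's actual codebook.

```python
from dataclasses import dataclass

# Value sets observed in this page's coding results; illustrative only,
# not an exhaustive codebook.
RESPONSIBILITY = {"none", "user", "developer", "ai_itself", "distributed", "unclear"}
REASONING = {"deontological", "consequentialist", "mixed", "unclear"}
POLICY = {"none", "liability", "unclear"}
EMOTION = {"fear", "indifference", "resignation", "mixed"}


@dataclass
class CodingResult:
    """One coded comment: the ID plus the four coded dimensions."""
    comment_id: str
    responsibility: str
    reasoning: str
    policy: str
    emotion: str

    def is_valid(self) -> bool:
        # True only if every dimension holds one of the observed values.
        return (
            self.responsibility in RESPONSIBILITY
            and self.reasoning in REASONING
            and self.policy in POLICY
            and self.emotion in EMOTION
        )
```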
Raw LLM Response
[
{"id":"ytc_UgwM5aZIxWW4j5iDuXx4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugxl09f6L7Rj-RKTPZF4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzieqUhMRMtOB8uQh14AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzlxnxQw99WAmfHehF4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzZoRdIrjkSS-bAyZ54AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyqukX4nqlg8PxxK0F4AaABAg","responsibility":"distributed","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzjQP_1zTltW_9IU5l4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwH2zhdVVSb5TK2kxh4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyHwfpnimaOHxKVZVF4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwpUlJEuK97Bnz92U54AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"}
]
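The response is a plain JSON array with one object per coded comment, which is what makes the "Look up by comment ID" view possible. A minimal sketch of that lookup is below, assuming only the structure shown above; the function name and the inline sample are illustrative, not part of the actual tool.

```python
import json
from typing import Optional


def lookup_coding(raw_response: str, comment_id: str) -> Optional[dict]:
    """Parse a raw LLM response (a JSON array of coded comments) and return
    the coding for the requested comment ID, or None if it is not present."""
    for record in json.loads(raw_response):
        if record.get("id") == comment_id:
            return record
    return None


# Example with one record copied from the batch above.
raw = (
    '[{"id":"ytc_UgzZoRdIrjkSS-bAyZ54AaABAg","responsibility":"none",'
    '"reasoning":"deontological","policy":"none","emotion":"indifference"}]'
)
print(lookup_coding(raw, "ytc_UgzZoRdIrjkSS-bAyZ54AaABAg"))
# -> {'id': 'ytc_UgzZoRdIrjkSS-bAyZ54AaABAg', 'responsibility': 'none', ...}
```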