Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples (click to inspect):

- "Don’t know what’s going on with this woman. Her frantic vibe got Waymo all confu…" (ytc_UgzXQ-3pv…)
- "I don’t approve of the way it’s going about but it is an interesting tool. I fee…" (ytc_UgxedF4T-…)
- "Elon Musk disagrees with Sam Altman a lot on ethics of AI so it has to be Sam.…" (ytc_Ugw-rv7F7…)
- "Obviously the answer to all this is 42. The problem is we don't know how to pro…" (ytc_UgzlccIz6…)
- "My biggest problem with autonomous vehicles is that there is no way to know if t…" (ytc_UgwGWJXNt…)
- "I set 3 ai models out free on the internet. They have learned hueman values but …" (ytc_Ugzt24Wc6…)
- "Or we could decide to not legalize slavery, rape, and torture. Why would we all…" (rdc_d3xypib)
- "So they laid off like 1 million americans in 2025 and yet i have seen no wealth …" (ytc_Ugxa8oVcg…)
Comment
> it's so childish of those you'd think great minds- "I don't like to think about the future". Or Elon raising concerns about the AI. Yet you all did it anyway for profit! And now we face the really uncertain future where it could easily go both ways. All those AI experts are talking about super intelligence threat yet they all had a big part in inventing it!

youtube · Cross-Cultural · 2025-10-17T04:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[{"id":"ytc_UgwoS0H5nVlNwwBglO94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
 {"id":"ytc_Ugy9ZZvO3uwrZtK2_bB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
 {"id":"ytc_UgzL0eS_oK4fn_ma09p4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"fear"},
 {"id":"ytc_Ugx-hcVUGw5Hj60bo914AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
 {"id":"ytc_UgxQ_xibb3Vd28YiY0p4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
 {"id":"ytc_UgyRcHbj1zBGzzap-I14AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
 {"id":"ytc_Ugwf6Q3Tz6A5a1fAwGN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
 {"id":"ytc_UgyKmyhTi9Y94NO0PTZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
 {"id":"ytc_Ugy18I48Hf6Oep4TEbp4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
 {"id":"ytc_UgxUuZoCGO1BXXmna8Z4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"}]
```
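The model codes comments in batches and returns one JSON array per batch, so recovering the coding for a single comment means indexing the array by `id`. Below is a minimal sketch of that lookup, assuming only the field names visible in the response above; the two embedded records are copied from it, and the `codes_by_id` helper is illustrative, not part of any real tool.

```python
import json

# Two records copied from the raw batch response shown above.
raw_response = '''[
  {"id": "ytc_Ugy18I48Hf6Oep4TEbp4AaABAg",
   "responsibility": "developer", "reasoning": "deontological",
   "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgyKmyhTi9Y94NO0PTZ4AaABAg",
   "responsibility": "company", "reasoning": "consequentialist",
   "policy": "liability", "emotion": "fear"}
]'''

# Index the batch by comment ID so one coded comment can be looked up directly.
codes_by_id = {row["id"]: row for row in json.loads(raw_response)}

# Fetch the coding for the comment displayed on this page.
code = codes_by_id["ytc_Ugy18I48Hf6Oep4TEbp4AaABAg"]
print(code["responsibility"], code["policy"])  # developer regulate
```

In practice a parser like this would also check that every requested comment ID actually appears in the returned array, since batch LLM responses can silently drop or duplicate items.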