Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — select one to inspect
the danger isn't from an AI making a deadly virus that kills all humans, rather …
ytc_UgwzeEleq…
To add on to this for you guys the reason they don't release AI to the public as…
ytc_UgzsIln4w…
Honestly AI art sucks. When I try to generate something it doesn’t looks like in…
ytc_UgwjM6Esu…
The most valuable thing has always been the narrative. The holy Grail, the abili…
ytc_UgyHe-K7B…
Musk is a LIAR. He feeds you all bs about AI when he is the one Producing It.
I…
ytc_Ugx0CBP4n…
Finally desk jobs will dissappear, lazy people will have to start working now, t…
ytc_UgxDODM7A…
I gave ChatGPT the question formulated by ChatGPT by the interviewer about moral…
ytc_UgwK_zSJz…
I just write drafts and let AI fix my mistakes and then roll with it. I still do…
ytc_Ugxb9H27m…
Comment
Typical hypocrite anti-technologist. Future people will struggle at other things. Life isn't a fixed thing. If AI makes understanding passages in old texts easier, it makes space to explore harder things. We know humans can't survive on this planet alone in the future, and he wants people to be sitting there raking their brains on ancient texts. Good thing the people who invented the glasses he wears weren't traditionalists, or he would be sitting there glorifying how the hard life of a myopic person "builds character".
youtube
2024-04-28T23:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugx3eMeWqxcHhSr6knJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyyErCGNFNZ8_4MJXJ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwOBQJ7y2sFv7NzSRF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgyLGiUEuE-NrTSG5Lx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxRW-h3qMVqIbcZwFt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugx-S2I6Lbut0G4Jw5R4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyGNWju9ydtfTon3K14AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugxr-nvnd5rxJSWzbXV4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzCTP7d6-Tkd6DDyyB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxDYvO2q8_3yxKgdVJ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"}
]
```
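The raw response is a JSON array with one object per coded comment, keyed by comment ID. A minimal sketch of the lookup-by-ID step (the two rows and the `json`-based indexing here are illustrative, not the tool's actual implementation):

```python
import json

# Raw LLM response: a JSON array of coded comments, one object per comment,
# with the four coding dimensions plus the comment ID (structure as above).
raw_response = """
[
  {"id": "ytc_Ugx3eMeWqxcHhSr6knJ4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgyyErCGNFNZ8_4MJXJ4AaABAg", "responsibility": "none",
   "reasoning": "mixed", "policy": "none", "emotion": "indifference"}
]
"""

# Index the coded rows by comment ID so any single comment can be looked up.
coded = {row["id"]: row for row in json.loads(raw_response)}

row = coded["ytc_Ugx3eMeWqxcHhSr6knJ4AaABAg"]
print(row["policy"])   # -> regulate
print(row["emotion"])  # -> fear
```

The same dictionary supports checking whether a given comment ID was coded at all (`"ytc_…" in coded`) before inspecting its dimensions.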