Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "I still get confused as to why people expect AI to be human? Or fir it to have m…" — ytc_UgzD2Sva1…
- "When deciding whether i believe "A.I will destroy us" or "A.I is a big nothing b…" — ytc_Ugwtwr63q…
- "I legit don’t understand why people use AI to try and be better than people and …" — ytc_UgxgXI7uD…
- "As an AI researcher I don’t buy the AGI/ASI being existential threat arguments. …" — ytc_Ugxq0yS22…
- "There Is NO SUCH THING AS ( ARTIFICIAL INTELLIGENCE ) Is a honey bees honey co…" — ytc_UgxDTAMSQ…
- "If someone likes a art piece made by AI or a human it is their choice if they en…" — ytc_UgwAcALH3…
- "One time I said lots and lots of inappropriate stuff to my snap ai 🤧 I hope nobo…" — ytc_UgzKRFCXX…
- "What's funny is that none of those redrawings look nearly as good as the AI one.…" — ytc_UgzUzxTmK…
Comment
> Sadly, I think the only way we are going to have any human jobs in 2030 and beyond is if AI becomes so human-like that it starts demanding equal pay and a healthy work-life balance.
> Unless we get decent legislation, to limit the capabilities of AI (it could happen, but it probably won’t, especially with all the tech billionaires throwing hush money at lawmakers)

Source: youtube · Viral AI Reaction · 2024-10-22T05:0… · ♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgwDjs1lrKPVqG-BEXR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxV-Am3ay9W5bNY9xN4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyIkHZIEVZz6sdBTe54AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_Ugz3J3fgv4jDvZq5_Ip4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugz9KbaZqIP_beEX9VJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgyoNVLAR5yN5Pv6J4d4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxC8gI-JdFSRLY2kpp4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"fear"},
{"id":"ytc_UgypSKpfO7lQUHhfTu54AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwHhBwSE80CtkUiLYl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxhYbjQEsyuPyZnbjp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
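A raw response like the one above can be validated before the codes are stored. The sketch below is a minimal example, not the tool's actual pipeline: it parses the JSON array and keeps only records whose dimension values fall in an allow-list. The allow-list here is assembled only from the values visible in this sample response; the real codebook may permit more categories.

```python
import json

# Allowed values per dimension, inferred from the sample response above.
# ASSUMPTION: the actual codebook may define additional categories.
ALLOWED = {
    "responsibility": {"none", "company", "government", "ai_itself"},
    "reasoning": {"unclear", "mixed", "deontological", "consequentialist"},
    "policy": {"none", "liability", "regulate"},
    "emotion": {"indifference", "outrage", "approval", "fear", "resignation"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response, keeping only well-formed records."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        # Each record must be an object with a comment ID plus one
        # allowed value for every coding dimension.
        if not isinstance(rec, dict) or "id" not in rec:
            continue
        if all(rec.get(dim) in values for dim, values in ALLOWED.items()):
            valid.append(rec)
    return valid

raw = ('[{"id":"ytc_example","responsibility":"government",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]')
print(parse_coding_response(raw))
```

Records with an unknown category are dropped rather than repaired, so a downstream count of valid versus total records doubles as a quick check on how often the model drifts off the codebook.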