Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a comment directly by its ID.
Random samples — click to inspect:

- "funny how this is a trick question.... you seriously believe we would fall for i…" (ytc_UgxPdIPZu…)
- "What gives any human the right to say humans are actually conscious and not just…" (ytc_UgxURFi6_…)
- "What is feeling anyway? Simply an alert system wired into neurotransmitters to e…" (ytc_UgxXSl6Zf…)
- "If AI got to the point that it could take care of us (and itself) completely, gr…" (ytc_UgznhfIi8…)
- "Yeah, it is not about AI, is about humans and what they will do with it. We are …" (ytc_UgwJG9WvS…)
- "Seatbelt enforcement has nothing to do with safety! The seat belt was always the…" (ytc_UgxxPT1tn…)
- "I work making AIs (mostly for medical and security stuff) and being completely r…" (ytc_UgwvA6bJt…)
- "You can't hide from innovation and pretend technology isn't there. A smart writ…" (ytc_UgzjZb9aW…)
Comment

> Let me tell you the scary reality. AI Assistants like Google home and Alexa will collect data over you and will easily be able to make accurate predictions about what you're about to do, say, want, need, succeed at, fail at etcetera if equipped with a powerful enough engine. Theoretically, if I ask you every question in existence somehow, and feed that data to an AI and let it machine learn. It might just be able to clone you to a dot. So with the amount of data inflow, big companies will first start using it to predict markets and use it in the advertisement sphere. But as computers get more powerful, They will be able to pinpoint what you want and bombard you with laser accurate ads. Privacy is over as we know it.

youtube · AI Governance · 2023-05-22T18:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_Ugxm5YQ-_F0Hvts2upB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxSJLuBNbOl1NA-ZAt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzemA9-17sPW_nlyrx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugx4YMydgax0HyvajFp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxuPKztebOz0BdY81t4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugzvcysyk8NlKTGws5h4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwCS4ppGrWAbSkghvB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugw7U3sV3Z3p8WPeu394AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzRuVCXlSHduhgaplt4AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzPqJdP4re1hABE5ox4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
```
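The raw response is a JSON array of per-comment codes, so looking a comment up by ID amounts to parsing the array and indexing it. A minimal sketch of that lookup, assuming the same four-dimension shape shown above (`responsibility`, `reasoning`, `policy`, `emotion`); the IDs and helper name here are illustrative, not the tool's actual API:

```python
import json

# Illustrative raw LLM response with hypothetical comment IDs,
# mirroring the {id, responsibility, reasoning, policy, emotion} shape above.
raw = '''[
  {"id": "ytc_example1", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_example2", "responsibility": "company",
   "reasoning": "deontological", "policy": "none", "emotion": "outrage"}
]'''

def index_by_id(response_text):
    """Parse a raw LLM response and map comment ID -> coded dimensions."""
    rows = json.loads(response_text)
    return {row["id"]: {k: v for k, v in row.items() if k != "id"}
            for row in rows}

codes = index_by_id(raw)
print(codes["ytc_example1"]["emotion"])  # fear
```

Building the index once makes repeated ID lookups O(1) instead of rescanning the array for every inspected comment.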