Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- ytc_UgxZZ6_o6…: "I might have used ai art for my discord pfp (I swear to god if my reply section …"
- ytc_UgwSGhSZx…: "Alex forcing AI to discover consciousness. We are all witnessing the start of an…"
- ytc_Ugyl1HMc3…: "A robot has all the AI tools built into it if favorite I am Elon Musk can build …"
- ytc_Ugwgaa9KN…: "I mean there are populations of people that are being wiped off the map right no…"
- ytc_Ugy3WwzRo…: "i dont really have a problem with ai as long as you dont sell it or sum shit cuz…"
- ytc_UgxJAxNah…: "Well there is a better way to it it’s Hi chatGPT. Every time that i ask 'activat…"
- rdc_n5gjmts: "I work with a lot of lawyers at all levels from juniors, seniors associates and …"
- ytc_UgyGN_Fa0…: "Using AI as an assistive tool in the mdical field is absolutely mind boggling. T…"
Comment
“I realized that silicon valleys technology incentive structures for producing technology were not actually leading us to develop technologies in the public interest and in fact most often it was leading to technologies that were eroding the public interest and the problems like mitigation of climate change that I was interested in were not profitable problems.”
100% true.
At least Wall Street is upfront about chasing profits and there are rules to keep it in check. Tech, on the other hand, has way less regulation, prides itself on breaking the rules, and masks its profit drive with talk about changing the world. If AI is going to impact us this much, why are we racing ahead instead of slowing down to really think about where it's all going?
youtube · Cross-Cultural · 2025-07-08T05:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | resignation |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_UgzgCjWEWn3nwxO18dZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
{"id":"ytc_UgzeSa34AzRR3BqsezV4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzHrUhp2S9rgeLX9LV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgwPM67J05qKOuDR7gR4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"indifference"},
{"id":"ytc_UgwXg0wzam1BoRPjtHZ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzatlZ_wj8XqhsoAAh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"},
{"id":"ytc_Ugy6G2dG9xyAy459tzB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_Ugyo9vCddnTOdaNMmsR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"},
{"id":"ytc_Ugz5SqtzXdxwhAtVrTF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwlVHlTjl7oxnMontd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"}
]
```
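For illustration, a raw batched response like the one above can be parsed and indexed by comment ID to recover any single coding result. This is a minimal sketch, not the tool's actual implementation; the variable names and the two-row sample payload are assumptions for the example.

```python
import json

# Hypothetical raw model output: a JSON array, one coding object per comment,
# shaped like the "Raw LLM Response" shown above (shortened to two rows).
raw_response = """
[
  {"id": "ytc_UgzgCjWEWn3nwxO18dZ4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "resignation"},
  {"id": "ytc_UgzeSa34AzRR3BqsezV4AaABAg", "responsibility": "none",
   "reasoning": "mixed", "policy": "none", "emotion": "outrage"}
]
"""

rows = json.loads(raw_response)

# Build an id -> coding map so a single comment can be looked up directly.
by_id = {row["id"]: row for row in rows}

coding = by_id["ytc_UgzgCjWEWn3nwxO18dZ4AaABAg"]
print(coding["policy"])   # regulate
print(coding["emotion"])  # resignation
```

Indexing by ID is what makes the per-comment inspection view cheap: one `json.loads` over the batch, then constant-time lookup for whichever comment is being inspected.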