Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a coded comment directly by its ID, or inspect one of the random samples below.

Random samples:
- `rdc_kz0v4zp`: Efficiency might lead companies to hire fewer developers, but it’s also a chance…
- `rdc_h4mytpa`: I highly doubt their biases are actively affecting their work to that extent. Wh…
- `ytc_UgwRFXttv…`: It’s funny how humans think we are of any use to AI once it becomes sentient.…
- `ytc_UgzfSnzWs…`: "Yes, Chinese AI model tokens are significantly cheaper—often one-sixth of the p…
- `ytc_Ugzww25HI…`: Seems ironic that most of the animations look like they’re done by ai. But it’s …
- `ytc_UgwHIYGqZ…`: “Truth Maximizing”?… Imagine that?🤔? But we just gotta build another brain 🧠 cal…
- `ytc_UgwHfOPWI…`: About 24:00. Talk of AI being able to do everything a human can do. Humans can l…
- `ytc_Ugxhm4yrp…`: How you selling voice AI but not using the AI voice tech to call them to make th…
Comment
> Many people argue that AI is dangerous, but the same can be said for bleach, cars, and planes. Ultimately, it's the person using the tool who is responsible, not the tool itself. AI has the potential to revolutionize health, education, and technology. Imagine not having to wait weeks for a doctor's appointment or a dentist visit. AI could care for elderly parents more effectively than humans, without getting bored or angry. It could also repair roads with unparalleled efficiency, among many other societal benefits. Remember, the intention behind splitting atoms was to create an energy source, not to make bombs. just like invention of dynamite or gunpowder. If we are doomed it is on us (Humans) not tools.
youtube · AI Governance · 2025-06-17T09:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
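The table above records one comment's codes across four dimensions. A minimal sketch of validating such a record before storing it, using category sets inferred only from the codes visible on this page (the real codebook may define more values):

```python
# Allowed values per dimension, inferred from the codes visible on this page;
# this is an assumption, not the project's full codebook.
CODEBOOK = {
    "responsibility": {"user", "developer", "government", "none"},
    "reasoning": {"consequentialist", "virtue", "unclear"},
    "policy": {"none"},
    "emotion": {"approval", "fear", "indifference", "mixed",
                "outrage", "resignation"},
}

def validate(record: dict) -> list[str]:
    """Return the dimension names whose value falls outside the codebook."""
    return [dim for dim, allowed in CODEBOOK.items()
            if record.get(dim) not in allowed]

coded = {"responsibility": "user", "reasoning": "consequentialist",
         "policy": "none", "emotion": "approval"}
print(validate(coded))  # [] when every dimension is in the codebook
```

A record with an out-of-codebook value (or a missing dimension) comes back flagged, which makes malformed LLM output easy to catch before it reaches the results table.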
Raw LLM Response
[
{"id":"ytc_UgzLdCcSLeSJhNR9zxl4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxjVdxTZRJbhHBPfAx4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgxW_db-vnJ-03vxO_B4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwkE83Ukz7yvi6LwPl4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwHV6QYSrl_P_vn47h4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"},
{"id":"ytc_UgzFiKul3Q1w5A8Aul94AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx18x5Y8NF5kmE_QtV4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugymw28te79rPVIvmJB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugxy19j-ulj3acwjizd4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugyla2HMbFrdY7gqfVB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]