Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "I'm old enough to remember bubbles back to the beginning of Personal Computers (…" (ytc_UgxFg0ayW…)
- "What's wrong with drawing using narrative instead of pen? AI is the future, and …" (ytc_UgyyU4GfJ…)
- "Ur correct, It's DATA COMPRESSION.. using AI.. result looks the same.. yeah it'…" (ytc_UgwIrwTPe…)
- "I hate robot humans they r creepy and I feel like they will take over the world …" (ytc_UgxU0slrn…)
- "I'm a complete nutjob when it comes to computers. I'll automate all the things e…" (ytc_UgxtWe2Ov…)
- "What a lot of people and companies also don’t understand is that if AI replaces …" (ytc_UgzLZ_hjH…)
- "This is a great video for intro to philosophy students! You get to watch AI be b…" (ytc_Ugw3RMa1H…)
- "Smarter than "us"? Almost certainly, AI would have written, Smarter than we (are…" (ytc_UgxZfbXBR…)
Comment
I remember Gene Roddenberry's vision of the future from Star Trek, but that future is a long, long, way off for now!
I asked Copilot to elaborate on his vision. Here's a shortened version of what was printed out:
Gene Roddenberry envisioned a future in Star Trek where humanity had moved beyond capitalism, scarcity, and materialism. In this utopian society, technology—particularly replicators—eliminated the need for money by providing everything people needed. With no struggle for survival, people were free to pursue self-improvement, exploration, and contributions to society rather than working for financial gain.
The idea was that humans would work not because they had to, but because they wanted to contribute to the greater good.
Roddenberry’s vision stands in contrast to many dystopian sci-fi worlds, offering a hopeful perspective on human potential.
youtube · 2025-05-16T17:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxGmvs-qAx8sHcDes94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyjDh1ELQn5peJ2zX14AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugxm6dV9yBO6-aq-f-V4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugy-kjelZoXR35jhoKp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgzA7DhCecn75tWWyGV4AaABAg","responsibility":"company","reasoning":"unclear","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzkVw-s-NB56AwwOHR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwUbOyljrPvq0y6-th4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxcFnC1azWl3c1b_N54AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgzQ5ODQ20TowdYVErV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugw6uGezaxIlSUFFSH54AaABAg","responsibility":"none","reasoning":"mixed","policy":"regulate","emotion":"indifference"}
]
```
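The raw response is a JSON array with one object per coded comment, keyed by the same four dimensions shown in the Coding Result table. Below is a minimal sketch of how such a response could be parsed and sanity-checked before display; `parse_raw_response` and the `ALLOWED` value sets are hypothetical and only reflect labels visible on this page, not the full codebook.

```python
import json

# Assumed label sets, inferred from the examples above; the real codebook may differ.
ALLOWED = {
    "responsibility": {"none", "government", "company", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none", "liability", "ban", "regulate"},
    "emotion": {"fear", "resignation", "outrage", "approval", "mixed", "indifference"},
}

def parse_raw_response(raw: str) -> dict[str, dict[str, str]]:
    """Parse one raw LLM response (a JSON array of coded comments) into a
    {comment_id: {dimension: value}} mapping, flagging unexpected labels."""
    coded = {}
    for rec in json.loads(raw):
        comment_id = rec["id"]
        # Fall back to "unclear" when a dimension is missing from the model output.
        coding = {dim: rec.get(dim, "unclear") for dim in ALLOWED}
        for dim, value in coding.items():
            if value not in ALLOWED[dim]:
                # Keep the unknown label, but make the mismatch visible for review.
                print(f"{comment_id}: unexpected {dim}={value!r}")
        coded[comment_id] = coding
    return coded
```

Applied to the response above, `parse_raw_response(raw)["ytc_UgzkVw-s-NB56AwwOHR4AaABAg"]` would yield the same four values shown in the Coding Result table for the selected comment.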