Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- "Respectfully I think you are missing the point...ok example..already it puts out…" (ytr_UgwgK26Fk…)
- "It’s genuinely infuriating being on a subreddit for beginner artists and seeing …" (ytc_UgxunAPyS…)
- "This aggressive behavior... I know humans are shity bastards, but someone has th…" (ytc_Ugy8LL7vB…)
- "i m a proffesional mobile cleaner. so the ai robot should drive and clean everyw…" (ytc_Ugyl0Pmfi…)
- "We've got to a point now of such huge overproduction that there will either be a…" (rdc_d7ktnv8)
- "I feel you! I've been using AICarma to see how my brand stacks up in the AI land…" (ytc_UgzcxjR6k…)
- "It CAN be useful in a coding environment, but it’s still far too prone to making…" (ytc_Ugwc1oHIW…)
- "I drive for Uber. While uber sucks ass to work for, I only do it when I have abs…" (rdc_cym15cb)
Comment
We are currently seeing the development of a post-scarcity civilization. AI and robotics have the potential to catapult human understanding and development. Achieving this breakthrough without costing human lives or ambitions is something that needs to be studied by lawmakers, phycologists, and those developing these technologies. AI currently not understood well enough and could pose a threat to human civilization rather than transform it. AI is no longer science fiction anymore and we need to treat it the respect it demands, lest we doom ourselves.
| Field | Value |
|---|---|
| Platform | youtube |
| Topic | AI Jobs |
| Timestamp | 2025-10-08T04:1… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgzfZkJ8cBHTtnpCCZ54AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwJik_vdL9b8S4YuSx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugwr4TbIRCh224VcPJp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwDgURMwFwx1e3oPVF4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_UgxX8vQfT5Upr9qNofB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgypxPQetcZSZqSIX2B4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"regulate","emotion":"unclear"},
{"id":"ytc_Ugxg9AhNkDyMIULp6Bp4AaABAg","responsibility":"company","reasoning":"virtue","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwOSNKdrEXSWD9iX2J4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgwsD8aIWwbxC-XbaWN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgyYwLMpKBIYaQxtjcl4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"fear"}
]
```
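The batch response above is a JSON array keyed by comment ID, so retrieving one comment's coding amounts to parsing the array and indexing it by `id`. A minimal sketch of that lookup, with light validation; the `VALID` vocabularies are inferred from the values visible in this page's samples, not from a published codebook, so the real scheme may include more categories:

```python
import json

# Allowed values per dimension, inferred from the sample rows above
# (assumption: the actual codebook may define additional categories).
VALID = {
    "responsibility": {"government", "company", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "virtue", "unclear"},
    "policy": {"regulate", "ban", "liability", "none"},
    "emotion": {"fear", "outrage", "approval", "indifference", "mixed", "unclear"},
}

def index_codings(raw: str) -> dict:
    """Parse a raw batch response and index it by comment ID,
    dropping any row whose dimension values fall outside VALID."""
    out = {}
    for row in json.loads(raw):
        cid = row.get("id")
        if cid and all(row.get(dim) in VALID[dim] for dim in VALID):
            out[cid] = {dim: row[dim] for dim in VALID}
    return out

raw = ('[{"id":"ytc_UgzfZkJ8cBHTtnpCCZ54AaABAg","responsibility":"government",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]')
codings = index_codings(raw)
print(codings["ytc_UgzfZkJ8cBHTtnpCCZ54AaABAg"]["policy"])  # regulate
```

Validating against a fixed vocabulary before indexing catches the common failure mode of batch coding, where the model occasionally emits an off-schema label; such rows are skipped rather than stored.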