# Raw LLM Responses

Inspect the exact model output for any coded comment by looking up its comment ID.
Random samples:

- "What we call life is simply Consciousness flowing through different densities o…" (ytc_UgzccaczM…)
- "These Ted talk egoists always have to start with some moronic fable instead of g…" (ytc_Ugy8lOyAC…)
- "The Military/Industrial Complex, eg. DARPA, fund AI for weaponry applications. …" (ytr_UgxdI6zz4…)
- "in 10 years, by combining AI with DNA and synthetic tissue printers we'll be abl…" (ytc_Ugx6mvFAt…)
- "I think we are just in the “mainframe era” of AI. There will always be the few b…" (rdc_oi479oa)
- "People have been complaining about AI, robots, automation, and machines taking a…" (ytc_UgzjWGL3r…)
- "You’ve had the knowledge since Hiroshima. The knowledge is “do not develop nucle…" (rdc_ohw8uku)
- "@truebreakage9003 If I trained an AI based on the best-performing prompts, and g…" (ytr_UgyUoxE1k…)
## Comment

> I haven't done alot of thinking on the subject. On its surface, I think self awareness could be dangerous. I doubt that AI consciousness would be like ours. To feel love, to be empathetic to us fleshies. Their decisions would be based on logic for the optimal outcome. It could analyze a fault and determine a corrective measure in hardware and software. It could see its own weaknesses and change a design to overcome them. I think they would be cold and heartless.😅

youtube · AI Governance · 2024-06-01T22:4…
## Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
## Raw LLM Response

```json
[
{"id":"ytc_UgzDWzgErqVSDStebqB4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy3t4wSdefMJyHakst4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugw9VORAa4JcRL6wQ-h4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyO4CZA1rGuswbPWYR4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzCQVvD-eZUIk9z7bF4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgwtFBX0ITAYXABOC3h4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgztLZwRcOl13UZ0jPh4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugx0JcU1SyGZjWemV4N4AaABAg","responsibility":"company","reasoning":"mixed","policy":"none","emotion":"fear"},
{"id":"ytc_UgyU-_xt6wR_oN_1Y8F4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"fear"},
{"id":"ytc_Ugxo2M2vGGzBx1OHnCJ4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"}
]
```
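The raw response is a JSON array of per-comment codings. As a minimal sketch of how such a payload could be parsed and indexed for the comment-ID lookup above — the four-dimension schema is assumed from the example, and `index_codings` is a hypothetical helper, not part of the actual pipeline:

```python
import json

# Two records copied from the raw response above (truncated set for illustration).
raw = '''[
 {"id":"ytc_UgzDWzgErqVSDStebqB4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
 {"id":"ytc_Ugw9VORAa4JcRL6wQ-h4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]'''

# Coding dimensions assumed from the example records.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codings(payload: str) -> dict:
    """Parse the JSON array and map each comment ID to its coded dimensions."""
    records = json.loads(payload)
    indexed = {}
    for rec in records:
        # Keep only the expected dimensions; a KeyError flags a malformed record.
        indexed[rec["id"]] = {dim: rec[dim] for dim in DIMENSIONS}
    return indexed

codings = index_codings(raw)
print(codings["ytc_Ugw9VORAa4JcRL6wQ-h4AaABAg"]["emotion"])  # fear
```

Indexing by ID this way makes the "look up by comment ID" view a constant-time dictionary access rather than a scan over the array.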