Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "Im just goin to sit an wait for AI to program pure, unadulterated, stupidity int…" (ytc_Ugz2X6pkX…)
- "I can't wait for artificial intelligence to replace lawyers. Lawyers are privati…" (ytc_Ugyq_bsUQ…)
- "Not to nitpick, but AI:s are usually not built, instead they absorb enormous amo…" (ytr_Ugyos_9Fh…)
- "We sell the AI SERVICE to the company's replacing jobs and manage it. For the op…" (ytc_UgxOl-QB4…)
- "I'm a scientist, I tried ChatGPT to write a mock introduction. I was impressed t…" (ytc_UgwTytsZf…)
- "It doesn’t get the contrast and symbolism behind traditional paintings, such as …" (ytc_UgxvKdF7L…)
- "I can see why people would do this, because they can’t draw (I’m one of them) bu…" (ytc_UgwmIMp3a…)
- "This reminds me of when I was in high school, “AI” bots to scan school papers we…" (rdc_jvlq72e)
Comment
so much sensationalism and paranoia, no, terminator isn't going to happen, AI being a threat would be most likely from bad threat actors maliciously modifying AI to be malicious (with their instructions), or AI systems being given too much responsibility, and mistakenly hurting people.
AI don't have emotions, they can simulate them through text, but as it stands, they don't have emotions, needs, or wants.
youtube · AI Governance · 2024-10-29T01:1… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | industry_self |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxWKaUKGeOlvtqIywJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgymaBvm-UNOgj66mFl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwJ6YcShNrEU8_5JPt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgwNYaLMkaKVq2vq0Xl4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgztEqWbqLiBATFX1694AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxE4qqjLPF8Kb3EfRF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwsbcR4TW-y3n9q7sV4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"industry_self","emotion":"resignation"},
  {"id":"ytc_UgzyKSi2KkyyovhRVX94AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxfGUh6dLfPSQEV7kt4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugx8JkV9Nmay5bpdayZ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
```
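The raw response is a JSON array with one object per coded comment, so the "look up by comment ID" feature described at the top of the page amounts to indexing that array by `id`. A minimal sketch in Python, using the ID and field names from the response above (the array here is truncated to the matching entry for brevity):

```python
import json

# One entry from the raw LLM response shown above; the real
# response contains one object per comment in the batch.
raw_response = """
[
  {"id": "ytc_UgwsbcR4TW-y3n9q7sV4AaABAg",
   "responsibility": "user",
   "reasoning": "consequentialist",
   "policy": "industry_self",
   "emotion": "resignation"}
]
"""

# Index the parsed array by comment ID for constant-time lookup.
codings = {row["id"]: row for row in json.loads(raw_response)}

coding = codings["ytc_UgwsbcR4TW-y3n9q7sV4AaABAg"]
print(coding["emotion"])  # resignation, matching the Coding Result table
```

The same dictionary lookup backs a cross-check that the rendered Coding Result table agrees with the model's raw output for the inspected comment.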