Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- "I would be very interested to know if the root of the argument is the 'construct…" (ytc_UgzCdHWyb…)
- "The thing about AI is they generate fast but they also get stale fast, kinda lik…" (ytc_UgyUS0poG…)
- "the only time I've used AI to generate images is when I was testing what slop it…" (ytc_UgwgOVDgN…)
- "Medicine is essentially a huge spaghetti of if-then statements that doctors have…" (ytc_UgxXwnSnS…)
- "Turn the temperature of a model down to 0 and you'll get a better understanding …" (rdc_mlhpmqk)
- "In the short term the AI winner will take over ALL industry. We are setting ours…" (ytc_UgypJNPTp…)
- "I don't think the human race is responsible enough for AI at this point in time…" (ytc_UgzzX9g_x…)
- "I’ve never really fully understood the “A.I. art is stealing” argument. I haven’…" (ytc_Ugx3RIUXD…)
Comment
Yeah, That is true. IT programmers will rise and make money. Here’s the Issue, There will always be IT Hackers too. Just Imagine what will happen if these Robots or Automation machines get Hacked by the new Cons. That will be insane of how much these Robotic Criminals could get away more than actual Humans. So, Yeah; There’s gonna be always downfalls to relying too much on Technology. Sure, Humans getting replaced by robots is one thing. But that’s only the Beginning. If this could go to the wrong hands of a malicious Hacker, things can be Chaotic or head to another negative Direction. Humans will be even lazier by then… Yeah, We are living closer to an Era kind of like that Famous Sci-Fi movie starting with Will Smith “i-Robot” they Predicted this type of issue in that movie in 2035 AD. We are in the year 2022 AD and we are complaining that they will replace our jobs or whatnot. Whoever made that movie predicted that someday stuff like this will become an issue.
Source: youtube
Posted: 2022-08-05T16:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzQ4ks-yI_7HLwLU154AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxoRdCS1vg8R1fz67J4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxDQO2kkkoTiZr0fSV4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugz1_ERDXvmmNKzG7hV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxDxVq8cbKl_MoD-eF4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgxiXnw1L0yxHUy8HQ14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
  {"id":"ytc_UgxfgRhKsDAAOmRlbe54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxtWN7EARGeR3jF1ld4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxQBDN7Zhxqh5M0iOd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugwc4JX2e8hvUL8-QaN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"}
]
```
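The model codes comments in batches, returning one JSON object per comment keyed by its ID; the Coding Result table above is the row whose `id` matches the displayed comment (ytc_UgxQBDN7Zhxqh5M0iOd4AaABAg). A minimal sketch of that lookup step, assuming the raw response is a JSON array of objects with an `id` field plus the four dimension fields (function and variable names here are illustrative, not the pipeline's actual code):

```python
import json

# Two rows excerpted from the raw batch response shown above.
raw_response = """[
  {"id": "ytc_UgxQBDN7Zhxqh5M0iOd4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_Ugwc4JX2e8hvUL8-QaN4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "outrage"}
]"""

def index_by_comment_id(raw: str) -> dict:
    """Parse a batch coding response and index the coded dimensions by comment ID."""
    rows = json.loads(raw)
    return {row["id"]: {k: v for k, v in row.items() if k != "id"} for row in rows}

coded = index_by_comment_id(raw_response)
print(coded["ytc_UgxQBDN7Zhxqh5M0iOd4AaABAg"]["policy"])   # liability
print(coded["ytc_UgxQBDN7Zhxqh5M0iOd4AaABAg"]["emotion"])  # fear
```

Indexing by ID rather than by position keeps the lookup robust if the model returns rows out of order or drops a comment from the batch.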