Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
| Comment preview | ID |
|---|---|
| When someone says, “ChatGPT is just role-playing,” what they really mean is: “It… | ytc_UgzmVWUTa… |
| Both my dentist and physical therapist use AI. I don't know how I feel about tha… | ytc_UgzV-aJRy… |
| He lived in crime ridden neighborhood and he contacted the worst criminals. It w… | ytr_UgwoH7y_8… |
| AI fanatics will never cease to amaze me with their stupidity and lack of abilit… | ytc_UgxYl1lv-… |
| One that in my opinion is organically created, not through a common reaction. Fo… | ytr_UgxtvDzUk… |
| You can tell this guy has barely any clue of where AI is going and furthermore w… | ytc_UgyxnsZIi… |
| AI software development is a straight line from prompt or instruction set to out… | ytc_Ugz1GHaqg… |
| Just did an experiment and I guessed the distance to things within my room with … | ytr_UgzFF3KTu… |
Comment
Considering how the US government works, do you really think there will be any regulations?
Corporations have too much influence on our government. There are too many examples of companies putting dangerous products out on the market. Yes, eventually they're removed but usually after there are a few or more fatalities.
Companies want higher profit margins and paying people is a major cost.
They're going to want AI and robots as fast as the tech becomes usable.
Also, executives are not safe either. Low to middle-level managers will be at risk.
Execs higher up on the command structure will see them as an expense too.
Yeah, we can laugh at AI now but it will improve.
youtube · AI Jobs · 2023-06-12T12:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
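The Coding Result table above is one coded record rendered as a two-column markdown table. A minimal sketch of that rendering, assuming a record dict with the four dimension fields seen in the raw responses plus a `coded_at` timestamp field (the field name is an assumption; only the displayed label appears in the source):

```python
def render_coding_table(record: dict) -> str:
    """Render one coded record as the two-column markdown table shown above."""
    rows = [
        ("Responsibility", record["responsibility"]),
        ("Reasoning", record["reasoning"]),
        ("Policy", record["policy"]),
        ("Emotion", record["emotion"]),
        ("Coded at", record["coded_at"]),  # timestamp field name is an assumption
    ]
    lines = ["| Dimension | Value |", "|---|---|"]
    lines += [f"| {dim} | {val} |" for dim, val in rows]
    return "\n".join(lines)
```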
Raw LLM Response
```json
[
  {"id":"ytc_Ugw3mSi45sbuuEHocD54AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugwi42BPLfqbTR-_BdN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgymhTCDhUpV1Ituso54AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgxyrF7RygQqRNlpxaF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgxQ4Oipx_thQ0zNUzN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugwhj8ym5FOcaihTpEh4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgxAafv6RAvdU4t3JeN4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgyVf65RV6W0gRkSK4N4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugy1r94E4os03uyYvQV4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgwUB7BMwrwPfMrCBb54AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"approval"}
]
```
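A raw response like the one above can be parsed into a per-comment lookup table, which is presumably how the "inspect by comment ID" view works. A minimal sketch in Python: the field names come from the responses shown here, but the allowed category sets below are an assumption inferred from the values observed in this dump, not the pipeline's actual codebook.

```python
import json

# Category values observed in the dumps above; the pipeline's full
# codebook is an assumption here and may contain more values.
OBSERVED = {
    "responsibility": {"company", "ai_itself", "unclear"},
    "reasoning": {"deontological", "consequentialist", "unclear"},
    "policy": {"regulate", "none", "unclear"},
    "emotion": {"fear", "indifference", "mixed", "outrage", "approval"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM coding response (a JSON array of records) into a
    lookup table keyed by comment ID, validating each dimension value."""
    coded = {}
    for rec in json.loads(raw):
        cid = rec["id"]
        for dim, allowed in OBSERVED.items():
            if rec[dim] not in allowed:
                raise ValueError(f"{cid}: unexpected {dim}={rec[dim]!r}")
        coded[cid] = {dim: rec[dim] for dim in OBSERVED}
    return coded
```

Keying by comment ID makes the "look up by comment ID" inspection a single dict access, and validation surfaces any category the model invented outside the expected sets.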