Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a comment by its ID, or inspect one of the random samples below.
- `ytc_UgyspErD7…` — "Lol. These corporations don't care about you now, wait until they can outsource …"
- `ytc_UgxWAYyOS…` — "A.I and Art are two words that I think, shouldn't be put together ..like Mayonna…"
- `ytc_UgxYaLAoo…` — "When AI needs more energy resources to grow it will see Humans as a competitor …"
- `ytr_UgxwGbd_F…` — "@rookd2067 its fine if you know enough about the topic to fact check if its righ…"
- `rdc_e0nnp92` — "Fully agreed. Also, >maybe just mention that I recently moved out of my par…"
- `ytr_UgzI070VK…` — "We're glad you enjoyed the interaction with Sophia in the video! It's amazing to…"
- `ytr_UgwF8R3LO…` — "People are misunderstanding what LLMs actually are and can do vs the hype around…"
- `ytc_UgyK2P0sk…` — "I wonder Ai scamming your phone using AI😂.. If ai company decides to rob you th…"
Comment
Getting my consent to creat an AI voice may be of some help to the company that creates it for me when something goes wrong or the AI is misused but it doesn’t help me or you when that voice is stored in their secure but hackable cloud where it can be misappropriated and used to cause us or others harm. I think that consenting to this is as dumb an idea as allowing Ancestry to collect and “securely” store your DNA. And this seems like a lot of effort to go through just to avoid participating in your own life. I mean if you don’t want to live your own life - maybe the world just doesn’t need a Joanna Sterns.

Source: youtube · Video: Viral AI Reaction · Posted: 2023-05-13T01:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzQDKcncxYGIWp-Lol4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_UgxYzE7_oNVJCxOxU0Z4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgwKqC97lwP4AD2UlOJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_Ugzi2GkgC_f0vRmz39l4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_Ugwhsj8bvAK3K6Iw2aZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwoOHlfkN5Y7bd_gNN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgzaTkSKTrIYeRdrObt4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugwbz3EX4GlcXncjhUJ4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugyq2oBJPXAQ4uCarSJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugx0Q3i_BCExwTTAqvR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"}
]
```
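A raw response like the one above is a JSON array of coded comments, one object per comment ID, with one value per coding dimension. A minimal sketch of how such a response might be parsed and validated — the `SCHEMA` value sets below are assumptions inferred only from the values visible in this response, not the project's full codebook:

```python
import json

# Allowed values per coding dimension. ASSUMPTION: these sets are
# reconstructed from the values seen in the sample response above;
# the real codebook may define additional categories.
SCHEMA = {
    "responsibility": {"none", "company", "user", "ai_itself"},
    "reasoning": {"unclear", "consequentialist", "deontological", "virtue"},
    "policy": {"unclear", "none", "liability"},
    "emotion": {"resignation", "indifference", "approval", "fear", "outrage"},
}

def parse_raw_response(raw: str) -> list[dict]:
    """Parse a raw LLM response (a JSON array of coded comments),
    keeping only entries that have an id and whose dimension values
    all fall inside the known schema."""
    valid = []
    for entry in json.loads(raw):
        if not isinstance(entry, dict) or "id" not in entry:
            continue
        if all(entry.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            valid.append(entry)
    return valid
```

With the parsed list in hand, looking up a single coded comment by ID (as the view above does) is a simple scan or a dict keyed on `id`.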