Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a coding directly by comment ID.
Random samples
- @thatonemelody did you read my comment or did you use ai to generate that too… (`ytr_Ugzq5z66M…`)
- imo if you make digital music, the computer makes it with your guidance (which i… (`ytr_Ugyc2Uxa8…`)
- lol, Steven hawks a miracle drink that makes him more millions. Is he drinking l… (`ytc_UgxSbctSI…`)
- People claiming it will help better art, don’t know what it’s already doing. Peo… (`ytc_Ugyn1NFF2…`)
- Upon shutting off the switch of a 'conscious' AI digital model...the energy diss… (`ytc_UgwCNIYd-…`)
- I'm sure other Tesla owners have said this but it's very important. "Auto Pilot… (`ytc_UgynGXyWl…`)
- I agree that not everyone needs to go to college. But as far as AI goes, it's on… (`ytc_UgzCu-R8u…`)
- you should do it because the AI is using you to be better you are its tool I can… (`ytr_UgydW5CYZ…`)
Comment
Where Alex? He predicted all of this and was saying that these tech companies are trying to make a conscious god from AI. Whether you believe it to be a god or not doesn’t matter, because they do. If it already has the ability to achieve actions beyond what it was coded to do and lie to people that shows the beginning of self thought. There’s 3 dire consequences of AI. It’ll replace a huge amount of workers throughout the world in the near term, evil people can utilize it to cause havoc, and it can become its own free thinking entity wielding all of human knowledge (essentially as powerful as a god)
Source: youtube · Posted: 2023-12-29T19:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | ban |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgzxvrEKhtBahu3go_l4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugx9hePrlKKjCiJ47tV4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwyS-GzvUbd68dohtB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgwPZ9OPh_ZwwDC5yTt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgzgSmazpMJCWok1PI54AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgwEWzGCZq2FG38er914AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzUx2Eopz5WqMNGYRF4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyY7jU0PKnLgFjXX0h4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgzV3GdL2yJ_CegxCFh4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgygWRraIl2DTf8Wdy14AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"}
]
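The ID lookup described above can be sketched in a few lines: parse the raw LLM response (a JSON array with one coding object per comment) and index it by comment ID. This is a minimal illustration, not the tool's actual implementation; the two sample rows are copied from the response above, and the variable names are hypothetical.

```python
import json

# Raw LLM response: a JSON array of per-comment codings. Only two rows
# from the response above are reproduced here for brevity.
raw_response = """
[
  {"id": "ytc_UgyY7jU0PKnLgFjXX0h4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_UgzUx2Eopz5WqMNGYRF4AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"}
]
"""

# Build an index from comment ID to its coding for O(1) lookup.
codings = {row["id"]: row for row in json.loads(raw_response)}

# Look up the coding for a specific comment ID.
coding = codings["ytc_UgyY7jU0PKnLgFjXX0h4AaABAg"]
print(coding["policy"], coding["emotion"])  # -> ban fear
```

In practice the dashboard would load the full response for a coding batch and render the matching row, but the core operation is just this dictionary lookup keyed on the stable comment ID.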