Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "Me who already told meta ai to kick Zuckerberg out and make me the head so that …" (`ytc_Ugw9Z6aRu…`)
- "I think we the people need to see the reality of AI and how it is going to consu…" (`ytc_UgzzIJxPG…`)
- "Oh shit i was almost to distracted i cant have any watermelon anymore to keep fo…" (`ytr_UgxWPKMrZ…`)
- "I’m siding Red suit abit on this but I know his buisiness is based on this… he c…" (`ytc_UgxHaZp3j…`)
- "yep i updated my os on the phone and all the ai trash made it slow as fuck.…" (`ytc_UgzLuMWtd…`)
- "Honestly been crying over family stuff all week and didn't get why I felt so uns…" (`ytc_Ugw93Cr5z…`)
- "I don't think it's a bad idea for A.I. to replace us for certain jobs, but there…" (`ytc_UgxbYcfJH…`)
- "If he wasnt trying to sell it or make it seem as real art this would just be pat…" (`ytc_Ugyr6stdd…`)
Comment
In my opinion I don't think AI and robots ever become a threat to earth, invade and take over the earth cause they work according to their programming, coding algorithms, and etc. Until the creator of them done a little mistake in it which results this situation just like in avengers age of ultron because of Tony Stark's little mistakes Ultron become a evil and on the other hand JARVIS is always work as Tony Stark's friend and never become bad cause there is no mistakes done by Tony Stark in creating JARVIS
Source: youtube · AI Harm Incident · 2024-08-20T19:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id": "ytc_Ugy34klxT7qaXmwDK4N4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugxm6vaTmk4UnB7WdUF4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxxY319RuO-1-m5N8F4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgxanDZc647o3jZae214AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgwbfxZso8wCaG5i_HR4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgyUJRbEPUJYK8pPoHp4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_Ugwh2Px6SsZ8bjP9HON4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugz6n6lEiAxqfxaxUMh4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwwqIvAQhB-nb3gval4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzdnYtv8uFe_4wVCjd4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "mixed"}
]
```
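The lookup-by-ID step can be sketched in a few lines of Python, assuming the raw response keeps the array-of-objects shape shown above (the two example rows are copied from that output; the variable names are illustrative, not part of any tool):

```python
import json

# Raw model output, as shown above: a JSON array of per-comment codings.
raw_response = '''[
  {"id": "ytc_UgwbfxZso8wCaG5i_HR4AaABAg",
   "responsibility": "developer", "reasoning": "consequentialist",
   "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxxY319RuO-1-m5N8F4AaABAg",
   "responsibility": "ai_itself", "reasoning": "consequentialist",
   "policy": "none", "emotion": "mixed"}
]'''

# Index the rows by comment ID so any coded comment can be inspected directly.
codings = {row["id"]: row for row in json.loads(raw_response)}

coding = codings["ytc_UgwbfxZso8wCaG5i_HR4AaABAg"]
print(coding["responsibility"], coding["emotion"])  # developer indifference
```

The dict index makes each lookup a constant-time operation, which matters when a batch response codes many comments at once.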