Raw LLM Responses
Inspect the exact model output for any coded comment, either by looking it up by comment ID or by picking one of the random samples below.
Random samples
- "It baffles me that Dr. Tyson is so nonchalant about the risks of runaway AI. Nuc…" (ytc_UgyNiygHM…)
- "time for the FCC to make them declare "AI generated" throughout the video or au…" (ytc_UgwweQlqn…)
- "Very interesting. The more I learn about AI the more I am interested and scared…" (ytc_Ugw967ED6…)
- "it will end up on best statistical chose by Ai and we all know human are not uni…" (ytc_Ugw754Q5A…)
- "Guys, chatgpt is kinda like calculator 2. It will do the same as what a calculat…" (ytc_UgxdKo-84…)
- "Gona take a long time for ai to have self awareness, that's some crazy code lol…" (ytc_Ugyc7VJdN…)
- "Who else expected the robot to get out of the car and say, WTF man😂…" (ytc_Ugwoi3ej2…)
- "@AliceB0 but is a certain mathematical function that simulates what a human woul…" (ytr_UgyppJggI…)
Comment
Humans are really stupid self destructive creatures 👽.
When the first atom bomb was tested there was genuine concern that it might ignite the Earth's atmosphere & destroy everything...yet they still did it, to see what would happen!
That's the same mentality of many Darwin Award winners!
Basically we are fucked, not today, not tomorrow, but in the near future unless we grow up & be responsible!
Let's build AI soldiers & call them Terminators that'll be fun, 'cause I'm bored...
Oh, lets see what happens if we release a virus from a lab...etc.
Source: youtube · 2023-06-07T02:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | virtue |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_Ugxmxwgvt91gbC43dLt4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzJ8n76OrW5CTzEjWp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugyu6l1mriYMnZTzfCd4AaABAg","responsibility":"company","reasoning":"virtue","policy":"liability","emotion":"fear"},
{"id":"ytc_UgzIgB1wyrfg5G_ZQLt4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugw08GnI_Wd3TOyT9rN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwKB5FbrgWxlq__YOF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzRaxQe7SjJH4GboEh4AaABAg","responsibility":"user","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwGjjJntUZ-xgI_XFh4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"industry_self","emotion":"resignation"},
{"id":"ytc_Ugxae63J9N8zPkilH8R4AaABAg","responsibility":"company","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgylB1EUP2AYf7-EK9p4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"fear"}
]
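A batch response like the one above can be sanity-checked before the codes are stored. The sketch below parses the raw JSON and rejects records with unknown dimension values; the allowed category sets are inferred only from the responses shown on this page, not from a documented codebook, so the real vocabulary may be larger.

```python
import json

# Allowed values per dimension, inferred from the sample output above.
# Assumption: the actual codebook may contain additional categories.
ALLOWED = {
    "responsibility": {"company", "user", "ai_itself", "unclear"},
    "reasoning": {"deontological", "consequentialist", "virtue", "unclear"},
    "policy": {"regulate", "ban", "liability", "industry_self", "none", "unclear"},
    "emotion": {"fear", "outrage", "resignation", "unclear"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM batch response and reject malformed records."""
    records = json.loads(raw)
    for rec in records:
        # IDs in the samples start with ytc_ (comments) or ytr_ (replies).
        if not rec.get("id", "").startswith(("ytc_", "ytr_")):
            raise ValueError(f"unexpected comment id: {rec.get('id')!r}")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim}={rec.get(dim)!r}")
    return records
```

Running the LLM output through a check like this catches the common failure modes of structured-output prompting (truncated JSON, invented category labels) before they silently enter the coded dataset.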