Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
Ah yeah, the "elon musk fears" that drove him to create the most unhinged ai of …
ytc_Ugx8fB15w…
Worried about AI destroying humanity? Meanwhile, my phone still can't figure out…
ytc_UgwHScQ12…
Ai art is not nearly good enough to do certain types of art.
Anything with a set…
ytc_Ugy1DR3j4…
ive seen some people argue that photography is an art form and that people were …
ytc_Ugwy3bPrU…
This remind me of a robot in call of duty 3 or 4 when it blows up with knifes…
ytc_UgybSCsFK…
A.i so dangerous people running the country and scienceist are dangerous just tu…
ytc_Ugwn8lisR…
Yes, there will be so much following farflung motives, but one thing is Clear: S…
ytr_UgzdkgqaU…
With sora2 i was able to make some realy seeming real videos about old trains an…
ytc_UgxwvF4S0…
Comment
.... 100 years? No lol..
It will be closer to 10 and when it does it will exponentially explode over the next 90 years after.
AI is one of the most incredible things man will ever acheive because it will lead to 1 of 2 outcomes.. either we gain knowledge beyond our current understandings that we cant even currently imagine.. or it leads to our extinction.
youtube
AI Responsibility
2023-10-11T11:0…
♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
  {"id": "ytc_UgxWkq8cclRIh4be9y14AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_UgyiLyLd2WuSkBOsQFJ4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxrKuqJnUOkM_XULgZ4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugzzeg8slzaRVqZoJLt4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgyME5HnH6YmE2RYMe14AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgzUushRMJtd9yW_BDF4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugz90Yx9BYqxRpsdUHF4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_UgxcjFZcBnQQwaBoV1R4AaABAg", "responsibility": "developer", "reasoning": "mixed", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgzceNV_JI4vuOx1J_R4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugzqr_8nE5DH_bQM-2V4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"}
]
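The per-comment coding shown in the table above can be recovered from a raw batch response like this one by parsing the JSON array and indexing by comment ID. A minimal sketch (the `lookup_coding` helper is hypothetical, assuming only that the response is a JSON array of objects with an `id` field, as above):

```python
import json

def lookup_coding(raw_response: str, comment_id: str) -> dict:
    """Parse a raw batch-coding response (a JSON array, one object per
    comment) and return the coding record for a single comment ID."""
    records = json.loads(raw_response)
    by_id = {rec["id"]: rec for rec in records}
    if comment_id not in by_id:
        raise KeyError(f"no coding found for {comment_id}")
    return by_id[comment_id]

# Example with two records in the same shape as the raw response above.
raw = '''[
  {"id": "ytc_Ugz90Yx9BYqxRpsdUHF4AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_Ugzqr_8nE5DH_bQM-2V4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "none", "emotion": "approval"}
]'''

coding = lookup_coding(raw, "ytc_Ugz90Yx9BYqxRpsdUHF4AaABAg")
print(coding["responsibility"], coding["emotion"])  # → developer fear
```

Dictionary-based lookup keeps repeated inspections O(1) per comment once the response is parsed; a missing ID raises rather than silently returning nothing, which surfaces mismatches between the displayed comment and the model's output.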