Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- I see a world where the rich develope enough automation to completely cut us out… (ytc_UgxBV5X_z…)
- "Sir Roger speaks truth—but the real shockwave is coming. While AI may not yet b… (ytc_UgwsU6mIb…)
- I'd much rather be happy with maybe half a artwork or just one instead of 15, 20… (ytc_Ugw-nakG-…)
- 200 years ago in the USA 70 percent of workers were farmers. Today that number i… (ytc_UgxNPom_L…)
- This guy is the classic scholar that has little to no clue how the real world wo… (ytc_UgyBePj6p…)
- @JustDaniel6764 Yes I have, workable self driving vehicles are a long way off du… (ytr_UgxzsqrtQ…)
- Human eyes inherently have superior dynamic range when compared to even the best… (ytc_UgwNv2xqN…)
- AI images would be way more interesting if they spat out credits to the closest … (ytc_UgzEYW_fn…)
Comment

> 10:12 Perhaps Larry said "Specious" which meant Musk was plausible but wrong in his interpretation and conclusions. Were they both High when they were conversing with each other? No need to panic about AI, there is a drug that simulates feeling and emotion that has been developed for the Supreme Being AI and it soon will be addicted to it like Opium in Rats. Then, you can take back your Humanity...or whats left of it.

| Field | Value |
|---|---|
| Platform | youtube |
| Topic | AI Governance |
| Posted | 2023-04-18T12:3… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id": "ytc_Ugzjp48MU4aVTY7qgU14AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgygyAcocGEuLw5bpIR4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgwnF99UcYHoSM-CMux4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgyWiXWVGDTw3wbDjcR4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "resignation"},
  {"id": "ytc_UgxJiH-PJKGLw3Xed-Z4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgwIdLO3GyQMrSTsIQF4AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgxzhcbsLYO_8Ac7Hp54AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_Ugy2_ImFW9MgK1NE2qZ4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgyTz8O2Hk8MWNxqbdl4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgxTiA8xhvhcwTElNpl4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"}
]
```
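The lookup-by-comment-ID step can be sketched as follows. This is a minimal sketch, assuming the raw model output is a JSON array of coding objects with the keys shown above (`id`, `responsibility`, `reasoning`, `policy`, `emotion`); the function and variable names are illustrative, not part of the actual tool.

```python
import json

# Raw model output: a JSON array of codings, one object per comment ID.
# Shortened here to a single entry copied from the example response above.
raw_response = '''[
  {"id": "ytc_UgwnF99UcYHoSM-CMux4AaABAg",
   "responsibility": "unclear", "reasoning": "unclear",
   "policy": "none", "emotion": "fear"}
]'''

def index_by_id(raw: str) -> dict:
    """Parse a raw LLM response and index the codings by comment ID."""
    codings = json.loads(raw)
    return {coding["id"]: coding for coding in codings}

# Look up the coding for one comment by its ID.
lookup = index_by_id(raw_response)
coding = lookup["ytc_UgwnF99UcYHoSM-CMux4AaABAg"]
print(coding["emotion"])  # fear
```

A dict keyed by ID makes each lookup O(1), which matters once the response covers a full batch of comments rather than the ten shown here.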