Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
We have robots. They need a power source for energy, mechanical operations to repair, and a computer program to operate, and dedicated machinery to make more robots. Flash forward 1 million years.....
We have robots. Self-sustaining energy using organic based products. Self-healing for repairs. AI free will to make decisions on their own almost instantaneously. Ability to reproduce without outside intervention. I give you.....
HUMANS
youtube
AI Governance
2023-10-31T12:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_UgzNkXg5fpJpeUimq8t4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgwcVdy2m8VeYZQw_bd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy5Jryl9H2IMOF3tbB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzF_Eq3ZEQf8CZdkb54AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugx5TomIZ68iXgjVldp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyNiI3xyKe5jEVQK9t4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzBTl4y3-nmZ9pWYRN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgymMjrw2A6vdouKczh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
{"id":"ytc_Ugwrdn4OOJnuZ5gDZWJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgzGqNeYT7sqeqLTe4N4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"mixed"}
]
```
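A response like the one above is a JSON array of per-comment codes, one object per comment ID, with four categorical dimensions (responsibility, reasoning, policy, emotion). A minimal sketch of how such a batch might be parsed and validated is below; the allowed values are inferred only from the codes visible on this page, so the real codebook may define additional categories:

```python
import json

# Allowed values per dimension, inferred from the outputs shown above
# (hypothetical: the actual codebook may include further categories).
SCHEMA = {
    "responsibility": {"none", "government", "ai_itself"},
    "reasoning": {"unclear", "consequentialist", "deontological"},
    "policy": {"unclear", "none", "regulate", "ban", "liability"},
    "emotion": {"approval", "indifference", "mixed", "fear",
                "outrage", "resignation"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only rows whose values fit the schema."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        # A row passes only if every coded dimension holds a known value.
        if all(row.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            valid.append(row)
    return valid

example = ('[{"id":"ytc_example","responsibility":"none",'
           '"reasoning":"unclear","policy":"unclear","emotion":"mixed"}]')
print(len(validate_batch(example)))  # prints 1
```

Filtering rather than raising keeps one malformed row from discarding a whole batch; rejected IDs could instead be queued for re-coding.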