Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (click to inspect):

- "This guy is a clown go listen to mo gawdat he will give you fucking nightmares h…" (ytc_UgyHsiTTk…)
- "All respect to Sir Hinton but he’s a victim of his own echo chamber as well. If …" (ytc_UgwdTkvmj…)
- "Your job will be replaced by ai. Tbus us why system ownership is important .…" (ytc_Ugz3gZOat…)
- "It all depends… If you come back in 50 years, then sure, AI has probably taken o…" (ytr_UgxHjeo3H…)
- "Now imagine if it did that in the middle of an intersection. Instead of Waymo it…" (ytc_UgyJXDlse…)
- "Pay attention to what Hyundai Motor Group is doing. They own Boston Dynamics tha…" (ytc_Ugw21t18q…)
- "Someone right now is using AI to design and build a new Ai for the purpose of bu…" (ytc_UgzMiiZA9…)
- "There is gonna be A LOT of LONELY FEMINISTS living in their " mini van apartment…" (ytc_UgwNXjAnW…)
Comment
What will really happen with AGI?
Yes – it will arrive. But we’ll also learn that Shakespeare- and Mozart-level creativity requires more than logic. It needs senses. And conscious feedback.
So what then?
Open-source AGI models like DeepSeek R1 already exist. Some developers still have a heart and plan to build nonprofit AGI. So governments may soon own AGIs – not Big Tech.
When money becomes obsolete, the state will simply provide food, homes, and care through AGI-managed robots – funded by taxes, not companies.
So no, the danger isn’t starvation.
The real risk?
• Losing meaning.
• Losing the function of money.
• And watching bad actors use AI to create viruses and chemical weapons.
This isn’t sci-fi. It’s the most likely timeline no one wants to talk about.
Platform: youtube · 2025-06-14T22:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | mixed |
| Policy | regulate |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyUt1QlrOlD8B-S-cJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzXpg_oaUOWL0qRFEB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugz1QvyzIqc1C8U5abF4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxNCudP3niZj_mE1KZ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgxOKbimEb5QtWWuvPJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_Ugz767971C75J4RQINx4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwAH6_bP1rIxHQjvOx4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgwHLfXY9UrYUEzl4rZ4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwOPLX8FVP2R0wMsiF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgzYcd-QU_DH_Uldx754AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"}
]
```
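A raw response like the one above has to be parsed and validated before the per-comment codes can be looked up by ID. As a minimal sketch, the helper below parses such a JSON array into a `{comment_id: codes}` mapping, dropping any entry whose values fall outside the codebook. The allowed value sets are inferred from the coded samples on this page; the real codebook may define additional categories, so treat `ALLOWED` as an assumption.

```python
import json

# Allowed values per dimension, inferred from the coded samples shown above.
# NOTE: this is an assumption -- the actual codebook may include more categories.
ALLOWED = {
    "responsibility": {"company", "developer", "user", "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed"},
    "policy": {"none", "unclear", "regulate", "liability"},
    "emotion": {"fear", "approval", "outrage", "resignation", "indifference", "mixed"},
}

def parse_llm_response(raw: str) -> dict:
    """Parse a raw LLM coding response (a JSON array of per-comment objects)
    into {comment_id: {dimension: value}}, skipping entries with a missing
    id or with any value outside the codebook."""
    coded = {}
    for row in json.loads(raw):
        cid = row.get("id")
        if not cid:
            continue
        codes = {dim: row.get(dim) for dim in ALLOWED}
        if all(codes[dim] in ALLOWED[dim] for dim in ALLOWED):
            coded[cid] = codes
    return coded
```

With the mapping in hand, looking up a single coded comment by its ID (as the page's "Look up by comment ID" feature does) is a plain dictionary access, e.g. `parse_llm_response(raw)["ytc_UgwAH6_bP1rIxHQjvOx4AaABAg"]`.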