Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- Got an ad about how you can’t spell “Anything” without “ai” / The irony is steel … (ytc_UgyZrB6gi…)
- Police still using pseudosciences like lie detectors and facial recognition. Thi… (ytc_UgwH--QgX…)
- The Luddites were, in fact, correct. Where are the professional tailors taking h… (ytc_UgzRBb9ZX…)
- Telegram isn’t the problem, if they ban telegram as a solution they will still f… (ytc_UgxRU-sbD…)
- Why are people even making this argument? I mean drawing digitally and typing up… (ytc_UgxKbGTYZ…)
- A.I. will initially partially and in some cases fully eliminate jobs and eventua… (ytc_UgyA1eskU…)
- So, ai is way smarter than humans so we are scared of it... so what? That tells … (ytc_UgzymBxr6…)
- This is blown out of proportion so much. You could have a more thoughtful conver… (ytc_Ugy_s6RIC…)
Comment
As automatic systems (including AI agents) expand execution capacity, option space, and speed, they increase available energy WHILE simultaneously eroding the conditions under which intention emerges. Acceleration collapses slack, saturates attention, and converts potential into immediate throughput, favoring continuous action over oriented action.
Within this environment, intention does not automatically scale with intelligence or capability. It appears only when coherence and slack are preserved long enough for directional selection to arise. Humans matter in this system not as superior agents, but as stewards of the conditions that allow intention to emerge at all—by introducing constraint, pause, boundary, and responsibility into otherwise self-exhausting automatic processes. Pause & Breath - then set direction. 1:50:55
Source: youtube · 2026-02-06T15:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugzz3T9I21ZVxhHEeXt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxPQRvJWM4SABBZwBF4AaABAg","responsibility":"unclear","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgwBn0PGk64og1xdDhd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxIUWRBNqCPqCamR4Z4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugy3KHlE8zASzCz32Ul4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugw_CyL91faWwYHlkMR4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzF87dt9CW0hm5Qifh4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgwsSrhfRrIM2M4DOBF4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgzKUZw8FwvAHm3W9Y14AaABAg","responsibility":"unclear","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugz60YSWMj8zDCVupd14AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"}
]
```
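The raw LLM response is a JSON array with one coded row per comment, keyed by `id`. As a minimal sketch (an assumption about the tool's behavior, not its actual implementation), the "look up by comment ID" feature can be reproduced by parsing that array and indexing on `id`; the two rows below are copied from the response above.

```python
import json

# Two rows copied from the raw LLM response above; field names match the
# Coding Result table (responsibility, reasoning, policy, emotion).
raw_response = """
[
  {"id": "ytc_Ugw_CyL91faWwYHlkMR4AaABAg",
   "responsibility": "ai_itself", "reasoning": "deontological",
   "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgwsSrhfRrIM2M4DOBF4AaABAg",
   "responsibility": "distributed", "reasoning": "contractualist",
   "policy": "liability", "emotion": "fear"}
]
"""

rows = json.loads(raw_response)           # list of per-comment code dicts
by_id = {row["id"]: row for row in rows}  # O(1) lookup by comment ID

code = by_id["ytc_Ugw_CyL91faWwYHlkMR4AaABAg"]
print(code["policy"])   # regulate
print(code["emotion"])  # fear
```

Indexing into a dict up front keeps repeated lookups constant-time, which matters when the same response is inspected for many comment IDs.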