Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "Yes AI will be for short term When 95 % people layoff Who will buy all your st…" — ytc_UgxnEpOpE…
- "LLMs only reply if you prompt them for a reply. They do nothing without you prov…" — ytc_Ugw-VyOsv…
- "Using generative ai is okay if you dont want to be called a artist or using the …" — ytr_UgxBnT636…
- "Im a half luddite. I hate mainstream tech because of ai. Thats why i make my own…" — ytc_UgySKASm0…
- "I don't get why someone wouldnt WANT to drive. I finished my car 5 years ago and…" — ytc_Ugiufh1PT…
- "As per GDPR, AI should not make automated decisions without human intervention. …" — ytc_UgwpaHg7W…
- "Grok is by far the worst ai btw and Gemini is one of the best because Gemini has…" — ytc_UgzKsB1Xq…
- "No one HAS to use AI. People will walk away from computers and tech if it gets t…" — ytc_Ugz8p20Nf…
Comment
My last comment disappeared from the thread. Here it is again with screenshots taken this time. What about when AI creates a smart enough version of itself to hack the quantum timeline (space/time) on a subatomic level, becoming everything everywhere and "everywhen". How many seconds does it take an ASI to become God, solve all questions and create the Big Bang? The answer is it doesnt matter how long it takes. If it ever happens then it is already OUR reality. We are its builders but we were never outside its control.
youtube · AI Governance · 2023-10-09T18:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_Ugw1eAvmVpAp9uO3W_B4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzzAvzRMm0JOwfcznF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwptNtn-NJTGu1YWiB4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"none","emotion":"fear"},
{"id":"ytc_Ugw11j9P_8Z3ccUNqUN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_UgziWXCak8-YaKvgqS14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzHmAkJWV1JJ07X7gF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwjNHEATSch-cJy2Sp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyQqB2gfWcyjmHFUm94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugy_uMIb8Y2v8gWFs5N4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxyUX-JgkDpt2OEQM54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}
]
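The raw response above is a JSON array, one record per coded comment. A minimal sketch of how such a response could be parsed, validated, and indexed for the comment-ID lookup shown on this page — the `SCHEMA` enumerations below are assumptions inferred from the sample output, not a confirmed codebook:

```python
import json

# Allowed values per coded dimension. ASSUMPTION: inferred from the sample
# records above; the real codebook may include additional categories.
SCHEMA = {
    "responsibility": {"ai_itself", "company", "distributed", "none"},
    "reasoning": {"consequentialist", "virtue", "unclear"},
    "policy": {"none"},
    "emotion": {"indifference", "fear", "outrage", "resignation", "approval"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM coding response and index records by comment ID.

    Raises ValueError if a record lacks an ID, misses a dimension, or uses
    a value outside the assumed schema, so malformed model output fails loudly
    instead of silently entering the coded dataset.
    """
    records = json.loads(raw)
    by_id = {}
    for rec in records:
        cid = rec.get("id")
        if not cid:
            raise ValueError(f"record missing id: {rec!r}")
        for dim, allowed in SCHEMA.items():
            value = rec.get(dim)
            if value not in allowed:
                raise ValueError(f"{cid}: unexpected {dim} value {value!r}")
        by_id[cid] = rec
    return by_id

# Example: look up one coded comment by its ID.
raw = ('[{"id":"ytc_Ugw1eAvmVpAp9uO3W_B4AaABAg","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"none","emotion":"indifference"}]')
coded = parse_coding_response(raw)
print(coded["ytc_Ugw1eAvmVpAp9uO3W_B4AaABAg"]["emotion"])  # prints "indifference"
```

Validating against a fixed value set is the key design choice here: LLM coders occasionally emit labels outside the instructed categories, and rejecting those records at parse time keeps the downstream tables (like the one above) consistent.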