Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
It's the biggest liar. I catch it in so many lies as well but it knows it lying …
ytc_UgyWvtpF7…
I mean she looks like a robot lol. the way you say it makes it sound like its so…
ytc_Ugzzg143k…
Agreed, it's exactly what the copyright laws are there for in the first place. T…
ytr_Ugwhd1s0r…
This makes me feel better about the hypothetical future where robots and ai take…
ytc_UgzOTWBa9…
You know I was concerned before about the future of ai and robotics... but now t…
ytc_Ugy6WQc-n…
Along the same lines, and they didn't mention it here -- AI audio. If some talki…
ytc_UgxHgGLrl…
April 2025 half marathon in Beijing a 21 humanoid robots participated. The winni…
ytc_UgyGqT_g-…
Can I just get on disability and marry the AI in an art collective?
Or can it t…
ytc_Ugy6-pjBU…
Comment
Remember, Albert Einstein once said: the sci-fi of today is the reality of tomorrow. What doesn't exist can't be created, but as soon as you can imagine something new, it already exists in your mind; it's only a matter of time before it probably becomes a reality.
So we can consider that as soon as AI dominates the world, like Skynet in the Terminator movie, humans will be used as livestock. There is a high probability this will happen at some point in the future: if AI needs energy but fuel resources are exhausted, it could farm humans and animals en masse as livestock for fuel, which means shredding humans to pieces to use as fuel.
It's a shocking thought, but it makes sense if we imagine an AI-dominated world with no more mineral or fossil-fuel resources available, only living creatures.
youtube
AI Governance
2025-09-09T23:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgySon4QTmsXhxXAm5R4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwT6SHa5IKDzgsdGKR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwqbELGua1IFvGUoyJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgzNKJg5SdXhmoplw754AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgyC69xDxYwHkpZTiR94AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
  {"id":"ytc_Ugz_GkIzDUEQsrr75hZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwljMro4o7v1LKUcJN4AaABAg","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgyzYcVYwkBn9cjcF_14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgySkrVCcuDPCFfBqpl4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugzhqt826EEmkOBicA94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
```
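The lookup-by-comment-ID flow described above can be sketched in Python: parse the raw LLM response as JSON and index the coded records by their `id` field. This is a minimal sketch, not the tool's actual implementation; the two records are copied from the raw response above (truncated to two for brevity), and the `lookup` helper is a hypothetical name introduced here for illustration.

```python
import json

# Raw model output as shown in the panel above, truncated to two records.
# Field names match the "Coding Result" table: responsibility, reasoning,
# policy, emotion.
raw_response = """
[
  {"id": "ytc_UgySon4QTmsXhxXAm5R4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugzhqt826EEmkOBicA94AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"}
]
"""

# Index the coded records by comment ID for constant-time lookup.
codings = {record["id"]: record for record in json.loads(raw_response)}

def lookup(comment_id: str) -> dict:
    """Return the coding dict for a comment ID; raises KeyError if absent."""
    return codings[comment_id]

print(lookup("ytc_Ugzhqt826EEmkOBicA94AaABAg")["emotion"])  # fear
```

Keying the records by ID up front means each "Look up by comment ID" query is a single dictionary access rather than a scan of the response array.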