Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "You raise some important points about the capabilities and potential risks of AI…" — ytr_Ugw5uSiOo…
- "maybe a little useless, but you can put a cuss into a google search and the AI s…" — ytr_UgynClDBC…
- "As someone who uses ai generation frequently. I'm also all for artists finding w…" — ytc_Ugwux3HQw…
- "Who do you think is going to setup and manage all these AI systems? AI? huh??? Y…" — ytc_Ugyn6GxZw…
- "Nightshade works for images, but not for videos yet. AI is also capable to gener…" — ytc_UgyjFba1-…
- "From another source: [https://arstechnica.com/cars/2019/11/how-terrible-soft…" — rdc_f6y669s
- "Lol a “flawed criminal system.” You know what, you’re correct. It is flawed. A m…" — ytc_Ugx-UI87B…
- "My convo with AI: The cadences is my criteria to separate real compose…" — ytc_Ugw_SF7j7…
Comment

> If AI continues to be developed without restraints, the only place people will be able to live is in the sewers where robots wouldn't have the connection they need with their mainframes. After sometime the computers would hopefully power down due to powerplants needing to shut down without human interaction. Of course I'm sure someone wants to put AI in powerplants so they can just run themselves. Sooner or later they would run out of energy, even if humans were to no longer exist. It is said the next world war will take place underground, maybe it will be against AI robots trying to wipe out humanity.

youtube · AI Governance · 2024-01-07T20:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response

```json
[
  {"id":"ytc_UgyhqNmdunkDWJN4fZd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxMRTaTlyFyqq_eY0F4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugz1HKWG28oRUlV4CYR4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwKgVJw380hnrNEWR54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxfqM24agmpqVZXzX14AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwcP50dJFw2gDGPJlJ4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugzaz156DojfXlmKMwt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugz7peltPi6Jxh_qjbZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxGvaSFcPwB1zUDMrF4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgyQmdQG5IFLlz5ctZ14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"}
]
```
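The look-up-by-comment-ID workflow can be sketched in a few lines of Python: parse the raw JSON array the model returned and index the records by their `id` field. This is a hedged illustration using the field names visible in the response above, not the tool's actual code; the two embedded records are copied from that response.

```python
import json

# Two records copied from the raw batch-coding response shown above.
raw_response = """
[
  {"id":"ytc_Ugz1HKWG28oRUlV4CYR4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxGvaSFcPwB1zUDMrF4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"fear"}
]
"""

records = json.loads(raw_response)

# Index the coded records by comment ID for O(1) lookup.
by_id = {rec["id"]: rec for rec in records}

# Retrieve the coding for one comment.
coding = by_id["ytc_Ugz1HKWG28oRUlV4CYR4AaABAg"]
print(coding["policy"], coding["emotion"])  # -> regulate fear
```

Each dimension in the Coding Result table (Responsibility, Reasoning, Policy, Emotion) corresponds to one key of the matching JSON record.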