Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- "Maybe someone needs to start a new platform separate from YouTube. I hate, desp…" (ytc_UgzM9F1C5…)
- "how does chatgpt not know this. Atoms dont touch so you dont have to get to 0…" (ytc_UgynMYXVk…)
- "> So black people didn't reoffend at a higher rate, yet the AI still develope…" (rdc_e7ipl28)
- "They laid off their employees just for green report on every year just covering …" (ytc_UgxCk_199…)
- "what jobs should they get? that are actually accessible to them and that aren't …" (ytr_UgzxF5BD_…)
- "Hold on, Elon is proposing socialism where we ALL get a free life with everythin…" (ytc_UgyC035gO…)
- "\*Watching Biden give a speech about unifying the country, and all I could think…" (rdc_gbkxn0d)
- "So even te FSD can be an a-hole driver. Congrats. We have taught AI to be irrita…" (ytc_UgyZUg9at…)
Comment
I can remember rule based system from 20 years ago. When I worked some tech support for Logitech. I started to learn Python 4 or 5 years ago to step things up a little with models and projections. So far I've gotten Python to do some nice API processing but no great breakthroughs in AI yet. I check my enthusiasm regarding ChatGPT. It might be convenient as a research assistant tool Where I might traverse 6 website to find a code language bug solution it could cover the content on those sites faster. Not certain that it will see enough to suggest an intuitive solution now. That requires far more reasoning. AI has to be taught right now and I think we give it a bit too much credit with the wishful or agenda driven hype.
| Field | Value |
|---|---|
| Platform | youtube |
| Topic | AI Governance |
| Posted | 2024-02-16T04:3… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | approval |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_UgzXNVDFcUAsu0IC5Ah4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxT3F1DRjpzMNCg9nJ4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxtGXPL5qOTbKzhesZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzpKR_DITG-pg0EUU14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwaFChjOx-zFkmWzNh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_Ugxan297aCmYJ--6jXp4AaABAg","responsibility":"user","reasoning":"deontological","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugy-bY5_4lD_Sgua4p14AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyPErZbKQaTt5Gmn3F4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgwmX9ujS4XSKX0zOiZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_Ugw1RjIX7h-gE4jTMWN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
```
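Each raw LLM response is a JSON array of per-comment codes, one object per comment ID. A minimal sketch of how such a batch might be parsed and validated before it populates the coding-result table above (the allowed category sets here are inferred from the sample values on this page, not from a published codebook, and the schema dictionary is an assumption):

```python
import json

# Allowed values per dimension, inferred from the sample output above.
# Assumption: the real codebook likely defines more categories than appear here.
SCHEMA = {
    "responsibility": {"none", "ai_itself", "user", "developer", "company"},
    "reasoning": {"unclear", "consequentialist", "deontological"},
    "policy": {"unclear", "none"},
    "emotion": {"indifference", "fear", "approval", "resignation"},
}

def validate_batch(raw: str) -> dict:
    """Parse a raw LLM response and index valid records by comment ID."""
    records = json.loads(raw)
    coded = {}
    for rec in records:
        comment_id = rec["id"]
        # Reject any record whose value falls outside the inferred schema.
        for dim, allowed in SCHEMA.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{comment_id}: bad {dim!r} value {rec.get(dim)!r}")
        coded[comment_id] = {dim: rec[dim] for dim in SCHEMA}
    return coded

# Hypothetical one-record batch, for illustration only.
raw = '[{"id":"ytc_X","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"}]'
coded = validate_batch(raw)
print(coded["ytc_X"]["emotion"])  # approval
```

Indexing by ID is what makes the "Look up by comment ID" view possible: once a batch is validated, fetching a comment's coded dimensions is a single dictionary lookup.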