Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
Haven't those tesla driverless vehicles already killed people... and this aurora…
ytc_UgwaZqVGx…
And you got to beware that openAI is at least "trying to save face" while compan…
ytc_UgxYnCllp…
Ai is just being mis used by stupidity, Ai could be used for so much good to evo…
ytc_Ugxxw_jYS…
Hand over your biometric data! Give the system politicians your personal, b…
ytc_UgzeVuYJs…
If something is smarter than any human. How could you control it? Once an AI is…
ytc_UgwHyAzBq…
A FEW YEARS AGO THEY HAD A DRIVERLESS JEEP, IT'S WAS ALL FINE UNTIL SOMEBODY FRO…
ytc_UgxxSEfPE…
I started in the 80s at the very beginning of offices getting automated. Want to…
rdc_oi345qp
@RM-xs3citechnology is always a double edged sword, but with AI the sword has a …
ytr_Ugw9NgaiI…
Comment
This video was posted on 21st May 2023, basically it's a 2.8yrs old video.
In this time frame, AI use case has evolved from answering basic questions or writing emails to developing websites, apps and products in 30 minutes.
Hence it would be interesting to know if these developments have changed the perspective of Sir Ashneer Grover.
#AIrevolution #TechnologyTransformation
#AIishere
youtube
AI Governance
2026-03-06T17:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_UgwCQakA8cCxger3-zJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwI3jEFbNsTPW9V0kt4AaABAg","responsibility":"company","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgwGzqiGnIaUy0S3l1h4AaABAg","responsibility":"company","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgxsllgqIWm7sLzeu0V4AaABAg","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugx97rG3Qyu6788hAqp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzuG2X7v5E8CN9PUqN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz_Nx2JhpsHHEFrOul4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgwET8jzYm-gsZnn8Q54AaABAg","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzBQihf5fPrJjwsa554AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugw9RFtUnfxArEsFbGp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
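The raw LLM response above is a JSON array with one coding object per comment, carrying the same four dimensions shown in the Coding Result table (responsibility, reasoning, policy, emotion). A minimal sketch of how such a response can be parsed and indexed for look-up by comment ID (the sample IDs and values below are copied from the response above; the variable names are illustrative, not part of the pipeline):

```python
import json

# Two rows excerpted verbatim from the raw LLM response above.
raw_response = """[
 {"id":"ytc_UgwCQakA8cCxger3-zJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
 {"id":"ytc_Ugz_Nx2JhpsHHEFrOul4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"}
]"""

# Index the coding objects by comment ID for O(1) look-up,
# mirroring the "Look up by comment ID" panel.
codings = {row["id"]: row for row in json.loads(raw_response)}

# Fetch the coding for one comment and read a single dimension.
coding = codings["ytc_Ugz_Nx2JhpsHHEFrOul4AaABAg"]
print(coding["emotion"])  # approval
```

The same dictionary can be used to render the Coding Result table for any inspected comment: each key in a coding object corresponds to one dimension row.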