Or maybe when usage is high they tweak a setting that uses the cache when it shouldn't.
For all we know they run whatever experiments they want, to demonstrate theoretically better margins, or to analyse user patterns when a performance drop occurs.
Given what is done in other industries that don't face an existential issue, it wouldn't surprise me if some whistleblowers tell us in a few years what's been going on.
In any case they have an agreement with copyright holders. Anna Archive does not.
IP laws are broken: they keep extending protection and do nothing to prevent distribution exclusivity.
I could sell a license to Bob, who can sell my art to you. But I won't give Alice a license to even enjoy the art for the fee I charged Bob, and I can tell Bob to do the same and not give it to her at any price. The law would say that's fine, and would even arrest Bob if he ever sells to Alice.
I guess this is a naive question, but where are the lobbies that care about the people? Or even common decency at this point? It really feels like people are treating the US less as an investment and more like a sinking ship to abandon. And they were the ones that shot the holes to begin with.
There aren't any; everyone's out for themselves, further diluting any soft power the masses had. When something like 99% of the population has no stock value attached to them, you might as well join the Mobile Infantry at this point.
It isn't US-specific, and what goes into people's heads is what propaganda wants it to be.
Economics tells people the clear impact, but it doesn't make the news. We get fed whatever the political influencers decide to tell us. Did the Brits expect the consequences of Brexit? No, but that's because they deemed whatever the news said on this topic to be unreliable. Which, for once, happened to be true.
Actually, it's pretty clear to those who have been paying attention that the main influence that swung the polls by about 4% from Remain to Leave in the last 2 days was a £10m advertising campaign on Facebook packed with lies, paid for by a Brexiteer who can't quite explain to the Electoral Commission where that money came from, but it did seem to just "appear" a few days after he allegedly met with the Russian ambassador to the UK.
I'd get out more, but my tinfoil hat doesn't like the rain.
I mean, there were plenty of what would be considered intelligent people who laid out the exact consequences of Brexit... the problem is the masses don't seem to pay attention to well-mannered, intelligent people.
Not my experience. All the frontier models I constantly test, agentic or not, produce code that's less maintainable than what my (very good) peers and I (on a decent day) write.
Plus they continue to introduce performance blunders.
Crying wolf: one day maybe there will be a wolf, and I may be the last of us to check whether that's true.
Also because it's a large PR. Also because the maintainer has better things to do than spend more time and energy reviewing it than the author spent writing it, only to find that multiple optimisations will be requested, which the author may not be able to take on.
The creator of llama.cpp can hardly be suspected of being reluctant about GenAI or biased against it.
Absolutely -- it's perfectly understandable. I wanted to be completely upfront about the AI usage, and while I was willing to break the PR down into parts (and did start to), it's totally OK for the maintainers to reject that too.
I wanted to see if Claude Code could port the HF / MLX implementation to llama.cpp and it was successful -- in my mind that's wild!
I also learned a ton about GPU programming and how omni models work, and refined my approach to planning large projects with automated end-to-end integration tests.
The PR was mostly to let people know about the code and weights, since there are quite a few comments requesting support:
You got to the crux of it. Redux became a trend, surfing on its popularity at a time when React wasn't providing the reactive piece it needed, plus the time-travel demo just amazed everyone. The author got his job at Facebook. It carried millions of developers into using that lib; the author even said it isn't necessarily the go-to mechanism, but hiring managers stuck with the idea that all projects needed Redux magicians, since all projects needed React.
As an anecdote, I remember my manager admitting we couldn't fix the legacy app, but we could put lipstick on the pig with React.
The settings icon ‘sprouting’ cogs is really nice!
The editor also looks really nice.
Could this not be used online as well? Persistence on the server instead of browser cache?
(Curious what your use case is for an offline browser based editor?)
The use case is privacy. Data getting harvested by free and even paid-for services isn't pleasant (targeted ads, data breaches, etc.).
If I get to add some "server" capability, it will rather be WebRTC, basically P2P sync between devices, or a config to plug in your own store, e.g. GitHub, Google Drive, Dropbox, or a self-hosted service to SCP the files to.
It isn't just browser-cached: one can export individual documents or the entire store as a zipped folder, and back that up.
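Roughly, the "plug in your own store" config could boil down to a tiny adapter interface. Here is a minimal sketch in TypeScript (hypothetical names, not the editor's actual API), with a plain HTTP backend as the example; a GitHub, Dropbox, or SCP-backed store would just be another implementation of the same shape:

    // Hypothetical store interface, assuming documents are markdown keyed by id.
    // Names are illustrative only, not the project's real API.
    interface DocumentStore {
      list(): Promise<string[]>;                  // document ids
      load(id: string): Promise<string>;          // raw markdown
      save(id: string, markdown: string): Promise<void>;
      remove(id: string): Promise<void>;
    }

    // Example backend: a self-hosted HTTP endpoint.
    class HttpStore implements DocumentStore {
      constructor(private baseUrl: string) {}

      async list(): Promise<string[]> {
        const res = await fetch(`${this.baseUrl}/docs`);
        return res.json();
      }
      async load(id: string): Promise<string> {
        const res = await fetch(`${this.baseUrl}/docs/${encodeURIComponent(id)}`);
        return res.text();
      }
      async save(id: string, markdown: string): Promise<void> {
        await fetch(`${this.baseUrl}/docs/${encodeURIComponent(id)}`, {
          method: "PUT",
          body: markdown,
        });
      }
      async remove(id: string): Promise<void> {
        await fetch(`${this.baseUrl}/docs/${encodeURIComponent(id)}`, {
          method: "DELETE",
        });
      }
    }

The editor would only ever talk to the interface, so the sync target stays a user choice rather than something baked in.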
If you are able to share the repo, that's great - thanks!
I am toying with the idea of having a very lightweight "idea capture" solution, so I can capture my raw ideas with the least friction... and then channel them into more organized projects / drafts / blogs etc. later.
I'm experimenting with tools like Obsidian+git, github.dev, Wispr Flow, etc. as input and storage channels... but a lightweight markdown-style note editor would probably be a useful addition. Need to experiment to find out for sure though.