That's my experience as well, after monitoring frequency and temperature on lots of kernels across the whole spectrum from memory-bound to L2-bound to compute-bound. It's hard to reach 600W with a memory-bound kernel. TensorRT manages it somehow with some small-to-mid networks, but the perf increase seems capped around 10% there too, even with all the magic inside.
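For reference, a minimal sketch of the kind of monitoring loop I mean, using the pynvml bindings (the sampling count and interval are just illustrative; run it alongside the kernel):

```python
import time
import pynvml  # pip install nvidia-ml-py

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)

# Sample power draw, SM clock, and temperature while the kernel runs.
for _ in range(100):
    power_w = pynvml.nvmlDeviceGetPowerUsage(handle) / 1000.0  # mW -> W
    sm_mhz = pynvml.nvmlDeviceGetClockInfo(handle, pynvml.NVML_CLOCK_SM)
    temp_c = pynvml.nvmlDeviceGetTemperature(handle, pynvml.NVML_TEMPERATURE_GPU)
    print(f"{power_w:7.1f} W  {sm_mhz:5d} MHz  {temp_c:3d} C")
    time.sleep(0.1)

pynvml.nvmlShutdown()
```

Memory-bound kernels tend to sit well below the power cap in a trace like this, while compute-bound ones pin it.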
> Every ticket I ever filed was auto-closed for inactivity. Complete waste of time. I won't bother filing bugs again.
Upcoming Anthropic Press Release: By using Claude to direct users to existing bug reports, we have reduced tickets requiring direct action by xx% and even reduced the rate of incoming tickets.
This doesn't work in the age of AI, where producing crappy results is much cheaper than verifying them. As long as that's the case, metadata will be important for deciding whether you should even bother verifying the results.
Unfortunately, the index is the easy part. Transforming user input into a series of tokens, which get used to rank possible matches and return the top N based on likely relevance, is the hard part, and I'm afraid this doesn't appear to do an acceptable job with any of the queries I tested.
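To make that concrete, here's a toy sketch of the query-side pipeline I mean (tokenize, score, return top N). The TF-IDF scoring and all the names are mine, not the project's, and a real engine would add stemming, stopwords, synonym expansion, and typo tolerance on top:

```python
import math
import re
from collections import Counter

def tokenize(text: str) -> list[str]:
    # Naive tokenizer: lowercase, split on non-alphanumerics.
    return re.findall(r"[a-z0-9]+", text.lower())

def rank(query: str, docs: dict[str, str], top_n: int = 10):
    # Score documents by summed TF-IDF over shared tokens; a crude
    # stand-in for the relevance ranking the index alone doesn't give you.
    n = len(docs)
    doc_tokens = {doc_id: Counter(tokenize(body)) for doc_id, body in docs.items()}
    df = Counter()  # document frequency per term
    for tokens in doc_tokens.values():
        df.update(set(tokens))
    scores = {}
    for doc_id, tokens in doc_tokens.items():
        score = sum(tokens[t] * math.log(1 + n / df[t])
                    for t in tokenize(query) if t in tokens)
        if score > 0:
            scores[doc_id] = score
    return sorted(scores.items(), key=lambda kv: -kv[1])[:top_n]
```

Even this toy version shows where the hard decisions live: what counts as a token, how to weight rarity, and what to do when nothing matches well.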
There's a reason Google became so popular as quickly as it did. It's even harder to compete in this space nowadays, as the volume of junk and SEO spam is many orders of magnitude worse as a percentage of the corpus than it was back then.
They always ignore boundaries. I prohibit agents from accessing version control at all. That makes sure I review code before it gets committed, and they can’t do stupid things like force-push.
I think you’re missing the point. Of course you can make such a product. As Steve says right after, he himself made that mistake a lot. The point is that to make something great (at several levels of great, not just “makes money”) you have to start with the need and build a solution, not have a solution and shoehorn it to a need.
The internet is an entirely different beast and does not at all support your point. What we have on the web is hacks on top of hacks. It was not built to do all the things we push it to do, and if you understand where to look, it shows.
Please stop engaging in bad faith; that is not what I said.
You are asking the wrong people; ask the AT-SPI/ATK and GUI toolkit people to design the required interfaces (perhaps leveraging the dynamic nature of Wayland, or some other custom and simple protocols) and to develop and maintain the complexity required for those interfaces to work.
Those interfaces are deeply intrusive, which defeats client-application isolation and the compositor's independence from niche complexity, a cornerstone of Wayland. That's why those complex compositors (or "modules" of some huge compositors) will be on demand only (and they are highways for malware and spyware).
“Investors including the software giant Oracle; MGX, an Emirati investment firm; and Silver Lake, another investment firm, will own more than 80 percent of the new venture. That list also includes the personal investment entity for Michael Dell, the tech billionaire behind Dell Technologies, and other firms, TikTok said. Adam Presser, TikTok’s former head of operations, will be the chief executive for the U.S. TikTok”
> Just knowing that there is such different ways of thinking is useful
We agree
> there clearly is a stark divide in behavior
How are you sure it's not confirmation bias?
> Why not?
Because apparently, from what we actually know (robust, established knowledge), there's no good reason to think the following is actually true, even if it strongly feels like it, which is my whole concern:
> this framework is good to understand how people think socially and have a better understanding towards one another
Russia is by far the most advanced. However, its "BN-800" reactor, which began operating in 2014, is so uninspiring that they have abstained from deploying any more of them, preferring the classic "VVER" (non-breeder) models, and its planned successor, the "BN-1200M," has been postponed to 2035.
This doesn't represent an abandonment of breeder reactors, as Russia is actively exploring another avenue: the "BREST" architecture (lead coolant rather than sodium), with a small demonstration reactor (300 MW) whose construction began in 2021, essentially "back to square one."
This scale is only possible because of what they don't ask Postgres to do.
If you treat Postgres strictly as a Key-Value store for transactional integrity, it flies. But if you try to run complex discovery or tsvector search on that same primary, the locking/CPU load kills it instantly.
The real hero here is aggressive segregation: keep Postgres for 'Truth', and offload all discovery to specialized indices that don't block the write-head.
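Roughly the shape of that pattern, as a sketch (the table, the queue hook, and the psycopg2 wiring are all mine, not from the article):

```python
import psycopg2
from psycopg2.extras import Json  # pip install psycopg2-binary

# Source of truth: a narrow table Postgres can serve as a fast KV store.
# No tsvector columns, no search triggers sitting on the write path.
DDL = """
CREATE TABLE IF NOT EXISTS items (
    id   BIGSERIAL PRIMARY KEY,
    body JSONB NOT NULL
)
"""

def write_item(conn, payload: dict) -> int:
    """Transactional write only; discovery is someone else's job."""
    with conn.cursor() as cur:
        cur.execute("INSERT INTO items (body) VALUES (%s) RETURNING id",
                    (Json(payload),))
        item_id = cur.fetchone()[0]
    conn.commit()
    # Hand off to whatever feeds the dedicated search cluster
    # (Elasticsearch, Meilisearch, ...) so indexing load never
    # touches the primary.
    enqueue_for_search(item_id, payload)
    return item_id

def enqueue_for_search(item_id: int, payload: dict) -> None:
    # Hypothetical hook: in a real system, push to Kafka/Redis/etc.
    pass
```

The point is in what's absent: nothing on the hot write path takes heavy locks or burns CPU on ranking.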
Agree. It was not standard in the late 90s or early 00s. Most sites were custom built and relied on the _webmaster_ knowing and understanding how robots.txt worked. I've heard plenty of examples where people inadvertently blocked crawlers from their site because they got the syntax wrong. CMSes probably helped with the widespread adoption, e.g. WordPress.
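The classic mistake, for what it's worth, is the bare `Disallow: /`, which blocks the whole site when the author only meant to fence off one directory:

```
# Intended: block just one directory
User-agent: *
Disallow: /private/

# Accidental site-wide block (one missing path segment):
User-agent: *
Disallow: /
```

Easy to see how a webmaster hand-editing this in 1999 could nuke their own crawlability without noticing.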
True, you can’t exist without third parties (payments, infra, etc.). But you shouldn’t build your core moat on them. Jasper is a great example: they depended too much on LLM access, then ChatGPT launched and ate the value. Use third-party APIs, but own the differentiation, and treat third-party services as add-ons, not your foundation.