
Windows 10 KB5041582 update released with 5 changes and fixes

Microsoft has released the August 2024 preview update for Windows 10, version 22H2, with fixes for issues causing system freezes and memory leaks.

Today’s KB5041582 optional cumulative update is a maintenance release that lets Windows administrators test fixes and improvements and ensure a more reliable experience for end users when rolling out security updates during the mandatory September 2024 Patch Tuesday update cycle.

KB5041582 fixes memory leaks impacting the Input Method Editor (IME) and Bluetooth devices, resolves issues causing system freezes and preventing restarts, and also brings Country and Operator Settings Asset (COSA) profiles up to date for some mobile operators.

You can install the update from the Settings app by going to ‘Windows Update’ and clicking on ‘Check for Updates.’ Because this is an optional update, you will be asked whether you want to install it by clicking the ‘Download and install’ link.

Windows 10 users can also manually download and install KB5041582 from the Microsoft Update Catalog.

Windows KB5041582 preview update (BleepingComputer)

More highlights in Windows 10 KB5041582

Once installed, this preview release will update Windows 10 22H2 systems to build 19045.4842.

The complete list of fixes and changes in this month’s non-security preview update includes the following:

  • [Input Method Editor (IME)] When a combo box has input focus, a memory leak might occur when you close that window.
  • [Country and Operator Settings Asset] This update brings COSA profiles up to date for certain mobile operators.
  • [Bluetooth] An app stops responding because of a memory leak in a device.
  • [Bind Filter Driver] Your system might stop responding when it accesses symbolic links.
  • [Unified Write Filter (UWF) and Microsoft System Center Configuration Manager (SCCM)] An SCCM job to re-enable UWF fails because of a deadlock in UWF. This stops the machine from restarting when you expect it to.

Systems running Windows 10 versions 2004, 20H2, 21H1, or 21H2 can be upgraded to Windows 10 version 22H2 using the KB5015684 enablement package. This package activates inactive Windows 10 22H2 features that are still dormant.

Microsoft says this update is affected by two known issues, one preventing customers from changing their user account profile picture and one causing Linux boot issues on dual-boot systems (a workaround is available here).

In early June, Microsoft reopened the Windows 10 beta channel and announced the first Windows 10 Beta build since 2021, exactly three years after the last such build was rolled out to Insiders in the Beta and Release Preview channels.

One week ago, Redmond released the second Windows 10 22H2 beta build since June, with fixes for the Unified Write Filter (UWF) and Bind Filter Driver.

These Are Apple’s Oldest Products Still Sold Today


For over 20 years, the MacRumors Buyer’s Guide has served as a valuable resource for keeping track of when Apple products were last updated.

Below, we have highlighted five current-generation Apple products that haven’t received hardware upgrades in several years. We also provide some buying recommendations based on both rumors and our own opinions.

This list excludes previous-generation Apple products still sold, such as the iPhone 13, along with various accessories and cables.

Pro Display XDR

Apple released the Pro Display XDR in December 2019, so it will be five years old later this year. The high-end monitor starts at $4,999 and features a 32-inch screen size with 6K resolution, but it lacks a built-in camera and speakers.

In December 2022, Bloomberg‘s Mark Gurman said Apple was working on multiple new external monitors, including an updated version of the Pro Display XDR that will be equipped with an Apple silicon chip, just like the Studio Display with the A13 chip. However, it is unclear when the new Pro Display XDR will be released or what other new features it might have.

Age: 4 years and 8 months

Recommendation: Given how expensive the Pro Display XDR is, it is worth waiting for the next model at this point. Also consider the Studio Display, which is over two years newer, more affordable, and has a built-in camera and speakers.

HomePod mini

Released in November 2020, the HomePod mini will soon be four years old. The smaller Siri-enabled smart speaker has not received any hardware updates since launching, although it did get Blue, Orange, and Yellow color options in November 2021, and Space Gray was replaced with a virtually identical Midnight finish a few months ago.

In February 2023, Apple analyst Ming-Chi Kuo said mass shipments of a second-generation HomePod mini would begin in the second half of 2024, but it is unclear if that information is still accurate, as there have not been any recent HomePod mini rumors.

Age: 3 years and 9 months

Recommendation: Given there was a rumor that a new HomePod mini might be released later this year, it might be best to hold off for three to four more months to see if that happens. However, given the HomePod mini is a modest $99, there is not too much harm in purchasing one now if you do not want to wait.

AirPods Max

Released in December 2020, the AirPods Max will turn four years old later this year. Apple’s over-ear headphones have not received any hardware updates since, and remain priced at $549 on Apple’s online store, with five color options available.

Age: 3 years and 8 months

Recommendation: We recommend waiting for the AirPods Max with a USB-C port to launch later this year, but the headphones will still be more or less four years old beyond that change, so you may wish to consider newer competing options, such as the Sony XM5, Bose QuietComfort Ultra, and Sonos Ace. Amazon does have the AirPods Max on sale for $399 if you are interested in purchasing them now at a discount.

iPad mini

Apple released the current iPad mini in September 2021, with new features at the time including a larger 8.3-inch display, a USB-C port, a Touch ID power button, the A15 Bionic chip, 5G support on cellular models, a 12-megapixel rear camera with Center Stage support, compatibility with the second-generation Apple Pencil, and more.

In November 2023, Apple analyst Ming-Chi Kuo said mass production of the next iPad mini would begin in the second half of 2024. More recently, Bloomberg‘s Mark Gurman said inventory of the current iPad mini was starting to dwindle at Apple Stores, which could be a sign that the device will be updated in the near future.

Rumored features for the next iPad mini include a newer chip, upgraded cameras, Wi-Fi 6E and Bluetooth 5.3 support, new color options, and a fix for the so-called “jelly scrolling” effect on the current model’s display.

Age: 2 years and 11 months

Recommendation: At this point, we suggest waiting for the new iPad mini that is rumored to launch later this year.

AirPods 3

Released in October 2021, the third-generation AirPods are nearly three years old.

Two new fourth-generation AirPods models are expected to be unveiled at Apple’s special event on September 9, with both options rumored to feature a tweaked design with a better fit in the ear, improved sound quality, and an updated charging case with a USB-C port. The higher-end AirPods 4 are also said to feature active noise cancellation, and a speaker in the charging case that can play a sound for Find My location tracking.

Age: 2 years and 10 months

Recommendation: Don’t buy! Apple is expected to announce the fourth-generation AirPods in less than two weeks.

pgEdge Raises $10M to Advance Distributed PostgreSQL Platform



(monticello/Shutterstock)

pgEdge, a Virginia-based company specializing in distributed database solutions based on PostgreSQL, has announced $10 million in new funding. The investment will be used to expand the company’s operations and further develop its advanced distributed Postgres technology.

The funding round was led by Rally Ventures, with additional contributions from existing investors Sands Capital Ventures and Grotech Ventures.

pgEdge claims that it offers the only fully distributed PostgreSQL database that is both open source and completely based on standard PostgreSQL. This enables the company to offer a unique solution for enterprises requiring ultra-high availability and reduced latency across geographically dispersed regions.

According to the 2024 Stack Overflow survey, PostgreSQL remains the most popular database among developers. That is largely due to PostgreSQL’s reputation for reliability, scalability, and support for complex queries and operations.

pgEdge enhances PostgreSQL’s utility by incorporating multi-master (active-active) replication technology. Unlike traditional databases that remain centralized, pgEdge enables data to be distributed and optimized at the network edge.

Founded in 2022 by Phillip Merrick and Denis Lussier, pgEdge addresses the challenge of centralized databases in a world where other software components are increasingly distributed. pgEdge’s mission is to provide an open-source, Postgres-based distributed platform for modern applications that require swift response times, continuous availability, and global access.

The pgEdge founders have extensive experience in the startup ecosystem. Lussier is the founder of OpenSCG, a startup acquired by AWS in 2018. Following the acquisition, he joined AWS as a Postgres product manager for Aurora and RDS. Merrick has several startup credentials, including cofounder of EnterpriseDB, coinventor of VisualCV, CEO of Fugue, and CEO of SparkPost.

Along with the new funding, pgEdge also announced that Ben Fried has joined the company’s Board of Directors. Fried is a former CIO of Google and MD of Morgan Stanley. He recently joined Rally Ventures in a full-time role as a Partner.

Rally Ventures is known for its investments in early-stage enterprise technology startups, with a particular focus on companies that are creating significant new markets or introducing transformative approaches to existing ones.

In his 14-year tenure at Google, Fried played a pivotal role in overseeing the development and implementation of technologies that drive Google’s global business operations. At Morgan Stanley, Fried designed the firm’s web technology platform and led software development infrastructure and business intelligence teams.

“pgEdge is at the forefront of innovation in distributed PostgreSQL, a technology that is increasingly critical for enterprises needing ultra-high availability and reduced latency across geographic regions,” said Fried.

“The company’s highly experienced leadership team, with a proven track record in scaling early-stage companies, coupled with the company’s industry-leading and open distributed Postgres technology, made this a compelling addition to our portfolio.”

Cofounder Merrick also shared his enthusiasm in the press release for the new funding and Fried’s onboarding, saying he is confident that this investment and Fried’s expertise will advance pgEdge’s mission. Merrick emphasized that pgEdge’s strong revenue cycle and the addition of major SaaS customers reflect the company’s growing momentum in the market.

The latest funding round for pgEdge underscores the dynamic nature of the distributed database and edge computing market. As companies like pgEdge, Cockroach Labs, and MongoDB vie for prominence, the influx of new capital could be the boost pgEdge needs to further innovate and expand its database solutions.

Related Items

ClickHouse Acquires PeerDB to Advance Postgres CDC Integration Solutions

AWS Cancels Serverless Postgres Service That Scales to Zero

MariaDB Unveils Distributed SQL Vision at OpenWorks 2023, Boosting Scalability for MySQL and PostgreSQL Communities

AuthenticID Unveils Enhanced Smart ReAuth™ for Instant Biometric Reauthentication


PRESS RELEASE

SEATTLE, WA – August 28, 2024 – AuthenticID, a global leader in identity verification and fraud prevention solutions, announced the latest release of Smart ReAuth™. Smart ReAuth™ is a reauthentication solution that leverages a selfie to re-confirm an identity. First launched in 2021, this enhanced version provides verifications in less than one second, with rapid deployment.

Smart ReAuth™’s biometric-based approach ensures fast, accurate, and secure identity verification. During enrollment, users pass through a full identity verification transaction to verify them as a good, authenticated customer. Moving forward, users can more seamlessly prove their identity for use cases like high-risk or high-dollar-value transactions, to reset a password, or simply to log into an application. This allows businesses to confidently verify customers or employees at any point in their journey, while blocking fraudsters and account takeovers (ATO).

For businesses, Smart ReAuth™ offers flexible deployment options. It can be rapidly deployed via a web service or seamlessly embedded into a client’s own web or mobile workflow. The intuitive merchant manager provides a user-friendly admin console and a seamless experience for users. Smart ReAuth™ can be customized based on specific use cases and risk tolerance.

Security and privacy are paramount with the Smart ReAuth™ platform. Even previously verified identities can be compromised by sophisticated fraud and spoofing attempts. With liveness detection and anti-fraud checks, Smart ReAuth™ confirms the person is present and is who they claim to be. The platform also provides secure encrypted storage: once a user is reverified, their information is stored in a separate, secured, and encrypted database within the AWS cluster, with custom data retention policies allowing prompt data deletion.

As documented in both the 2024 State of Identity Fraud Report and the 2024 Mid-Year Identity Fraud Review, identity fraud attempts as well as data breaches are at record highs, impacting both consumers and businesses amidst surges in fraud. Bad actors are constantly finding new security loopholes to exploit, allowing them access to systems and sensitive data. Because stopping fraud is an ongoing process, additional security measures like Smart ReAuth™ are now a necessary tool for businesses to reduce fraud loss, especially against bad actors who attempt to access systems or data over multiple attempts.

“Businesses are constantly trying to balance both security and customer experience, but with the rise in sophisticated identity fraud threats, businesses need the improved security that biometrics provides,” noted Alex Wong, AuthenticID VP of Product Management. “Smart ReAuth™ was developed to ensure that this solution isn’t just a powerful tool to fight fraud, but also improves the customer experience, giving fraud-weary customers peace of mind while eliminating the need for cumbersome processes like KBA or passwords.”

Smart ReAuth™ is part of AuthenticID’s comprehensive identity verification technology suite, which offers fully automated verifications with over 99% accuracy in detecting counterfeit IDs and 98% first-time pass rates. Smart ReAuth™ can be used in both consumer and workforce use cases, including but not limited to:

  • Account takeover protection via step-up: Smart ReAuth™ provides powerful protection against ATO attempts and deepfake detection, which may occur during account recovery, password or account change requests, large purchase attempts, lost/stolen card requests, and more.

  • Workforce Access: Protect sensitive data while allowing employees and contractors easy access to systems based on security levels set by the business.

  • Zero Trust Model: Use biometric Smart ReAuth™ with a zero-trust security architecture to ensure users have the resources they need while fighting fraud.

AuthenticID will continue to drive innovation forward in its technology to ensure companies can stay ahead of changing fraud tactics and regulatory requirements while delivering a best-in-class customer experience.

For more information on AuthenticID’s Smart ReAuth™ platform and to request a demo, visit www.authenticid.com/products/smart-reauth

About AuthenticID

AuthenticID is a pioneer in the Identity Verification and Fraud Detection space. AuthenticID’s fully automated solution protects identities and mitigates fraud without sacrificing user experience. Founded in 2001, AuthenticID has deep domain expertise in verifying government-issued identification documents with the market’s most powerful, next-generation, automated AI platform for fraud detection and identity verification. AuthenticID’s patented platform is used by the world’s leading financial services firms, wireless carriers, and identity verification platforms.



Posit AI Blog: Revisiting Keras for R

Before we even talk about new features, let us answer the obvious question. Yes, there will be a second edition of Deep Learning for R! Reflecting what has been happening in the meantime, the new edition covers an extended set of proven architectures; at the same time, you’ll find that intermediate-to-advanced designs already present in the first edition have become rather more intuitive to implement, thanks to the new low-level enhancements alluded to in the summary.

But don’t get us wrong – the scope of the book is completely unchanged. It is still the perfect choice for people new to machine learning and deep learning. Starting from the basic ideas, it systematically progresses to intermediate and advanced topics, leaving you with both a conceptual understanding and a bag of useful application templates.

Now, what has been happening with Keras?

State of the ecosystem

Let us start with a characterization of the ecosystem, and a few words on its history.

In this post, when we say Keras, we mean R – as opposed to Python – Keras. Now, this immediately translates to the R package keras. But keras alone wouldn’t get you far. While keras provides the high-level functionality – neural network layers, optimizers, workflow management, and more – the basic data structure operated upon, tensors, lives in tensorflow. Thirdly, as soon as you need to perform less-than-trivial pre-processing, or can no longer hold the whole training set in memory because of its size, you’ll want to look into tfdatasets.

So it is these three packages – tensorflow, tfdatasets, and keras – that should be understood by “Keras” in the current context. (The R-Keras ecosystem, on the other hand, is quite a bit bigger. But other packages, such as tfruns or cloudml, are more decoupled from the core.)

Matching their tight integration, the aforementioned packages tend to follow a common release cycle, itself dependent on the underlying Python library, TensorFlow. For each of tensorflow, tfdatasets, and keras, the current CRAN version is 2.7.0, reflecting the corresponding Python version. The synchrony of versioning between the two Kerases, R and Python, might seem to indicate that their fates had developed in similar ways. Nothing could be less true, and knowing this can be helpful.

In R, between present-from-the-outset packages tensorflow and keras, responsibilities have always been distributed the way they are now: tensorflow providing indispensable basics, but often remaining completely transparent to the user; keras being the thing you use in your code. In fact, it is possible to train a Keras model without ever consciously using tensorflow.
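
To see what that means in practice, here is a minimal sketch – a made-up toy model, nothing more – in which only keras is attached and tensorflow never appears explicitly:

library(keras)

# Define and compile a model purely through the keras API;
# tensorflow does its work behind the scenes.
model <- keras_model_sequential() %>%
  layer_dense(units = 32, activation = "relu") %>%
  layer_dense(units = 1)

model %>% compile(optimizer = "adam", loss = "mse")
# model %>% fit(x_train, y_train, epochs = 5)  # x_train / y_train assumed given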

On the Python side, things have been undergoing significant changes, ones where, in some sense, the latter development has been inverting the first. In the beginning, TensorFlow and Keras were separate libraries, with TensorFlow providing a backend – one among several – for Keras to make use of. At some point, Keras code got incorporated into the TensorFlow codebase. Finally (as of today), following an extended period of slight confusion, Keras got moved out again, and has started to – again – considerably grow in features.

It is exactly that rapid growth that has created, on the R side, the need for extensive low-level refactoring and enhancements. (Of course, the user-facing new functionality itself also had to be implemented!)

Before we get to the promised highlights, a word on how we think about Keras.

Have your cake and eat it, too: A philosophy of (R) Keras

If you’ve used Keras in the past, you know what it has always been intended to be: a high-level library, making it easy (as far as such a thing can be easy) to train neural networks in R. Actually, it’s not just about ease. Keras enables users to write natural-feeling, idiomatic-looking code. This, to a high degree, is achieved by its allowing for object composition through the pipe operator; it is also a consequence of its abundant wrappers, convenience functions, and functional (stateless) semantics.

However, due to the way TensorFlow and Keras have developed on the Python side – referring to the big architectural and semantic changes between versions 1.x and 2.x, first comprehensively characterized on this blog here – it has become more challenging to provide all of the functionality available on the Python side to the R user. In addition, maintaining compatibility with several versions of Python TensorFlow – something R Keras has always done – by necessity gets more and more challenging, the more wrappers and convenience functions you add.

So this is where we complement the above “make it R-like and natural, where possible” with “make it easy to port from Python, where necessary”. With the new low-level functionality, you won’t have to wait for R wrappers to make use of Python-defined objects. Instead, Python objects may be sub-classed directly from R; and any additional functionality you’d like to add to the subclass is defined in a Python-like syntax. What this means, concretely, is that translating Python code to R has become a lot easier. We’ll catch a glimpse of this in the second of our three highlights.

New in Keras 2.6/7: Three highlights

Among the many new capabilities added in Keras 2.6 and 2.7, we quickly introduce three of the most important.

  • Pre-processing layers significantly help to streamline the training workflow, integrating data manipulation and data augmentation.

  • The ability to subclass Python objects (already alluded to several times) is the new low-level magic available to the keras user, and which powers many user-facing enhancements below.

  • Recurrent neural network (RNN) layers gain a new cell-level API.

Of these, the first two definitely deserve some deeper treatment; more detailed posts will follow.

Pre-processing layers

Before the advent of these dedicated layers, pre-processing used to be done as part of the tfdatasets pipeline. You would chain operations as required; maybe, integrating random transformations to be applied while training. Depending on what you wanted to achieve, significant programming effort may have ensued.

This is one area where the new capabilities can help. Pre-processing layers exist for several types of data, allowing for the usual “data wrangling”, as well as data augmentation and feature engineering (as in, hashing categorical data, or vectorizing text).

The mention of text vectorization leads to a second advantage. Unlike, say, a random distortion, vectorization is not something that may be forgotten about once done. We don’t want to lose the original information, namely, the words. The same happens, with numerical data, with normalization: we need to keep the summary statistics. This means there are two kinds of pre-processing layers: stateless and stateful ones. The former are part of the training process; the latter are called in advance, as shown in the sketch below.
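
For example – a minimal sketch, with a hypothetical numeric matrix train_features standing in for real data – this is how a stateful normalization layer receives its statistics ahead of training:

library(keras)

# Stateful: adapt() runs over the data in advance, computing and storing
# the summary statistics (per-feature mean and variance).
normalizer <- layer_normalization()
adapt(normalizer, train_features)  # train_features: numeric matrix (assumed)

# From now on, calling the layer applies the stored transformation.
normalized <- normalizer(train_features)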

Stateless layers, on the other hand, can appear in two places in the training workflow: as part of the tfdatasets pipeline, or as part of the model.

This is, schematically, how the former would look.

library(tfdatasets)
dataset <- ... # define dataset
dataset <- dataset %>%
  dataset_map(function(x, y) list(preprocessing_layer(x), y))

While here, the pre-processing layer is the first in a larger model:

input <- layer_input(shape = input_shape)
output <- input %>%
  preprocessing_layer() %>%
  rest_of_the_model()
model <- keras_model(input, output)

We’ll talk about which way is preferable when, as well as showcase a few specialized layers, in a future post. Until then, please feel free to consult the detailed and example-rich vignette.

Subclassing Python

Imagine you wanted to port a Python model that made use of a custom constraint – say, one keeping a layer’s weights non-negative.
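
With %py_class%, such a Python class can be subclassed directly from R, its methods written in a Python-like syntax. What follows is a minimal sketch under stated assumptions, not the post’s exact code: NonNegative is an illustrative name, and we assume keras and tensorflow are attached, so that the Python-side base class is reachable as keras$constraints$Constraint.

library(keras)
library(tensorflow)

# Subclass the Python Constraint class directly from R via %py_class%;
# __call__ zeroes out negative weights.
NonNegative(keras$constraints$Constraint) %py_class% {
  "__call__" <- function(w) {
    w * tf$cast(w >= 0, w$dtype)
  }
}

# The subclass is then usable like any built-in constraint:
layer <- layer_dense(units = 8, kernel_constraint = NonNegative())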

See the vignette for numerous examples, syntactic sugar, and low-level details.

RNN cell API

Our third point is at least half as much a shout-out to excellent documentation as an alert to a new feature. The piece of documentation in question is a new vignette on RNNs. The vignette gives a useful overview of how RNNs work in Keras, addressing the usual questions that tend to come up once you haven’t been using them in a while: What exactly are states vs. outputs, and when does a layer return what? How do I initialize the state in an application-dependent way? What’s the difference between stateful and stateless RNNs?

In addition, the vignette covers more advanced questions: How do I pass nested data to an RNN? How do I write custom cells?

In fact, this latter question brings us to the new feature we wanted to call out: the new cell-level API. Conceptually, with RNNs, there are always two things involved: the logic of what happens at a single timestep; and the threading of state across timesteps. So-called “simple RNNs” are concerned with the latter (recursion) aspect only; they tend to exhibit the classic vanishing-gradients problem. Gated architectures, such as the LSTM and the GRU, have specifically been designed to avoid these problems; both can be easily integrated into a model using the respective layer_x() constructors. What if you’d like, not a GRU, but something like a GRU (using some fancy new activation method, say)?

With Keras 2.7, you can now create a single-timestep RNN cell (using the above-described %py_class% API), and obtain a recursive version – a complete layer – using layer_rnn():

rnn <- layer_rnn(cell = cell)
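
As a slightly fuller sketch – assuming the cell constructors that shipped alongside layer_rnn(), such as layer_gru_cell(), and made-up input sizes – a cell plugs into a model like any other layer:

library(keras)

# The cell supplies the single-timestep logic; layer_rnn() threads the
# state across timesteps, yielding a complete recurrent layer.
cell <- layer_gru_cell(units = 32)

input <- layer_input(shape = c(NA, 8))  # (timesteps, features); sizes assumed
output <- input %>%
  layer_rnn(cell = cell) %>%
  layer_dense(units = 1)
model <- keras_model(input, output)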

If you’re interested, check out the vignette for an extended example.

With that, we end our news from Keras for today. Thanks for reading, and stay tuned for more!

Photo by Hans-Jurgen Mager on Unsplash