
DriveNets extends AI networking fabric with multi-site capabilities for distributed GPU clusters



“We use the same physical architecture as anyone with top of rack and then leaf and spine switch,” Dudy Cohen, vice president of product marketing at DriveNets, told Network World. “But what happens between our top of rack, which is the switch that connects NICs (network interface cards) into the servers, and the rest of the network isn’t based on Clos Ethernet architecture, rather on a very specific cell-based protocol. [It’s] the same protocol, by the way, that’s used in the backplane of the chassis.”

Cohen explained that any data packet that comes into an ingress switch from the NIC is cut into evenly sized cells, sprayed across the entire fabric and then reassembled on the other side. This approach distinguishes DriveNets from other solutions that may require specialized components such as Nvidia BlueField DPUs (data processing units) at the endpoints.

“The fabric links between the top of rack and the spine are fully load balanced,” he said. “We don’t use any hashing mechanism… and that is why we can contain all of the congestion avoidance within the fabric and don’t need any external assistance.”
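To make the mechanism concrete, here is a minimal sketch (illustrative only, a toy in-memory model rather than DriveNets’ actual implementation) of cutting a packet into equal-size cells, spraying them evenly across all fabric links without per-flow hashing, and reassembling them by sequence number at egress:

```kotlin
// Toy model of cell-based spraying (not DriveNets code): a packet is cut
// into fixed-size cells, the cells are spread evenly over all fabric links,
// and the egress side reassembles them by sequence number.

data class Cell(val packetId: Long, val seq: Int, val total: Int, val payload: ByteArray)

const val CELL_SIZE = 256  // bytes; the real cell size is an implementation detail

fun sprayPacket(packetId: Long, packet: ByteArray, links: List<MutableList<Cell>>) {
    val chunks = packet.toList().chunked(CELL_SIZE)
    chunks.forEachIndexed { i, chunk ->
        val cell = Cell(packetId, i, chunks.size, chunk.toByteArray())
        links[i % links.size].add(cell)  // even round-robin spread, no flow hashing
    }
}

fun reassemble(cells: List<Cell>): ByteArray =
    cells.sortedBy { it.seq }                        // cells may arrive out of order
        .fold(ByteArray(0)) { acc, c -> acc + c.payload }

fun main() {
    val links = List(4) { mutableListOf<Cell>() }    // four equal-cost fabric links
    val packet = ByteArray(1500) { it.toByte() }     // one 1500-byte packet
    sprayPacket(packetId = 1L, packet = packet, links = links)

    // Egress: gather packet 1's cells from every link and rebuild the packet.
    val received = links.flatten().filter { it.packetId == 1L }
    check(reassemble(received).contentEquals(packet))
    println("cells per link: " + links.map { it.size })  // evenly balanced, e.g. [2, 2, 1, 1]
}
```

Because every link carries an even share of cells regardless of flow, no single flow can overload one path, which is the property the quote above attributes to the fabric.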

Multi-site implementation for distributed GPU clusters

The multi-site capability allows organizations to overcome power constraints in a single data center by spreading GPU clusters across locations.

This isn’t designed as a backup or failover mechanism. Lasser-Raab emphasized that it is a single cluster in two locations that can be up to 80 kilometers apart, which allows for connection to different power grids.

The physical implementation typically uses high-bandwidth connections between sites. Cohen explained that there is either dark fiber or some DWDM (Dense Wavelength Division Multiplexing) fiber optic connectivity between the sites. Typically the connections are bundles of four 800 gigabit Ethernet links, acting as a single 3.2 terabit per second connection.

Anchore SBOM, Komodor integrates into IDPs, and Shopify’s new dev tools – SD Times Daily Digest


Anchore is enabling “Bring Your Own SBOMs” with the release of Anchore SBOM, which provides a centralized place to view, manage, and analyze SBOMs created internally and from third-party software.

SBOMs can be imported if they are in SPDX versions 2.1-2.3, CycloneDX versions 1.0-1.6, or Syft native formats.
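For illustration, a hypothetical pre-import check (not part of Anchore SBOM) can tell the JSON variants of these formats apart by their well-known top-level fields; CycloneDX JSON documents carry bomFormat and specVersion, while SPDX JSON documents carry spdxVersion:

```kotlin
// Hypothetical helper, not Anchore's code: guess an SBOM's format from
// well-known top-level JSON fields before handing it to an importer.
import kotlinx.serialization.json.Json
import kotlinx.serialization.json.jsonObject
import kotlinx.serialization.json.jsonPrimitive

enum class SbomFormat { CYCLONEDX, SPDX, UNKNOWN }

fun detectSbomFormat(jsonText: String): Pair<SbomFormat, String?> {
    val root = Json.parseToJsonElement(jsonText).jsonObject
    return when {
        root["bomFormat"]?.jsonPrimitive?.content == "CycloneDX" ->
            SbomFormat.CYCLONEDX to root["specVersion"]?.jsonPrimitive?.content  // e.g. "1.6"
        root.containsKey("spdxVersion") ->
            SbomFormat.SPDX to root["spdxVersion"]?.jsonPrimitive?.content       // e.g. "SPDX-2.3"
        else -> SbomFormat.UNKNOWN to null
    }
}

fun main() {
    val cdx = """{"bomFormat":"CycloneDX","specVersion":"1.6","components":[]}"""
    println(detectSbomFormat(cdx))  // (CYCLONEDX, 1.6)
}
```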

“We built Anchore Enterprise to be embedded into the CI/CD pipeline – it analyzes OSS risks, enforces policy gates throughout delivery, and scans continuously thereafter. SBOMs are at the core of how we establish trust in the delivery pipeline and therefore in the software you’re delivering,” said Neil Levine, SVP of product at Anchore.

Komodor integrates into IDPs

Komodor is known for its day-2 Kubernetes operations management, spanning monitoring, troubleshooting, performance optimization, and cost management. With new support for Backstage and Port (and more to come), the company is bringing these management capabilities into developer workflows.

Key capabilities of the integration include the ability to view real-time status of deployed services, step-by-step troubleshooting instructions, performance monitoring, role-based access control, and fleet management for platform teams.

“Internal developer platforms have emerged to simplify software delivery, but Kubernetes remains a bottleneck that’s complex, opaque, and disconnected from the developer experience,” said Itiel Shwartz, co-founder and CTO of Komodor. “By embedding Komodor into Backstage and Port, we’re giving developers a secure and easy way to see, understand, and fix issues in their services, right from the portal. It’s the missing piece that makes IDPs truly self-service for addressing K8s issues.”

Shopify releases new developer tools

Shopify is launching a new unified developer platform that integrates the Dev Dashboard and CLI and offers AI-powered code generation. Developers can also now create “dev stores” where they can preview apps in test environments, a feature that was previously only available to Plus plans and is now available to all developers.

Other new features announced today include declarative custom data definitions, a unified Polaris UI toolkit, and Storefront MCP, which allows developers to build AI agents that can act as shopping assistants for stores.

Peacock built adaptively on Android to deliver great experiences across screens



Posted by Sa-ryong Kang and Miguel Montemayor – Developer Relations Engineers


Peacock is NBCUniversal’s streaming service app available in the US, offering culture-defining entertainment including live sports, exclusive original content, TV shows, and blockbuster movies. The app continues to evolve, becoming more than just a platform to watch content, but a hub of entertainment.

Today’s users are consuming entertainment on an increasingly wide array of device sizes and types, and in particular are moving towards mobile devices. Peacock has adopted Jetpack Compose to help with its journey in adapting to more screens and meeting users where they are.

Disclaimer: Peacock is available in the US only. This video will only be viewable to US viewers.

Adapting to more flexible form factors

The Peacock development team is focused on bringing the best experience to users, no matter what device they’re using or when they want to consume content. With an increasing trend among app users to watch more on mobile devices and large screens like foldables, the Peacock app needs to be able to adapt to different screen sizes. As more devices are released, the team needed to explore new solutions that make the most out of each unique display permutation.

The goal was to have the Peacock app adapt to these new displays while continually offering high-quality entertainment without interruptions, like the stream reloading or visual errors. While thinking ahead, they also wanted to prepare and build a solution that was ready for Android XR as the entertainment landscape is shifting towards including more immersive experiences.

Quote card featuring a headshot of Diego Valente, Head of Mobile, Peacock & Global Streaming, reads: “Thinking adaptively isn’t just about supporting tablets or large screens – it’s about future-proofing your app. Investing in adaptability helps you meet users’ expectations of having seamless experiences across all their devices and sets you up for what’s next.”

Building a future-proof experience with Jetpack Compose

In order to build a scalable solution that could help the Peacock app continue to evolve, the app was migrated to Jetpack Compose, Android’s toolkit for building scalable UI. One of the main tools they used was the WindowSizeClass API, which helps developers create and test UI layouts for different size ranges. The API then allows the app to seamlessly switch between pre-set layouts as it reaches established viewport breakpoints for different window sizes.

The API was used in conjunction with Kotlin Coroutines and Flows to keep the UI state responsive as the window size changed. To test their work and fine-tune edge-case devices, Peacock used the Android Studio emulator to simulate a range of Android-based devices.
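A minimal Compose sketch of this pattern (illustrative only, not Peacock’s code; CompactBrowseScreen and ExpandedBrowseScreen are placeholder composables) might look like this:

```kotlin
// Illustrative sketch (not Peacock's code): pick a layout from the window
// size class so the UI adapts when the window is resized or the device is
// unfolded. CompactBrowseScreen / ExpandedBrowseScreen are placeholders.
import androidx.activity.ComponentActivity
import androidx.compose.material3.Text
import androidx.compose.material3.windowsizeclass.ExperimentalMaterial3WindowSizeClassApi
import androidx.compose.material3.windowsizeclass.WindowWidthSizeClass
import androidx.compose.material3.windowsizeclass.calculateWindowSizeClass
import androidx.compose.runtime.Composable

@Composable fun CompactBrowseScreen() = Text("Single-pane layout for phones")
@Composable fun ExpandedBrowseScreen() = Text("Two-pane layout for tablets, foldables, and TVs")

@OptIn(ExperimentalMaterial3WindowSizeClassApi::class)
@Composable
fun AdaptiveBrowseScreen(activity: ComponentActivity) {
    // Recomposes with a new size class whenever the window crosses a breakpoint,
    // e.g. when a foldable is unfolded, so no manual configuration handling is needed.
    val windowSizeClass = calculateWindowSizeClass(activity)
    when (windowSizeClass.widthSizeClass) {
        WindowWidthSizeClass.Compact -> CompactBrowseScreen()
        else -> ExpandedBrowseScreen()  // Medium and Expanded share the larger layout here
    }
}
```

Keying pre-set layouts to breakpoints like this is what lets the UI switch automatically as the window size changes.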

Jetpack Compose allowed the team to build adaptively, so now the Peacock app responds to a wide variety of screens while offering a seamless experience to Android users. “The app feels more native, more fluid, and more intuitive across all form factors,” said Diego Valente, Head of Mobile, Peacock and Global Streaming. “That means users can start watching on a smaller screen and continue instantly on a larger one when they unfold the device – no reloads, no friction. It just works.”

Preparing for immersive entertainment experiences

In building adaptive apps on Android, John Jelley, Senior Vice President, Product & UX, Peacock and Global Streaming, says Peacock has also laid the groundwork to quickly adapt to the Android XR platform: “Android XR builds on the same large screen principles; our investment here naturally extends to those emerging experiences with less development work.”

The team is excited about the prospect of features unlocked by Android XR, like Multiview for sports and TV, which enables users to watch multiple games or camera angles at once. By tailoring spatial windows to the user’s environment, the app could offer new ways for users to interact with contextual metadata like sports stats or actor information, all without ever interrupting their experience.

Build adaptive apps

Learn how to unlock your app’s full potential on phones, tablets, foldables, and beyond.

Explore this announcement and all Google I/O 2025 updates on io.google starting May 22.

Connecting a New Generation of Moving Assets with Cisco Ultra-Reliable Wireless Backhaul


What traits does a wireless network need to connect moving assets? Answer: it depends on the stakes. Occasional lapses in connectivity and a bit of latency might be okay if you’re catching up on email while riding a bus, but not if you’re remotely controlling a 100,000-pound bulldozer on a steep dam slope. That’s why Aterpa, a Brazilian engineering firm hired to decommission Brazil’s tailings dams, uses Cisco Ultra-Reliable Wireless Backhaul (URWB) to remotely control heavy equipment like bulldozers and excavators.

Remote control of unmanned systems can’t tolerate network latency

Following dam collapses in 2015 and 2019, large mining companies committed to preventing future catastrophes by launching programs to decommission tailings dams (tailings are mud-like byproducts of ore mining) built using the same construction methods as the failed ones. It’s dangerous work because heavy machinery needs to move across unstable ground, and missteps can lead to accidents. “Dam decharacterization, or decommissioning, is a relatively new field,” explains Rodrigo Campos, contract manager at Aterpa. “You must remove the tailings in the dam, but you can’t allow people to enter because of the safety issues.”

To safely decommission dams in the Brazilian state of Minas Gerais, Aterpa decided to use unmanned systems. The plan: operators working in a central command center would view real-time camera feeds and sensor data from heavy equipment like excavators to remotely control these systems over the network. The challenging part was establishing a wireless outdoor network with the required reliability, low latency, and seamless handoffs. Existing mesh solutions and even the latest Wi-Fi versions aren’t designed for these demands, as even minimal jitter and latency can impact operations. If a video feed is delayed by even half a second, for example, an operator might not see an obstacle in time to maneuver around it or brake, possibly sending the equipment toppling down the dam slope.

Ultra-reliable wireless for moving equipment

Aterpa found its answer in Cisco URWB, which provides the ultra-low latency (<10 ms), seamless handoffs, and uninterrupted connectivity needed to remotely control moving assets in high-stakes environments. “In our operations center we’ve recreated the experience of a traditional vehicle cab, complete with a steering wheel and controls,” says Campos. Operators see what they would see if they were sitting in the physical cab on the slope, allowing them to safely steer equipment and remove tailings. “With Cisco URWB, when the remote operator issues a command, the equipment responds instantly,” Campos adds.

Between September 2022 and April 2025, Aterpa removed roughly 1.7 million cubic meters (2.22 million cubic yards) of waste with teleremote operations over Cisco URWB. The communities near the dams are no longer threatened. Operators are safer and the environment is cleaner. “After almost three years of unmanned operation over Cisco URWB, we’ve seen that when the network is reliable, the operation is reliable,” says Campos. “Our operating model has become a benchmark for the industry.”

3 ways Cisco URWB stands apart

What set Cisco URWB apart from other wireless technologies for high-stakes operations like this one? First, URWB delivers near-zero (<10 ms) latency. Second, when a connected device (say, an excavator, robot, or train) roams between access points, the connection doesn’t break until the next connection is established. We call that “make-before-break” connectivity. Finally, URWB sends duplicate copies of packets (like sensor data from moving assets and commands from operators) over up to eight redundant paths.
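Conceptually, the redundancy scheme can be sketched in a few lines (a toy model, not Cisco’s implementation): the sender transmits a copy of each packet on every available path, and the receiver keeps only the first copy of each sequence number to arrive:

```kotlin
// Toy sketch of path redundancy with duplicate elimination (not Cisco's
// implementation): send a copy of every packet down each path and keep
// only the first copy that reaches the receiver.

data class Packet(val seq: Long, val payload: String)

class RedundantSender(private val paths: List<(Packet) -> Unit>) {
    private var nextSeq = 0L
    fun send(payload: String) {
        val packet = Packet(nextSeq++, payload)
        paths.forEach { deliver -> deliver(packet) }  // duplicate over every path
    }
}

class DeduplicatingReceiver(private val onDeliver: (Packet) -> Unit) {
    private val seen = HashSet<Long>()
    fun receive(packet: Packet) {
        if (seen.add(packet.seq)) onDeliver(packet)   // first copy wins; later duplicates are dropped
    }
}

fun main() {
    val receiver = DeduplicatingReceiver { println("delivered: $it") }
    // Two "paths" ending at the same receiver; a lossy path could simply drop the packet.
    val pathA: (Packet) -> Unit = { receiver.receive(it) }
    val pathB: (Packet) -> Unit = { receiver.receive(it) }
    val sender = RedundantSender(listOf(pathA, pathB))
    sender.send("brake")  // delivered exactly once even though it was sent twice
}
```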

Learn more

Find us in the World of Solutions at Cisco Live to see URWB in action and register for the PSOIOT-1020 session.

Read about Cisco Ultra-Reliable Wireless Backhaul.


Android Developers Blog: Updates to the Android XR SDK: Introducing Developer Preview 2



Posted by Matthew McCullough – VP of Product Management, Android Developer

Since launching the Android XR SDK Developer Preview alongside Samsung, Qualcomm, and Unity last year, we’ve been blown away by all the excitement we’ve been hearing from the broader Android community. Whether it’s through coding live-streams or local Google Developer Group talks, it’s been an incredible experience participating in the community to build the future of XR together, and we’re just getting started.

Today we’re excited to share an update to the Android XR SDK: Developer Preview 2, packed with new features and improvements to help you develop helpful and delightful immersive experiences with familiar Android APIs, tools, and open standards created for XR.

At Google I/O, we have two technical sessions related to Android XR. The first is Building differentiated apps for Android XR with 3D content, which covers many features present in Jetpack SceneCore and ARCore for Jetpack XR. The second, The future is now, with Compose and AI on Android XR, covers creating XR-differentiated UI and our vision on the intersection of XR with cutting-edge AI capabilities.

Android XR sessions at Google I/O 2025

Building differentiated apps for Android XR with 3D content and The future is now, with Compose and AI on Android XR

What’s new in Developer Preview 2

Since the launch of Developer Preview 1, we’ve been focused on making the APIs easier to use and adding new immersive Android XR features. Your feedback has helped us shape the development of the tools, SDKs, and the platform itself.

With the Jetpack XR SDK, you can now play back 180° and 360° videos, which can be stereoscopic by encoding with the MV-HEVC specification or by encoding view-frames adjacently. The MV-HEVC standard is optimized and designed for stereoscopic video, allowing your app to efficiently play back immersive videos at great quality. Apps built with Jetpack Compose for XR can use the SpatialExternalSurface composable to render media, including stereoscopic videos.

Using Jetpack Compose for XR, you can now also define layouts that adapt to different XR display configurations. For example, use a SubspaceModifier to specify the size of a Subspace as a percentage of the device’s recommended viewing size, so a panel effortlessly fills the space it is placed in.

Material Design for XR now supports more component overrides for TopAppBar, AlertDialog, and ListDetailPaneScaffold, helping your large-screen enabled apps that use Material Design effortlessly adapt to the new world of XR.


An app adapts to XR using Material Design for XR with the new component overrides

In ARCore for Jetpack XR, you can now track hands after requesting the appropriate permissions. Hands are a set of 26 posed hand joints that can be used to detect hand gestures and bring a whole new level of interaction to your Android XR apps (a toy gesture sketch follows the image below):


Hands bring a natural input method to your Android XR experience.
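As a toy illustration of how posed joints turn into gestures (self-contained stand-in types, not the ARCore for Jetpack XR API), a pinch can be detected by comparing the distance between the thumb tip and index fingertip joints against a threshold:

```kotlin
// Toy illustration only: the types below are stand-ins, not the ARCore for
// Jetpack XR API. A gesture like "pinch" falls out of comparing the poses
// of two of the 26 tracked hand joints.
import kotlin.math.sqrt

data class JointPose(val x: Float, val y: Float, val z: Float)  // stand-in for a posed joint
enum class Joint { THUMB_TIP, INDEX_TIP /* ...24 more joints... */ }

fun distance(a: JointPose, b: JointPose): Float {
    val dx = a.x - b.x; val dy = a.y - b.y; val dz = a.z - b.z
    return sqrt(dx * dx + dy * dy + dz * dz)
}

// A pinch is "thumb tip close to index tip"; 2 cm is an arbitrary demo threshold.
fun isPinching(joints: Map<Joint, JointPose>, thresholdMeters: Float = 0.02f): Boolean {
    val thumb = joints[Joint.THUMB_TIP] ?: return false
    val index = joints[Joint.INDEX_TIP] ?: return false
    return distance(thumb, index) < thresholdMeters
}

fun main() {
    val joints = mapOf(
        Joint.THUMB_TIP to JointPose(0.10f, 0.00f, 0.30f),
        Joint.INDEX_TIP to JointPose(0.11f, 0.00f, 0.30f),
    )
    println(isPinching(joints))  // true: the tips are about 1 cm apart
}
```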

For more guidance on developing apps for Android XR, check out our Android XR Fundamentals codelab, the updates to our Hello Android XR sample project, and a new version of JetStream with Android XR support.

The Android XR Emulator has also received updates to stability, support for AMD GPUs, and is now fully integrated within the Android Studio UI.


The Android XR Emulator is now integrated in Android Studio

Developers using Unity have already successfully created and ported existing games and apps to Android XR. Today, you can upgrade to the Pre-Release version 2 of the Unity OpenXR: Android XR package! This update adds many performance improvements such as support for Dynamic Refresh Rate, which optimizes your app’s performance and power consumption. Shaders made with Shader Graph now support SpaceWarp, making it easier to use SpaceWarp to reduce compute load on the device. Hand meshes are now exposed with occlusion, which enables realistic hand visualization.

Check out Unity’s improved Mixed Reality template for Android XR, which now includes support for occlusion and persistent anchors.

We recently launched Android XR Samples for Unity, which demonstrate capabilities on the Android XR platform such as hand tracking, plane tracking, face tracking, and passthrough.


Google’s open-source Unity samples demonstrate platform features and show how they’re implemented

Firebase AI Logic for Unity is now in public preview! This makes it easy for you to integrate gen AI into your apps, enabling the creation of AI-powered experiences with Gemini and Android XR. Firebase AI Logic fully supports Gemini’s capabilities, including multimodal input and output, and bi-directional streaming for immersive conversational interfaces. Built with production readiness in mind, Firebase AI Logic is integrated with core Firebase services like App Check, Remote Config, and Cloud Storage for enhanced security, configurability, and data management. Learn more about this on the Firebase blog or go straight to the Gemini API using Vertex AI in Firebase SDK documentation to get started.

Continuing to build the future together

Our commitment to open standards continues with the glTF Interactivity specification, in collaboration with the Khronos Group, which will be supported in glTF models rendered by Jetpack XR later this year. Models using the glTF Interactivity specification are self-contained interactive assets that can have many pre-programmed behaviors, like rotating objects on a button press or changing the color of a material over time.

Android XR will be available first on Samsung’s Project Moohan, launching later this year. Soon after, our partners at XREAL will release the next Android XR device. Codenamed Project Aura, it’s a portable and tethered device that gives users access to their favorite Android apps, including those that have been built for XR. It will launch as a developer edition, specifically for you to begin creating and experimenting. The best news? With the familiar tools you use to build Android apps today, you can build for these devices too.


XREAL’s Project Aura

The Google Play Store is also getting ready for Android XR. It will list supported 2D Android apps on the Android XR Play Store when it launches later this year. If you are working on an Android XR differentiated app, you can get it ready for the big launch and be one of the first differentiated apps on the Android XR Play Store.

And we know many of you are excited for the future of Android XR on glasses. We are shaping the developer experience now and will share more details on how you can participate later this year.

To get started creating and developing for Android XR, check out developer.android.com/develop/xr where you will find all of the tools, libraries, and resources you need to work with the Android XR SDK. In particular, take a look at our samples and codelabs.

We welcome your feedback, suggestions, and ideas as you help shape Android XR. Your passion, expertise, and bold ideas are vital as we continue to develop Android XR together. We look forward to seeing your XR-differentiated apps when Android XR devices launch later this year!

Explore this announcement and all Google I/O 2025 updates on io.google starting May 22.