
ios – SwiftUI Image Sharing: Screenshot doesn't load on first share attempt, but loads afterwards


I have a simple solitaire game where I want to share a screenshot of winning hands via the share sheet. I capture the screenshot fine, but when I go to share (mainly testing via Gmail), the first time the share sheet for Gmail pops up, the image is not attached. If I close the Gmail message and try again, it loads the image. Is there some sort of delay I need to account for in the rendering process? I've read a bunch of other threads that each touch on parts of the same issue, but nothing conclusive, and I haven't been able to fix it. ShareLink looks like it could be a good option, but I can't seem to get it to work, since it crashes my preview every time. Here is my snapshot (screenshot) function:

extension WinningHandView {
    func snapshot(origin: CGPoint = .zero, size: CGSize = .zero) -> UIImage {
        let controller = UIHostingController(rootView: self)
        let view = controller.view

        let targetSize = size == .zero ? controller.view.intrinsicContentSize : size
        view?.backgroundColor = .clear
        view?.bounds = CGRect(origin: origin, size: targetSize)

        let renderer = UIGraphicsImageRenderer(size: targetSize)

        return renderer.image { _ in
            view?.drawHierarchy(in: controller.view.bounds, afterScreenUpdates: true)
        }
    }
}

Here is the Button I'm using to launch the share sheet:

Button("Share", action: {
    let image = self.snapshot()
    //let sharingImage = Image(uiImage: image).renderingMode(.original)

    let activityVC = UIActivityViewController(activityItems: [image], applicationActivities: nil)
    UIApplication.shared.connectedScenes
        .compactMap { $0 as? UIWindowScene }
        .first?.windows.first?.rootViewController?
        .present(activityVC, animated: true, completion: nil)
})

It seems like some sort of race condition, or a delay is needed; if that's the case, how would I implement it? I've only been working with SwiftUI for a few weeks, so I'm still very much learning the ropes.
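One direction worth sketching (an assumption on my part, not a confirmed fix): render the image once, before the share UI is requested, and hand the finished image to iOS 16's `ShareLink`, so the sheet never races the renderer. The `WinningHandShareButton` wrapper below is hypothetical; `snapshot()` is the function from the question.

```swift
import SwiftUI

// Hypothetical wrapper view; `WinningHandView` and its `snapshot()` come from the code above.
struct WinningHandShareButton: View {
    let handView: WinningHandView

    var body: some View {
        // Render the screenshot eagerly so the share sheet receives a finished image,
        // rather than racing a render that happens inside the button action.
        let shared = Image(uiImage: handView.snapshot())
        ShareLink(item: shared, preview: SharePreview("Winning hand", image: shared))
    }
}
```

If ShareLink keeps crashing the preview, the same idea applies to the UIActivityViewController path: compute `self.snapshot()` first, then present the controller from a `DispatchQueue.main.async` block so presentation happens after the current layout pass completes.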

Agentic AI at Glean with Eddie Zhou


Glean is a workplace search and knowledge discovery company that helps organizations find and access information across various internal tools and data sources. Their platform uses AI to provide personalized search results, helping members of an organization retrieve relevant documents, emails, and conversations. The rise of LLM-based agentic reasoning systems now presents new opportunities to build advanced functionality on top of an organization's internal data.

Eddie Zhou is a founding engineer at Glean and previously worked at Google. He joined Sean Falconer to discuss the engineering and design considerations around building agentic tooling to enhance productivity and decision-making.

Sean's been an academic, startup founder, and Googler. He has published works covering a wide range of topics from AI to quantum computing. Currently, Sean is an AI Entrepreneur in Residence at Confluent, where he works on AI strategy and thought leadership. You can connect with Sean on LinkedIn.

Sponsors

This episode is sponsored by Mailtrap, an email platform developers love.

Opt for high deliverability, industry-best analytics, and live 24/7 support.

Get 20% off all plans with our promo code SEDAILY.
Check out Mailtrap.io to sign up.

Developers, we've all been there… It's 3 AM and your phone blares, jolting you awake. Another alert. You scramble to troubleshoot, but the complexity of your microservices environment makes it nearly impossible to pinpoint the problem quickly.

That's why Chronosphere is on a mission to help you take back control with Differential Diagnosis, a new distributed tracing feature that takes the guesswork out of troubleshooting. With just one click, DDx automatically analyzes all spans and dimensions related to a service, pinpointing the most likely cause of the issue.

Don't let troubleshooting drag you into the early hours of the morning. Just "DDx it" and resolve issues faster.

See why Chronosphere was named a leader in the 2024 Gartner Magic Quadrant for Observability Platforms at chronosphere.io/sed.

This episode of Software Engineering Daily is brought to you by Capital One.

How does Capital One stack? It starts with applied research and leveraging data to build AI models. Their engineering teams use the power of the cloud, plus platform standardization and automation, to embed AI solutions throughout the business. Real-time data at scale enables these proprietary AI solutions to help Capital One improve the financial lives of its customers. That's technology at Capital One.

Learn more about how Capital One's modern tech stack, data ecosystem, and application of AI/ML are central to the business by visiting www.capitalone.com/tech.

Huawei set to ship 910C AI chips at scale, signaling shift in global AI supply chain



"From a performance standpoint, Nvidia's new-generation chips, such as the B200 and the upcoming B300 Ultra, based on TSMC's 4nm process and equipped with advanced HBM3/3E memory, have significantly widened the gap compared to Huawei's 910C, which is likely built on SMIC's N+2 7nm process (effectively 14nm) and lacks advanced HBM memory," said Neil Shah, partner and co-founder at Counterpoint Research.

Shah noted that while Huawei's chip may theoretically match the older Nvidia A100 or H100 in some tasks, it would require more power and heavy software optimization to handle diverse AI workloads.

"Global adoption of Huawei's 910C will be hindered by limited developer support, ecosystem maturity, and integration challenges," said Manish Rawat, semiconductor analyst at TechInsights. "Nonetheless, it presents a viable alternative to Nvidia's chips for Chinese enterprises or those affected by geopolitical constraints, especially as US export controls limit access to advanced Nvidia GPUs."

Implications for enterprise AI adoption

Despite not being the best available, the 910C could still prove viable for many enterprise and hyperscale AI use cases.

It may take longer to train models compared to US-designed chips, but for many, it's an acceptable trade-off given current geopolitical and supply chain risks.

"For enterprises, particularly those operating in or sourcing from China, it presents a credible alternative in the face of tightening US export controls, while supporting domestic innovation ecosystems like DeepSeek and reducing dependence on foreign technologies," said Prabhu Ram, VP of the industry research group at Cybermedia Research. "Although Nvidia maintains an edge in software maturity and energy efficiency, Huawei's progress reflects China's growing strength in competing at the forefront of AI hardware in the emerging AI era."

If companies operating in or sourcing from China adopt Huawei's ecosystem, it could significantly influence their procurement strategies, vendor selection, and technology evaluations, driving greater alignment with Chinese technologies.

ios – SwiftData doesn't save Model


I've created a small application that can read in a sales check picture from a vendor, in this case Rewe. Loading the data and creating it in the SalesCheckView is no problem, and the try/catch block doesn't throw an error. But when I go into the SalesCheckListView, I get an empty SwiftData query for my SalesCheck. I have other queries in other views that do work, but this one doesn't do what I want. Maybe somebody sees my mistake. Thanks in advance.

My app looks like this:

Main

import SwiftUI
import SwiftData

@main
struct TestAppApp: App {
    
    var body: some Scene {
        WindowGroup {
            ContentView()
        }.modelContainer(for: [ReceiptModel.self, FoodModel.self, ReceiptFoodQuantity.self, SalesCheck.self])
    }
}

Content:

import SwiftUI

struct ContentView: View {
    var body: some View {
        MainHub()
    }
}

#Preview {
    ContentView()
}

Hub

import SwiftUI

struct MainHub: View {
    var body: some View {
        NavigationStack {
            NavigationLink("Receipts") {
                ReceiptAddView()
            }
            NavigationLink("Food") {
                FoodAddView()
            }
            NavigationLink("Add Sales Check") {
                SalesCheckView()
            }
            NavigationLink("Sales Check") {
                SalesCheckListView()
            }
        }
    }
}

#Preview {
    MainHub()
}

SalesCheckView

import SwiftUI
import Vision



@Observable
class ItemPricingModel: Identifiable {
    var id: UUID
    var name: String
    var price: String
    
    init(name: String, price: String) {
        self.id = UUID()
        self.name = name
        self.price = price
    }
}

struct SalesCheckView: View {
    @Environment(\.dismiss) private var dismiss
    @Environment(\.modelContext) var context
    @State private var recognizedText = ""
    @State private var itemPricingModelList: [ItemPricingModel] = []

    var body: some View {
        VStack {
            Image(.test2)
                .resizable()
                .aspectRatio(contentMode: .fit)
            
            Button("recognize") {
                print("pressed")
                loadPhoto()
            }
            Spacer()
            ShoppedListView(itemPricingModelList: $itemPricingModelList)
            Button("Save Sales Check") {
                let salesCheck = SalesCheck(namePricingModel: itemPricingModelList, vendorName: "Rewe")
                print(salesCheck)
                context.insert(salesCheck)
                do {
                    try context.save()
                    dismiss()
                } catch {
                    print("Error saving context: \(error)")
                }
            }
        }
        .padding()
    }
    
    private func loadPhoto() {
        recognizeText()
    }
    
    private func recognizeText() {
        let image = UIImage(resource: .test2)
        guard let cgImage = image.cgImage else { return }
        
        let handler = VNImageRequestHandler(cgImage: cgImage)
        
        let request = VNRecognizeTextRequest { request, error in
            guard error == nil else {
                print(error?.localizedDescription ?? "")
                return
            }
            
            guard let results = request.results as? [VNRecognizedTextObservation] else { return }
            
            let recogArr = results.compactMap { result in
                result.topCandidates(1).first?.string
            }
            
            DispatchQueue.main.async {
                self.itemPricingModelList = ReweSalesCheckAnalyzer(scannedSalesCheck: recogArr).analyze()
            }
        }
        
        request.recognitionLevel = .accurate
        do {
            try handler.perform([request])
        } catch {
            print(error.localizedDescription)
        }
    }
}



#Preview {
    SalesCheckView()
}

SalesCheckModel

import Foundation
import SwiftData



@Model
class SalesCheck {
    @Attribute(.unique) var id: UUID
    var nameString: String
    var price: String
    var vendorName: String
    
    init(namePricingModel: [ItemPricingModel], vendorName: String) {
        var nameString: String = ""
        var priceString: String = ""
        
        self.id = UUID()
        
        // Note: the first element is index 0; comparing against 1 here would
        // prepend a stray ", " to the very first item.
        for (i, item) in namePricingModel.enumerated() {
            if i == 0 {
                nameString += item.name
                priceString += item.price
            } else {
                nameString += ", " + item.name
                priceString += ", " + item.price
            }
        }
        
        self.nameString = nameString
        self.price = priceString
        self.vendorName = vendorName
    }
}

SalesCheckListView

import SwiftUI
import SwiftData

struct SalesCheckListView: View {
    @Query var salesCheckListQuery: [SalesCheck]

    var body: some View {
        Button("test") {
            print(salesCheckListQuery)
        }

        ForEach(salesCheckListQuery) { salesCheck in
            Text("Vendor \(salesCheck.nameString)")
        }
    }
}
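One common cause of an empty `@Query` while saves appear to succeed (offered here as an assumption to check, not a confirmed diagnosis) is that the reading view is attached to a different model container than the writing view, which happens easily in previews. A minimal sketch that makes the preview's data path explicit:

```swift
import SwiftUI
import SwiftData

// Sketch only: give the preview its own in-memory container so
// SalesCheckListView's @Query reads from a store that actually exists.
#Preview {
    SalesCheckListView()
        .modelContainer(for: SalesCheck.self, inMemory: true)
}
```

When run inside the real app, the container injected in `TestAppApp` via `.modelContainer(for:)` should be shared by all views in the hierarchy, so it's worth confirming that SalesCheckListView is reached through that same WindowGroup.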

ios – WKWebView doesn't allow video to be played in the background


I've been working on a personal iOS project for fun: essentially a YouTube music player, and a way to learn how background media playback works in native iOS apps.

After seeing that Musi (a well-known music streaming app) can play YouTube audio in the background with the screen off, I got really curious. I've been trying to replicate that basic background audio functionality for YouTube embeds using WKWebView. I've spent a crazy amount of time (probably 20 hours) trying to figure this out but have had no success.

Here's what I've tried so far:

- Embedding a YouTube video in a WKWebView

- Activating AVAudioSession with .playback and setting .setActive(true)

- Adding the UIBackgroundModes key with audio in Info.plist

- Adding the NSAppTransportSecurity key to allow arbitrary loads

- Testing on a real device (iPhone 14, iOS 18.1 target)
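For reference, the audio-session step in the list above usually looks like the sketch below, alongside the WKWebViewConfiguration flags that let embedded media start without a user gesture. This is only the setup the question already describes, reconstructed as code; on its own it does not grant background playback for web content.

```swift
import AVFoundation
import WebKit

// The AVAudioSession activation described above (.playback category, setActive(true)).
func configureAudioSession() {
    do {
        try AVAudioSession.sharedInstance().setCategory(.playback)
        try AVAudioSession.sharedInstance().setActive(true)
    } catch {
        print("Audio session error: \(error)")
    }
}

// WKWebView configuration allowing inline playback without a tap-to-play requirement.
func makeWebViewConfiguration() -> WKWebViewConfiguration {
    let config = WKWebViewConfiguration()
    config.allowsInlineMediaPlayback = true
    config.mediaTypesRequiringUserActionForPlayback = []
    return config
}
```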

What happens:

Audio plays fine in the foreground.

If I exit the app and go to the lock screen quickly enough (less than 3 seconds) after pressing play, I can briefly resume playback from the lock screen, but it doesn't automatically continue like in Musi and similar apps.

Most of the time, the audio stops when the app is backgrounded.

I get this error consistently in the logs:

Error acquiring assertion:

It seems like the app lacks some specific entitlements related to WebKit media playback. I don't have an AppDelegate/SceneDelegate (I'm using SwiftUI), but I can add one if needed.

I'm super curious how music streaming apps that use YouTube as a source get around this. Are they doing something different under the hood? A custom player? A SafariViewController trick? Is there a specific way to configure WKWebView to keep playing in the background, or is this a known limitation?

Would really appreciate any insight from folks who've explored this before or know how apps like Musi pulled it off.

Thanks in advance!
