
ios – How to force Background Uploads with Firebase Storage using Swift?


I’m uploading small files to Firebase Storage, roughly 276KB each. The app may choose to do a small handful of these uploads per day from the background. The desired behavior is that whenever the app decides to upload, the uploads can start in the background and finish while the app is still in the background, without the user needing to interact.

My current code is below. This logic works when the device is in good conditions. But if the device is in Low Power Mode, or running on cellular data, the system seems to block the upload.

    public func uploadAudioToFirebase(fileName: String, audioPath: URL, completion: @escaping (Result<String, Error>) -> Void) {
        let storage = Storage.storage()
        let storageRef = storage.reference()

        let audioName = fileName + ".m4a"
        let audioRef = storageRef.child("audios/" + audioName)

        let metadata = StorageMetadata()
        metadata.contentType = "audio/x-m4a"

        audioRef.putFile(from: audioPath, metadata: metadata) { metadata, error in
            if let error = error {
                completion(.failure(error))
                return
            }

            guard metadata != nil else {
                completion(.failure(NSError(domain: "MetadataError", code: -1, userInfo: [NSLocalizedDescriptionKey: "No metadata available"])))
                return
            }

            audioRef.downloadURL { url, error in
                guard let downloadURL = url else {
                    completion(.failure(error!))
                    return
                }

                completion(.success(downloadURL.absoluteString))
            }
        }
    }

First I tried creating even smaller files to upload. I tested with a file around 145KB in size, but this didn’t seem to change the behavior.

I thought about using the Firebase REST API with URLSession (NSURLSession) and a background configuration with discretionary set to false, to take control of when the system chooses to upload. But according to the docs, if the upload starts from the background the system will automatically set discretionary to true, giving control back to the system to choose when the upload happens.
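
For reference, a background URLSession upload against the Storage REST endpoint would look roughly like the sketch below. The bucket name, object path, and Authorization header are placeholders (check the Firebase Storage REST docs for the exact endpoint and auth scheme); the relevant part is the background configuration with `isDiscretionary = false`, which the system may still override as described above.

    import Foundation

    // Minimal sketch of a background upload via URLSession. "YOUR_BUCKET" and the
    // Authorization header are placeholders, not verified values.
    final class BackgroundUploader: NSObject, URLSessionTaskDelegate {
        private lazy var session: URLSession = {
            let config = URLSessionConfiguration.background(withIdentifier: "com.example.audio-upload")
            config.isDiscretionary = false          // ask the system not to defer the transfer
            config.sessionSendsLaunchEvents = true  // relaunch the app when the transfer finishes
            return URLSession(configuration: config, delegate: self, delegateQueue: nil)
        }()

        func upload(fileURL: URL, objectName: String, idToken: String) {
            var components = URLComponents(string: "https://firebasestorage.googleapis.com/v0/b/YOUR_BUCKET/o")!
            components.queryItems = [URLQueryItem(name: "name", value: objectName)]

            var request = URLRequest(url: components.url!)
            request.httpMethod = "POST"
            request.setValue("audio/x-m4a", forHTTPHeaderField: "Content-Type")
            request.setValue("Firebase \(idToken)", forHTTPHeaderField: "Authorization")

            // Background sessions only support uploads from a file on disk.
            session.uploadTask(with: request, fromFile: fileURL).resume()
        }

        func urlSession(_ session: URLSession, task: URLSessionTask, didCompleteWithError error: Error?) {
            // Handle success or failure; the app may be relaunched into the background
            // to receive this callback after the transfer completes.
        }
    }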

I’ve also tried wrapping my Storage upload logic in a beginBackgroundTask. But as far as I can tell this behaves the same as if it were not there.

    public func uploadAudioToFirebase(fileName: String, audioPath: URL, completion: @escaping (Result<String, Error>) -> Void) {
        var taskId: UIBackgroundTaskIdentifier = .invalid
        taskId = UIApplication.shared.beginBackgroundTask(withName: "AudioUpload") {
            UIApplication.shared.endBackgroundTask(taskId)
            taskId = .invalid
        }

        let storage = Storage.storage()
        let storageRef = storage.reference()
        let audioName = fileName + ".m4a"
        let audioRef = storageRef.child("audios/\(audioName)")

        let metadata = StorageMetadata()
        metadata.contentType = "audio/x-m4a"

        audioRef.putFile(from: audioPath, metadata: metadata) { metadata, error in
            if let error = error {
                completion(.failure(error))
                if taskId != .invalid {
                    UIApplication.shared.endBackgroundTask(taskId)
                    taskId = .invalid
                }
                return
            }

            audioRef.downloadURL { url, error in
                if let error = error {
                    completion(.failure(error))
                } else if let downloadURL = url {
                    completion(.success(downloadURL.absoluteString))
                }

                if taskId != .invalid {
                    UIApplication.shared.endBackgroundTask(taskId)
                    taskId = .invalid
                }
            }
        }
    }

My next idea was to use BGTaskScheduler with BGAppRefreshTask. But I don’t know whether this would actually change the current behavior. As far as I can tell, BGAppRefreshTask lets the system choose when to periodically do small tasks, which I think would be the same times it’s already choosing, i.e. when it has WiFi and good battery life.
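
For reference, scheduling such a refresh task would look roughly like the sketch below. The identifier is a placeholder and would also have to be listed under `BGTaskSchedulerPermittedIdentifiers` in Info.plist; even then, the system still decides when, or whether, the task actually runs.

    import BackgroundTasks

    enum UploadRefresh {
        // Placeholder identifier; it must also appear in Info.plist under
        // BGTaskSchedulerPermittedIdentifiers.
        static let taskID = "com.example.audio-refresh"

        // Call once at app launch.
        static func registerHandler() {
            BGTaskScheduler.shared.register(forTaskWithIdentifier: taskID, using: nil) { task in
                guard let refreshTask = task as? BGAppRefreshTask else { return }
                refreshTask.expirationHandler = {
                    // Cancel any in-flight work if the system runs out of time.
                }
                // Kick off the pending uploads here, then report completion.
                refreshTask.setTaskCompleted(success: true)
            }
        }

        // Call whenever uploads are queued, to request a future refresh window.
        static func schedule() {
            let request = BGAppRefreshTaskRequest(identifier: taskID)
            request.earliestBeginDate = Date(timeIntervalSinceNow: 15 * 60) // no earlier than 15 minutes
            do {
                try BGTaskScheduler.shared.submit(request)
            } catch {
                print("Could not schedule app refresh: \(error)")
            }
        }
    }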

It makes things harder that I don’t know how the Firebase SDK’s putFile works under the hood. It seems to do some kind of background management already; it may even be using BGTaskScheduler itself.

I’m at a roadblock now and not sure how to proceed. I’ve read that you might be able to use Silent Push Notifications for something like this? They have limitations though, and I’m not sure how well they work with a Firebase upload.
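
For completeness, a silent push (a payload whose `aps` dictionary contains `"content-available": 1`, with the `remote-notification` background mode enabled) wakes the app through the delegate callback sketched below and grants it roughly 30 seconds of runtime. This assumes a UIKit AppDelegate; `uploadPendingAudio` is a hypothetical helper standing in for the Firebase upload, and whether it reliably finishes in that window is exactly the open question.

    import UIKit

    final class AppDelegate: UIResponder, UIApplicationDelegate {
        // Called when a silent push ("content-available": 1) wakes the app in the background.
        func application(_ application: UIApplication,
                         didReceiveRemoteNotification userInfo: [AnyHashable: Any],
                         fetchCompletionHandler completionHandler: @escaping (UIBackgroundFetchResult) -> Void) {
            uploadPendingAudio { success in
                completionHandler(success ? .newData : .failed)
            }
        }

        // Hypothetical helper standing in for the Firebase Storage upload.
        private func uploadPendingAudio(completion: @escaping (Bool) -> Void) {
            // Start the upload here and call completion before the system's time budget expires.
            completion(true)
        }
    }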

Is there a more straightforward way to force a small upload that I’m missing?

How to prevent chronic inflammation from zombie-like cells that accumulate with age – NanoApps Medical – Official website


In humans and other multicellular organisms, cells multiply. This defining feature allows embryos to develop into adulthood, and enables the healing of the many bumps, bruises and scrapes along the way.

Certain factors can cause cells to abandon this characteristic and enter a zombie-like state called senescence, in which they persist but no longer divide to make new cells. Our bodies can remove these senescent cells, which tend to pile up as we age. The older we get, however, the less efficient our immune systems become at doing so.

“In addition to not growing and proliferating, the other hallmark of senescent cells is that they have this inflammatory program causing them to secrete inflammatory molecules,” said Peter Adams, Ph.D., director and professor of the Cancer Genome and Epigenetics Program at Sanford Burnham Prebys and senior and co-corresponding author of the study.

Cells “running” this inflammatory program are considered to exhibit the senescence-associated secretory phenotype (SASP). Too many cells with SASP secreting inflammatory molecules can contribute to chronic inflammation in the body. This pervasive inflammation, known as “inflammaging,” has been linked to many age-related diseases.

Scientists at Sanford Burnham Prebys and collaborators across the country published findings in Nature Communications showing that the mitochondria powering our cells also control the ability of a DNA repair protein to suppress SASP, which may reduce or delay inflammaging.

The research team made human cells senescent by exposing them to radiation and then used these cells to show that the DNA-repairing tumor protein p53 suppressed SASP and one of its triggering events, the formation of cytoplasmic chromatin fragments (CCF).

These fragments are bits of damaged DNA that have been spewed from the cells’ nuclei into the gel-like cytoplasm that occupies the space between the cell’s outer membrane and central nucleus. The presence of DNA where it doesn’t belong can trigger the immune system and contribute to SASP.

The scientists validated their findings in mice by treating them with a drug developed by cancer researchers to activate p53 as a means of suppressing tumors. In aged mice, the drug did not reduce the number of senescent cells but instead reversed the cellular signature that marks age-associated SASP, potentially stopping the inflammatory pollution that can lead to inflammaging.

In addition, the investigators discovered that senescent cells suffer from dysfunction in the mitochondria, which serve as the cells’ main source of energy. Stressed mitochondria can cause senescent cells to form CCF and dampen the expression of the gene carrying the blueprint for p53.

“Altogether, we’ve identified a cellular circuit capable of promoting DNA repair and genome integrity while suppressing the harmful inflammatory feature of senescent cells that contributes to inflammaging,” said Karl Miller, Ph.D., staff scientist in the Adams lab at Sanford Burnham Prebys and lead and co-corresponding author of the study.

“We have also shown that this pathway can be modified by existing drugs in cultured cells and mice, so it may be possible to one day design a therapy that targets p53 to promote healthier aging.”

More information: Karl N. Miller et al, p53 enhances DNA repair and suppresses cytoplasmic chromatin fragments and inflammation in senescent cells, Nature Communications (2025). DOI: 10.1038/s41467-025-57229-3

Vingroup’s Bold Green Vision Finds a New Frontier in Indonesia





Just a week after VinFast launched the VF3 in the Indonesian market, and just over a year after the brand’s introduction, VinFast CEO Pham Nhat Vuong, who is also Chairman of Vingroup, met with Indonesian President Prabowo Subianto to discuss a shared commitment to sustainable development.

The March 10 meeting, held during Vietnam’s General Secretary To Lam’s state visit to Indonesia, marked a significant step in Vingroup’s growing presence in Southeast Asia’s largest economy.

The discussions centered on VinFast’s ambitious plans to establish a robust electric vehicle (EV) ecosystem in Indonesia. With a proposed investment of 4 trillion Rupiah, the company aims to build an EV assembly plant in Subang, West Java, capable of producing 50,000 vehicles annually. Complementing this effort is a plan to deploy up to 100,000 EV charging stations across the archipelago, a move that aligns with Indonesia’s aspirations to become a regional hub for EV manufacturing and infrastructure.

However, VinFast’s initiatives are just one facet of Vingroup’s broader strategy in Indonesia.

“VinFast also expressed interest in renewable energy investments, including wind and solar power, although these discussions are still in the early stages. For now, their main focus remains on the electric vehicle industry, which was also the primary topic during their discussion with the President,” Indonesia’s Investment Minister, Rosan Roeslani, told MetroTV, an Indonesian TV station.

Vingroup is studying the possibility of setting up power plants in West Nusa Tenggara and wind farms in Sulawesi. These ventures aim to harness Indonesia’s abundant natural resources to drive a green transformation.

VinFast aims to establish 100,000 electric vehicle charging stations throughout Indonesia, according to Jakarta’s investment minister. During the same state visit, Roeslani confirmed to Agence France-Presse (AFP) that VinFast will roll out the charging station network incrementally, though specific timelines were not disclosed. This initiative underscores the EV maker’s strategic expansion into Southeast Asia’s largest economy. Given Indonesia’s vast nickel reserves, the country is actively pursuing its goal of becoming a pivotal regional EV hub and a key contributor to the global EV supply chain.

The meeting between Subianto and Vuong also highlighted Indonesia’s strategic importance in Vingroup’s global expansion. As Southeast Asia’s largest economy and a key player in the global EV supply chain, Indonesia offers fertile ground for Vingroup’s ventures. The country’s vast nickel reserves, essential for EV battery production, further enhance its appeal as a strategic partner.


Problem building Flutter as an ios-framework on Xcode 16


I have a Flutter project with a few Views which are embedded into a native app. Before updating the OS and Xcode I built on Xcode 15 and Big Sur and everything worked fine. After updating to macOS Sequoia and Xcode 16.2 the fun began. I once managed to create a working configuration in Runner.xcproj for a pod, but after updating the repository and reloading the libraries I lost everything.

Flutter doctor:

[✓] Flutter (Channel stable, 3.29.2, on macOS 15.3.2 24D81 darwin-arm64, locale pl-PL) [441ms]
    • Flutter version 3.29.2 on channel stable at /Users/User/Documents/DEVEL/flutter-sdk
    • Upstream repository https://github.com/flutter/flutter.git
    • Framework revision c236373904 (3 days ago), 2025-03-13 16:17:06 -0400
    • Engine revision 18b71d647a
    • Dart version 3.7.2
    • DevTools version 2.42.3

[✓] Android toolchain - develop for Android devices (Android SDK version 33.0.1) [1 303ms]
    • Android SDK at /Users/User/Library/Android/sdk
    • Platform android-33, build-tools 33.0.1
    • Java binary at: /Applications/Android Studio.app/Contents/jbr/Contents/Home/bin/java
      This is the JDK bundled with the latest Android Studio installation on this machine.
      To manually set the JDK path, use: `flutter config --jdk-dir="path/to/jdk"`.
    • Java version OpenJDK Runtime Environment (build 21.0.4+-12422083-b607.1)
    • All Android licenses accepted.

[✓] Xcode - develop for iOS and macOS (Xcode 16.2) [796ms]
    • Xcode at /Applications/Xcode.app/Contents/Developer
    • Build 16C5032a
    • CocoaPods version 1.16.2

[✓] Android Studio (version 2024.2) [11ms]
    • Android Studio at /Applications/Android Studio.app/Contents
    • Flutter plugin can be installed from:
      🔨 https://plugins.jetbrains.com/plugin/9212-flutter
    • Dart plugin can be installed from:
      🔨 https://plugins.jetbrains.com/plugin/6351-dart
    • Java version OpenJDK Runtime Environment (build 21.0.4+-12422083-b607.1)

Flutter libs with problems:

- sentry_flutter: 8.14.0
- file_selector: 1.0.3

When I call flutter build ios-framework --cocoapods I receive errors, usually from sentry and file_selector. But if I build the project in Xcode (Runner.xcproj) on my iPhone, it works fine (builds successfully).
I tried different Flutter versions, different Sentry versions, and removing Sentry entirely (file_selector produced a similar error). ChatGPT and Gemini didn’t help.

[  +27 ms] Unable to build plugin frameworks: ** BUILD FAILED **
           
           
           The following build commands failed:
                CompileC /Users/User/Documents/DEVEL/flutter/mobile/build/ios/framework/Debug/iphoneos/Pods.build/Debug-iphoneos/Sentry.build/Objects-normal/arm64/SentryThreadHandle.o /Users/User/Documents/DEVEL/flutter/mobile/.ios/Pods/Sentry/Sources/Sentry/SentryThreadHandle.cpp normal arm64 c++ com.apple.compilers.llvm.clang.1_0.compiler (in target 'Sentry' from project 'Pods')
           (1 failure)
           
[   +1 ms] 
           #0      throwToolExit (package:flutter_tools/src/base/common.dart:34:3)
           #1      BuildIOSFrameworkCommand._producePlugins (package:flutter_tools/src/commands/build_ios_framework.dart:550:9)
           
           #2      BuildIOSFrameworkCommand.runCommand (package:flutter_tools/src/commands/build_ios_framework.dart:292:9)
           
           #3      FlutterCommand.run. (package:flutter_tools/src/runner/flutter_command.dart:1558:27)
           
           #4      AppContext.run. (package:flutter_tools/src/base/context.dart:154:19)
           
           #5      CommandRunner.runCommand (package:args/command_runner.dart:212:13)
           
           #6      FlutterCommandRunner.runCommand. (package:flutter_tools/src/runner/flutter_command_runner.dart:496:9)
           
           #7      AppContext.run. (package:flutter_tools/src/base/context.dart:154:19)
           
           #8      FlutterCommandRunner.runCommand (package:flutter_tools/src/runner/flutter_command_runner.dart:431:5)
           
           #9      run.. (package:flutter_tools/runner.dart:98:11)
           
           #10     AppContext.run. (package:flutter_tools/src/base/context.dart:154:19)
           
           #11     main (package:flutter_tools/executable.dart:99:3)
           

or

url_launcher_ios.build/DerivedSources/url_launcher_ios_vers.c normal arm64 c
                com.apple.compilers.llvm.clang.1_0.compiler (in target 'url_launcher_ios' from project 'Pods')

Meta AI’s MILS: A Game-Changer for Zero-Shot Multimodal AI



For years, Artificial Intelligence (AI) has made impressive advancements, but it has always had a fundamental limitation in its inability to process different types of data the way humans do. Most AI models are unimodal, meaning they focus on just one format such as text, images, video, or audio. While sufficient for specific tasks, this approach makes AI rigid, preventing it from connecting the dots across multiple data types and truly understanding context.

To solve this, multimodal AI was introduced, allowing models to work with multiple forms of input. However, building these systems is not easy. They require massive, labelled datasets, which are not only hard to find but also expensive and time-consuming to create. In addition, these models usually need task-specific fine-tuning, making them resource-intensive and difficult to scale to new domains.

Meta AI’s Multimodal Iterative LLM Solver (MILS) is a development that changes this. Unlike traditional models that require retraining for every new task, MILS uses zero-shot learning to interpret and process unseen data formats without prior exposure. Instead of relying on pre-existing labels, it refines its outputs in real time using an iterative scoring system, continuously improving its accuracy without the need for additional training.

The Problem with Traditional Multimodal AI

Multimodal AI, which processes and integrates data from various sources to create a unified model, has immense potential for transforming how AI interacts with the world. Unlike traditional AI, which relies on a single type of data input, multimodal AI can understand and process multiple data types, such as converting images into text, generating captions for videos, or synthesizing speech from text.

However, traditional multimodal AI systems face significant challenges, including complexity, high data requirements, and difficulties in data alignment. These models are typically more complex than unimodal models, requiring substantial computational resources and longer training times. The sheer variety of data involved poses serious challenges for data quality, storage, and redundancy, making such data volumes expensive to store and costly to process.

To operate effectively, multimodal AI requires large amounts of high-quality data from multiple modalities, and inconsistent data quality across modalities can affect the performance of these systems. Moreover, properly aligning meaningful data from various data types, data that represent the same time and space, is complex. Integrating data from different modalities is complicated, as each modality has its own structure, format, and processing requirements, making effective combinations difficult. Additionally, high-quality labelled datasets that include multiple modalities are often scarce, and collecting and annotating multimodal data is time-consuming and expensive.

Recognizing these limitations, Meta AI’s MILS leverages zero-shot learning, enabling AI to perform tasks it was never explicitly trained on and to generalize knowledge across different contexts. With zero-shot learning, MILS adapts and generates accurate outputs without requiring additional labelled data, and takes the concept further by iterating over multiple AI-generated outputs and improving accuracy through an intelligent scoring system.

Why Zero-Shot Learning is a Game-Changer

One of the most significant advancements in AI is zero-shot learning, which allows AI models to perform tasks or recognize objects without prior task-specific training. Traditional machine learning relies on large, labelled datasets for every new task, meaning models must be explicitly trained on each category they need to recognize. This approach works well when plenty of training data is available, but it becomes a challenge in situations where labelled data is scarce, expensive, or impossible to obtain.

Zero-shot learning changes this by enabling AI to apply existing knowledge to new situations, much like how humans infer meaning from past experiences. Instead of relying solely on labelled examples, zero-shot models use auxiliary information, such as semantic attributes or contextual relationships, to generalize across tasks. This ability enhances scalability, reduces data dependency, and improves adaptability, making AI far more versatile in real-world applications.

For example, if a traditional AI model trained only on text is suddenly asked to describe an image, it would struggle without explicit training on visual data. In contrast, a zero-shot model like MILS can process and interpret the image without needing additional labelled examples. MILS further improves on this concept by iterating over multiple AI-generated outputs and refining its responses using an intelligent scoring system.

This approach is particularly valuable in fields where annotated data is limited or expensive to obtain, such as medical imaging, rare-language translation, and emerging scientific research. The ability of zero-shot models to quickly adapt to new tasks without retraining makes them powerful tools for a wide range of applications, from image recognition to natural language processing.

How Meta AI’s MILS Enhances Multimodal Understanding

Meta AI’s MILS introduces a smarter way for AI to interpret and refine multimodal data without requiring extensive retraining. It achieves this through an iterative two-step process powered by two key components:

  • The Generator: A Large Language Model (LLM), such as LLaMA-3.1-8B, that creates multiple candidate interpretations of the input.
  • The Scorer: A pre-trained multimodal model, such as CLIP, that evaluates these interpretations, ranking them by accuracy and relevance.

This process repeats in a feedback loop, continuously refining outputs until the most precise and contextually accurate response is achieved, all without modifying the model’s core parameters.
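
As a rough illustration of this control flow (a conceptual sketch, not Meta’s implementation: the Generator and Scorer interfaces, the feedback format, and the fixed round count below are assumptions), the loop can be pictured like this:

    // Conceptual sketch of an iterative generate-and-score loop in the spirit of MILS.
    // The real system puts an LLM (e.g. LLaMA-3.1-8B) behind the generator and a
    // multimodal model (e.g. CLIP) behind the scorer; these protocols are assumptions.
    protocol Generator {
        // Propose candidate interpretations, conditioned on the previous round's scores.
        func generate(prompt: String, feedback: [(candidate: String, score: Double)]) -> [String]
    }

    protocol Scorer {
        // Rate how well a candidate matches the image/video/audio input.
        func score(candidate: String) -> Double
    }

    func refine(prompt: String, generator: Generator, scorer: Scorer, rounds: Int = 10) -> String {
        var feedback: [(candidate: String, score: Double)] = []
        var best: (candidate: String, score: Double) = ("", -.infinity)

        for _ in 0..<rounds {
            let candidates = generator.generate(prompt: prompt, feedback: feedback)
            feedback = candidates.map { (candidate: $0, score: scorer.score(candidate: $0)) }
            if let top = feedback.max(by: { $0.score < $1.score }), top.score > best.score {
                best = top   // keep the highest-scoring candidate seen so far
            }
        }
        return best.candidate
    }

The important point is that only the candidates change between rounds; the underlying models’ weights are never updated.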

What makes MILS unique is its real-time optimization. Traditional AI models rely on fixed pre-trained weights and require heavy retraining for new tasks. In contrast, MILS adapts dynamically at test time, refining its responses based on immediate feedback from the Scorer. This makes it more efficient, flexible, and less dependent on large labelled datasets.

MILS can handle a variety of multimodal tasks, such as:

  • Image Captioning: Iteratively refining captions with LLaMA-3.1-8B and CLIP.
  • Video Analysis: Using ViCLIP to generate coherent descriptions of visual content.
  • Audio Processing: Leveraging ImageBind to describe sounds in natural language.
  • Text-to-Image Generation: Enhancing prompts before they are fed into diffusion models for better image quality.
  • Style Transfer: Generating optimized editing prompts to ensure visually consistent transformations.

By using pre-trained models as scoring mechanisms rather than requiring dedicated multimodal training, MILS delivers strong zero-shot performance across different tasks. This makes it a transformative approach for developers and researchers, enabling the integration of multimodal reasoning into applications without the burden of extensive retraining.

How MILS Outperforms Traditional AI

MILS significantly outperforms traditional AI models in several key areas, notably in training efficiency and cost reduction. Conventional AI systems often require separate training for each type of data, which demands extensive labelled datasets and incurs high computational costs. This separation creates a barrier to accessibility for many businesses, as the resources required for training can be prohibitive.

In contrast, MILS uses pre-trained models and refines outputs dynamically, significantly lowering these computational costs. This approach allows organizations to implement advanced AI capabilities without the financial burden typically associated with extensive model training.

Furthermore, MILS demonstrates high accuracy and performance compared with existing AI models on various benchmarks for video captioning. Its iterative refinement process allows it to produce more accurate and contextually relevant results than one-shot AI models, which often struggle to generate precise descriptions from new data types. By continuously improving its outputs through feedback loops between the Generator and Scorer components, MILS ensures that the final results are not only high quality but also adaptable to the specific nuances of each task.

Scalability and adaptability are further strengths of MILS that set it apart from traditional AI systems. Because it does not require retraining for new tasks or data types, MILS can be integrated into various AI-driven systems across different industries. This inherent flexibility makes it highly scalable and future-proof, allowing organizations to leverage its capabilities as their needs evolve. As businesses increasingly seek to benefit from AI without the constraints of traditional models, MILS has emerged as a transformative solution that improves efficiency while delivering strong performance across a wide range of applications.

The Bottom Line

Meta AI’s MILS is changing the way AI handles different types of data. Instead of relying on massive labelled datasets or constant retraining, it learns and improves as it works. This makes AI more flexible and useful across different fields, whether it is analyzing images, processing audio, or generating text.

By refining its responses in real time, MILS brings AI closer to how humans process information, learning from feedback and making better decisions with each step. This approach is not just about making AI smarter; it is about making it practical and adaptable to real-world challenges.