
Worms, Springs, and Soft Robots: Tiny Creatures Inspire Big Leaps



Researchers at Georgia Tech recently unveiled an impressive achievement: a 5-inch-long soft robot that can catapult itself 10 feet into the air – the height of a basketball hoop – without any legs. The design was inspired by the humble nematode, a tiny roundworm thinner than a human hair that can jump many times its body length.

By pinching its body into tight kinks, the worm stores elastic energy and then abruptly releases it, flinging itself skyward or backward like an acrobatic gymnast. The engineers mimicked this motion. Their “SoftJM” robot is essentially a flexible silicone rod with a stiff carbon-fiber backbone. Depending on how it bends, it can leap forward or backward – though it has no wheels or legs.

In action, the nematode-inspired robot coils up much like a person squatting, then explosively unbends to jump. High-speed camera footage shows how the worm curves its head up and kinks the middle of its body to hop backward, then straightens and kinks at the tail to leap forward.

The Georgia Tech team found that these tight bends – usually a problem in hoses or cables – actually let the worm and the robot store far more energy. As one researcher noted, kinked straws or hoses are useless, but a kinked worm acts like a loaded spring. In the lab, the soft robot reproduced this trick: it “pinches” its middle or tail, tenses up, and then releases in a burst (about one-tenth of a millisecond) to soar into the air.

Soft Robots on the Rise

Soft robotics is a young but rapidly growing field that often takes cues from nature. Unlike rigid metal machines, soft robots are made of flexible materials that can squeeze, stretch, and adapt to their surroundings. Early milestones in the field include Harvard’s Octobot – an autonomous robot made entirely of silicone and fluid channels, with no rigid parts, inspired by octopus musculature. Since then, engineers have built a menagerie of soft machines: from worm-like crawlers and jellified grippers to wearable “exo-suits” and rolling vine-like robots.

For example, Yale researchers created a turtle-inspired soft robot whose legs switch between floppy flippers and firm “land legs” depending on whether it is swimming or walking. At UCSB, scientists made a vine-like robot that grows toward light using only light-sensitive “skin” – it literally extends itself through narrow spaces like a plant stem. These and other bio-inspired innovations show how soft materials can create new modes of motion.

Overall, supporters say soft robots can go places traditional robots cannot. The U.S. National Science Foundation notes that adaptive soft machines “explore areas previously unreachable by traditional robots” – even inside the human body. Some soft robots have programmable “skins” that change stiffness or color to blend in or grip objects. Engineers are also exploring origami/kirigami techniques, shape-memory polymers, and other tricks so these robots can reconfigure on the fly.

Engineering Flexible Motion

Making a soft robot move like an animal comes with big challenges. Without hard joints or motors, designers must rely on material properties and clever geometry. For example, Georgia Tech’s jumper needed to include a carbon-fiber spine inside its rubbery body to make the spring action powerful enough. Integrating sensors and control systems is also tricky. As Penn State engineers point out, traditional electronics are stiff and would freeze a soft robot in place.

To make their tiny crawling rescue robot “smart,” they had to spread flexible circuits carefully across the body so it could still bend. Even finding power sources is harder: some soft robots use external magnetic fields or pressurized air because carrying a heavy battery would weigh them down.

The nematode-inspired soft robots from Georgia Tech (Photo: Candler Hobbs)

Another hurdle is exploiting the right physics. The nematode-robot team realized that kinks actually help. In a normal rubber tube, a kink quickly stops flow; but in a soft worm it gradually builds internal stress, allowing far more bending before release. By experimenting with simulations and even water-filled balloon models, the researchers confirmed that their flexible body could hold plenty of elastic energy when bent, then unleash it in a single fast hop. The result is remarkable: from rest the robot can jump 10 feet high, repeatedly, by simply flexing its spine. These breakthroughs – finding ways to store and release energy in rubbery materials – are typical of soft robotics engineering.

Real-World Hoppers and Helpers

What are all these soft robots good for? In principle, they can handle situations too dangerous or awkward for rigid machines. In disaster zones, for instance, soft bots can wriggle under rubble or into collapsed buildings to find survivors. Penn State demonstrated a prototype magnetically controlled soft crawler that could navigate tight debris and even move through blood-vessel-sized channels.

In medicine, microscopic soft robots could deliver drugs directly inside the body. In one MIT study, a thread-thin soft robot was envisioned to thread through arteries and clear clots, potentially treating strokes without open surgery. Harvard scientists are working on soft wearable exoskeletons too – a lightweight inflatable sleeve that helped ALS patients lift a shoulder, immediately improving their range of motion.

Space agencies are also eyeing soft leapers. Wheels can get stuck on sand or rocks, but a hopping robot could vault over craters and dunes. NASA is even imagining novel jumpers for the Moon and icy moons. In one concept, a soccer-ball-sized bot called SPARROW would use steam jets (from boiled ice) to hop many miles across Europa or Enceladus. In the low gravity of those moons, a small jump goes a very long way – scientists note that a robot’s one-meter leap on Earth could carry it 100 meters on Enceladus. The idea is that dozens of these hoppers could swarm across alien terrain “with full freedom to travel” where wheeled rovers would stall. Back on Earth, future soft jumpers could help in search-and-rescue missions by leaping over rivers, mud, or unstable ground that would stop conventional robots.
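The low-gravity arithmetic behind that claim can be sketched in a few lines. For a fixed launch energy, jump height scales inversely with surface gravity (h ∝ 1/g); the gravity figures below are approximate, illustrative values, not numbers from the article:

```swift
import Foundation

// For a fixed launch speed, jump height scales as h ∝ 1/g.
// Enceladus' surface gravity is roughly 0.113 m/s² (approximate value).
let gEarth = 9.81           // m/s²
let gEnceladus = 0.113      // m/s²
let earthJumpHeight = 1.0   // meters

let enceladusJumpHeight = earthJumpHeight * gEarth / gEnceladus
print(enceladusJumpHeight)  // ≈ 87 m, the same order as the quoted 100 m
```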

Soft robots are also finding work in industry and agriculture. The NSF points out they could become safe helpers on factory floors or on farms, because they comply if a human is in the way. Researchers have even built soft grippers that gently pick delicate fruit without bruising it. The flexibility of soft machines means they can operate in places too small or constrained for rigid devices.

In the end, experts believe soft robotics will fundamentally change many fields. From worms to wearable suits to lunar hoppers, this research thread shows how studying tiny creatures can yield big jumps in technology.

ios – Presenting custom alert on top of all the views including sheets or full screen cover


I have been looking for a working solution to this problem for a long time but cannot seem to get any of the solutions on SO working in my case. SwiftUI’s ‘.alert’ modifier does present itself on top of all the views, but the customization of the alert elements is so limited that I had to create a custom alert. My alert contains icons and styled buttons.

My app is complex, so I am sharing a short working sample. The app presents a MainView, and there will always be a sheet over it whose height can be adjusted (sometimes a full screen cover instead of a sheet). Alerts can be presented by MainView and by any of the sheet views. In any case, I want the alert to always be on top of the sheet view (or full screen cover).

Here I tried presenting an alert (consider Color.black.opacity.. as the alert in my code below) in two different ways.

  • One way is using a ZStack in ContentView or even in MainView. In both
    cases, the AlertView opens behind SheetView and doesn't cover
    SheetView.
  • The second way is using an AlertViewModifier in SheetView.
    With this, the alert gets presented within the SheetView (instead of
    over SheetView), such that I am still able to interact with the
    MainView and also able to adjust the height of SheetView.

Sample code:

import SwiftUI

class SampleViewModel: ObservableObject {
    @Published var showSheet = false
    @Published var panelDetent = PresentationDetent.medium
    @Published var showZStackAlert: Bool = false
}

struct ContentViewA: View {
    @StateObject var viewModel: SampleViewModel = .init()
    @State private var showSheet: Bool = false
    var body: some View {
        NavigationStack {
            ZStack {
                if viewModel.showZStackAlert {
                    AlertView()
                }
                
                MainView()
            }
        }
        .environmentObject(viewModel)
    }
}

/// Presenting the custom alert from the main ContentView
struct AlertView: View {
    var body: some View {
        VStack {
            Color.black.opacity(0.5).edgesIgnoringSafeArea(.all)
        }
    }
}

struct MainView: View {
    @EnvironmentObject var viewModel: SampleViewModel
    @State private var showSheet: Bool = false
    var body: some View {
        ZStack {
//            if viewModel.showZStackAlert {
//                AlertView()
//            }
//            
            VStack {
                Text("This is the main view with always one sheet displayed")
            }
        }
        .onAppear {
            self.showSheet = true
        }
        .sheet(isPresented: $showSheet) {
            SheetView().environmentObject(viewModel)
        }
        // More sheets
        //.sheet(isPresented: $showSheetA) {
            //SheetViewA().environmentObject(viewModel)
        //}
    }
}

struct SheetView: View {
    @EnvironmentObject var viewModel: SampleViewModel
    @Environment(\.presentationMode) private var presentationMode
    @State private var showAlert: Bool = false
    
    var body: some View {
        NavigationStack {
            ZStack {
                Color.mint.opacity(0.1).edgesIgnoringSafeArea(.all)

                VStack {
                    Button {
                        viewModel.showZStackAlert = true
                        // self.showAlert = true
                    } label: {
                        Text("Tap me to open alert")
                    }
                }
            }
        }
        .presentationDetents([.medium, .large], selection: $viewModel.panelDetent)
        .interactiveDismissDisabled(true)
        .presentationBackgroundInteraction(.enabled)
        .alertView(isPresented: $showAlert)
    }
}

/// Presenting the custom alert using a ViewModifier
struct AlertViewModifier: ViewModifier {
    @Binding var isPresented: Bool
    
    init(isPresented: Binding<Bool>) {
        self._isPresented = isPresented
    }

    func body(content: Content) -> some View {
        content
            .animation(nil, value: self.$isPresented.wrappedValue)
            .overlay(self.$isPresented.wrappedValue ? Color.black.opacity(0.5) : nil)
            .overlay(self.$isPresented.wrappedValue ? alertContent() : nil)
            .animation(.default, value: self.$isPresented.wrappedValue)
    }
    
    @ViewBuilder
    private func alertContent() -> some View {
        GeometryReader { geometry in
            if self.$isPresented.wrappedValue {
                VStack {
                    /// Contents of the alert view
                }
                .fixedSize(horizontal: false, vertical: true)
                .position(x: geometry.size.width/2, y: geometry.size.height/2)
                .frame(minWidth: 350.0, maxWidth: 350.0)
            }
        }
    }
}

extension View {
    func alertView(isPresented: Binding<Bool>) -> some View {
        return modifier(AlertViewModifier(isPresented: isPresented))
    }
}

How can I present AlertView always on top of the sheet view?
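For reference, one direction I have been considering (an untested sketch, not a verified fix) is hosting the alert in its own UIWindow above the key window, since UIKit windows sit above any SwiftUI sheet or full screen cover. The `AlertWindowPresenter` name and wiring below are illustrative, not part of my app:

```swift
import SwiftUI
import UIKit

/// Hosts arbitrary SwiftUI content in a separate window at `.alert` level,
/// so it covers sheets and full screen covers presented by the app's main window.
final class AlertWindowPresenter {
    private var window: UIWindow?

    func present<Content: View>(@ViewBuilder content: () -> Content) {
        // Find the active scene to attach the overlay window to.
        guard let scene = UIApplication.shared.connectedScenes
            .compactMap({ $0 as? UIWindowScene })
            .first(where: { $0.activationState == .foregroundActive }) else { return }

        let window = UIWindow(windowScene: scene)
        window.rootViewController = UIHostingController(rootView: content())
        window.rootViewController?.view.backgroundColor = .clear
        window.windowLevel = .alert   // above normal windows, sheets, and covers
        window.isHidden = false
        self.window = window
    }

    func dismiss() {
        window?.isHidden = true
        window = nil
    }
}
```

A shared presenter like this could be driven from `viewModel.showZStackAlert` instead of the ZStack branch, which sidesteps the layering problem entirely because the alert no longer lives in the same view hierarchy as the sheet.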

Epson adds GX-C Series with RC800A controller to its robot line




The new GX-C robot is designed for performance and flexibility. Source: Epson Robots

Epson Robots, which claims to be the world’s No. 1 SCARA robot manufacturer, today announced its new GX-C Series, which uses the RC800A controller and SafeSense technology. The Los Alamitos, Calif.-based company said the GX4C, GX8C, GX10C, and GX20C robots deliver high performance and flexibility in a compact footprint.

“With manufacturers striving for greater productivity and precision, the demand for advanced automation solutions is rapidly accelerating,” said Rick Brookshire, director of robotics at Epson America Inc. “Engineered to take on even the most complex assembly tasks, the new GX-C Series offers a compelling blend of high performance, intuitive operation, and proven reliability — helping manufacturers drive efficiency and stay competitive in a fast-changing industry.”

Epson Robots said it has sold more than 150,000 SCARA and six-axis robots based on a common PC platform. The unit of Japan-based Seiko Epson Corp. provides robots for precision assembly and material handling applications in several industries. These include aerospace, appliance, automotive, biotechnology, consumer products, electronics, food processing, medical devices, pharmaceuticals, plastics, semiconductors, and telecommunications.




GX-C Series is designed for ease of use

The GX-C Series offers multiple arm configurations for increased productivity and worker safety while minimizing machine footprint, according to Epson. It also features GYROPLUS vibration-reduction technology to help optimize workcell throughput with smooth motions and fast settling times.

Designed for ease of use, the series includes Epson RC+ 8.0, the latest version of the company’s software development platform. Additional GX-C Series features include:

  • High performance and flexibility: High throughput with max-rated payloads and multiple arm configurations to accomplish the most demanding assembly needs
  • Fast cycle times: High acceleration, smooth motion, and fast settling times can maximize throughput with the proprietary GYROPLUS system, with no ringing or overshoot when operated within specifications
  • Ease of use: Epson asserted that its “intuitive and feature-packed” Epson RC+ development software offers a simple user interface, an integrated debugger, and an advanced 3D simulator
  • Designed for reliability: The GX-C is designed to handle workloads at fast speeds and high precision within a small footprint
  • Low total cost of ownership (TCO): The new SCARA robots include features to reduce support and installation costs: batteryless encoders, a built-in Ethernet cable, Epson RC+ software, and a rich suite of safety features
  • Accelerated development of workcell applications: These are fully integrated; optional features include vision guidance, parts feeding, force guidance, conveyor tracking, and fieldbus; the Epson RC+ Solutions Platform allows for seamless expansion with third-party solutions, benefiting developers and end users
  • Increased user interaction without sacrificing productivity: SafeSense includes standard and advanced safety features; with a proper risk assessment, it can help increase productivity and worker safety while minimizing machine footprint due to the reduction of physical barriers
  • Minimized workcell space requirement with an optimized robot footprint: The new series has multiple mount and cable exit options; 250, 300, or 350 mm (9.8, 11.8, or 13.7 in.) reach available for the GX4C; 450, 550, or 650 mm (17.7, 21.6, or 25.5 in.) reach available for the GX8C; 650 or 850 mm (25.5 or 33.4 in.) reach available for the GX10C; 850 or 1,000 mm (33.4 or 39.3 in.) reach available for the GX20C
  • Built for demanding environments: Standard, Cleanroom (ISO 3) and electrostatic discharge (ESD), IP65-protected, and standalone ESD models are available
  • Service and support: With four decades of automation experience, Epson said it provides automation support through its dedicated application and service teams

The GX4C, GX8C, GX10C, and GX20C are now available through Epson Robots’ channel of distributor partners.


The GX-C robot is designed to work with the RC+ development software. Source: Epson Robots

RC800A controller comes with SafeSense technology

Epson Robots said its RC800A robot controller is built to perform with industrial-rated safety features while optimizing runtime, recovery, and error handling. It offers increased power and energy savings compared with previous models and provides safety-rated speed and position monitoring.

When paired with a proper risk assessment, this technology enables human-robot interaction without a safety fence, the company claimed. By eliminating the need for physical safety barriers, the RC800A can help reduce both the footprint and the overall cost of robotic automation systems.

In addition, the controller can increase productivity by enabling faster machine interaction for daily operations and maintenance. The controller also includes the Epson RC+ software to streamline program development and simplify automation tasks.

Epson to exhibit at Automate

Epson plans to demonstrate its systems at Booth 2818 at Automate 2025, May 12 to 15 in Detroit. Attendees can see the new GX-C Series, and partners will show the company’s high-performance SCARA, six-axis, and other automated systems.

“We’re showcasing how Epson Robots combine ease of use, powerful performance, and advanced features that support safe, dynamic interaction between people and automation,” said Scott Marsic, group product manager at Epson America.

“Through interactive demos and real-world applications, attendees will see how our solutions make high-performance automation more accessible,” he added. “Our recently launched SafeSense technology also demonstrates how users can operate safely and efficiently alongside industrial robots across a wide range of industries and tasks.”

Attendees will also have opportunities to attend educational presentations and consult with Epson’s expert application and service teams. They can also explore both open-base robots (CX12XL, LS20B, and N6) and enclosed systems (GX8 and VT6L with IF380) that highlight their versatility across industries. The workflows will include scanning of large parts, movement of heavy payloads, part transfers between slots in a standing cubicle, a stacking challenge, and a pick-and-place demo.

Sessions will include “Beyond the Labor: Understanding Robot ROI from Production to Revenue” on Wednesday, May 14, from 8:00 to 8:45 a.m. EDT in Room 320, and “Where Will the Newest Breakthrough in Robots Come From? Where Have We Been and Where Are We Headed?” on May 14 from 1:30 to 2:15 p.m. in Room 321.

Key partners at the show will include Lanco Integrated (Booth 1309), Addtronics (Booth 2210), Universal Machine and Engineering Corp. (Booth 5604), and igus Inc. (Booth 2613). They will show applications including inspection, pick-and-place operations, assembly, and motion plastics.

It’s Not About What AI Can Do for Us, But What We Can Do for AI



Most people view artificial intelligence (AI) through a one-way lens. The technology only exists to serve humans and achieve new levels of efficiency, accuracy, and productivity. But what if we’re missing half of the equation? And what if, by doing so, we’re only amplifying the technology’s flaws?

AI is in its infancy and still faces significant limitations in reasoning, data quality, and understanding concepts like trust, value, and incentives. The divide between current capabilities and true “intelligence” is substantial. The good news? We can change this by becoming active collaborators rather than passive consumers of AI.

Humans hold the key to intelligent evolution by providing better reasoning frameworks, feeding quality data, and bridging the trust gap. Consequently, man and machine can work side by side for a win-win – with better collaboration producing better data and better outcomes.

Let’s consider what a more symbiotic relationship could look like and how, as partners, meaningful collaboration can benefit both sides of the AI equation.

The necessary relationship between man and machine

AI is undoubtedly great at analyzing huge datasets and automating complex tasks. However, the technology remains fundamentally limited in thinking like us. First, these models and platforms struggle with reasoning beyond their training data. Pattern recognition and statistical prediction pose no problem, but the contextual judgment and logical frameworks we take for granted are more difficult to replicate. This reasoning gap means AI often falters when faced with nuanced scenarios or ethical judgment.

Second, there’s “garbage in, garbage out” data quality. Current models are trained on huge troves of information gathered with and without consent. Unverified or biased information is used regardless of proper attribution or authorization, resulting in unverified or biased AI. The “data diet” of models is therefore questionable at best and scattershot at worst. It’s helpful to think of this impact in dietary terms. If humans only eat junk food, we’re slow and sluggish. If agents only consume copyrighted and second-hand material, their performance is similarly hampered, with output that’s inaccurate, unreliable, and general rather than specific. That is still far off from the autonomous and proactive decision-making promised in the coming wave of agents.

Critically, AI is still blind to who and what it is interacting with. It cannot distinguish between aligned and misaligned users, struggles to verify relationships, and fails to understand concepts like trust, value exchange, and stakeholder incentives – core elements that govern human interactions.

AI problems with human solutions

We need to think of AI platforms, tools, and agents less as servants and more as assistants that we can help train. For starters, let’s look at reasoning. We can introduce new logical frameworks, ethical guidelines, and strategic thinking that AI systems cannot develop alone. Through thoughtful prompting and careful supervision, we can complement AI’s statistical strengths with human wisdom – teaching models to recognize patterns and understand the contexts that make those patterns meaningful.

Likewise, rather than allowing AI to train on whatever information it can scrape from the internet, humans can curate higher-quality datasets that are verified, diverse, and ethically sourced. This means developing better attribution systems where content creators are acknowledged and compensated for their contributions to training.

Emerging frameworks make this possible. By uniting online identities under one banner and deciding whether and what they are comfortable sharing, users can equip models with zero-party information that respects privacy, consent, and regulations. Better yet, by tracking this information on the blockchain, users and model makers can see where information comes from and adequately compensate creators for providing this “new oil.” That is how we acknowledge users for their data and bring them in on the data revolution.

Finally, bridging the trust gap means arming models with human values and attitudes. This means designing mechanisms that recognize stakeholders, verify relationships, and differentiate between aligned and misaligned users. Consequently, we help AI understand its operational context – who benefits from its actions, what contributes to its development, and how value flows through the systems it participates in.

For example, agents backed by blockchain infrastructure are quite good at this. They can recognize and prioritize users with demonstrated ecosystem buy-in through reputation, social influence, or token ownership. This allows AI to align incentives by giving more weight to stakeholders with skin in the game, creating governance systems where verified supporters participate in decision-making based on their level of engagement. Consequently, AI more deeply understands its ecosystem and can make decisions informed by genuine stakeholder relationships.

Don’t lose sight of the human element in AI

A lot has been said about the rise of this technology and how it threatens to overhaul industries and wipe out jobs. However, baking in guardrails can ensure that AI augments rather than overrides the human experience. For example, the most successful AI implementations don’t replace humans but extend what we can accomplish together. When AI handles routine analysis and humans provide creative direction and ethical oversight, both sides contribute their unique strengths.

When implemented right, AI promises to improve the quality and efficiency of countless human processes. But when implemented wrong, it is limited by questionable data sources and only mimics intelligence rather than displaying actual intelligence. It is up to us, the human side of the equation, to make these models smarter and ensure that our values, judgment, and ethics remain at their heart.

Trust is non-negotiable for this technology to go mainstream. When users can verify where their data goes, see how it is used, and participate in the value it creates, they become willing partners rather than reluctant subjects. Similarly, when AI systems can leverage aligned stakeholders and transparent data pipelines, they become more trustworthy. In turn, they are more likely to gain access to our most important private and professional spaces, creating a flywheel of better data access and improved outcomes.

So, heading into this next phase of AI, let’s focus on connecting man and machine with verifiable relationships, quality data sources, and precise systems. We should ask not what AI can do for us but what we can do for AI.

ios – AVFoundation: How to detect macro mode using AVCaptureDevice?


I would like to replicate the native Camera.app behaviour, specifically the macro mode.
I am using virtual devices like .builtInTripleCamera or .builtInDualWideCamera, and they are able to enter macro mode with this code:

if device.activePrimaryConstituentDeviceSwitchingBehavior != .unsupported {
    device.setPrimaryConstituentDeviceSwitchingBehavior(.auto, restrictedSwitchingBehaviorConditions: [])
}

That works great, but now I need to indicate in my UI that the camera has entered macro mode. And that is where things get difficult: macro mode is basically just an ultra-wide camera that is zoomed in (or cropped), right?
We can observe when a virtual device changes its active primary constituent device using KVO, like this:

let activeConstituentObserver = device.observe(\.activePrimaryConstituent, options: [.new]) { [weak self] device, change in
    guard let self = self else { return }
    guard let activePrimaryConstituentDevice = change.newValue ?? device.activePrimaryConstituent else { return }
    let isMacroEnabled = activePrimaryConstituentDevice.deviceType == .builtInUltraWideCamera
    DispatchQueue.main.async {
        self.cameraView.macroButton.isHidden = !isMacroEnabled
    }
}

But this observer gets called every time the active constituent device changes, i.e. every time the ultra-wide camera becomes active (when the user zooms out to 0.5x). AVCaptureDevice doesn't have an “isInMacroMode” property, so what do we do? Well, since macro mode is just a cropped ultra-wide camera, we probably need to look at the videoZoomFactor property, right? But both devices have their videoZoomFactors set to their default values (0.5x and 1x from the user’s perspective):

let activeConstituentObserver = device.observe(\.activePrimaryConstituent, options: [.new]) { [weak self] device, change in
    guard let self = self else { return }
    guard let activePrimaryConstituentDevice = change.newValue ?? device.activePrimaryConstituent else { return }
    let isMacroEnabled = activePrimaryConstituentDevice.deviceType == .builtInUltraWideCamera
    print("activePrimaryConstituentDevice zoom: ", activePrimaryConstituentDevice.videoZoomFactor) // 1.0 (aka 0.5x)
    print("virtualDevice zoom: ", device.videoZoomFactor) // 2.0 (aka 1x)
    DispatchQueue.main.async {
        self.cameraView.macroButton.isHidden = !isMacroEnabled
    }
}

So my question is: how do I detect that the virtual device entered macro mode (not just switched its camera to ultra-wide)?
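One heuristic I have been considering (an assumption based on how constituent switching works, not a documented macro indicator): treat it as macro when the ultra-wide is the active primary constituent *while* the virtual device’s zoom factor is at or above its first switch-over factor – at 0.5x the ultra-wide is active simply because the user zoomed out. A sketch:

```swift
import AVFoundation
import CoreGraphics

/// Heuristic macro detection for a virtual device (e.g. .builtInDualWideCamera).
/// Assumption: above the first switch-over zoom factor, the ultra-wide should
/// normally *not* be the primary constituent – if it is, the device has likely
/// fallen back to it for close focus, i.e. macro mode.
func isLikelyInMacroMode(virtualDevice device: AVCaptureDevice) -> Bool {
    guard let active = device.activePrimaryConstituent,
          active.deviceType == .builtInUltraWideCamera else {
        return false
    }
    // Below the first switch-over factor, the ultra-wide is active only
    // because the user chose 0.5x – that is not macro.
    guard let firstSwitchOver = device.virtualDeviceSwitchOverVideoZoomFactors.first else {
        return false
    }
    return device.videoZoomFactor >= CGFloat(firstSwitchOver.doubleValue)
}
```

To keep the UI in sync, this check would need to run from observers on both `activePrimaryConstituent` and `videoZoomFactor`, since either can change independently.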