
Carnegie Robotics acquires Duro positioning product line from Swift Navigation


Carnegie Robotics plans to provide the same level of support for the Duro GNSS ruggedized receiver as Swift Navigation did. Source: Carnegie Robotics

Carnegie Robotics, which focuses on engineering and commercializing autonomous applications, today announced that it has acquired the Duro product line from Swift Navigation Inc. Initially launched in 2017 as a collaboration between Swift and Carnegie Robotics, the Duro line includes high-precision GNSS receivers and software for accurate positioning in challenging environments.

“We’re excited to fully integrate the Duro product line into our product portfolio,” said Mike Embrescia, chief development officer of Carnegie Robotics. “We look forward to further enhancing the product to deliver even greater value, and to continuing to support our key markets in marine, construction, mining, agriculture, rail, and military.”

Founded in 2010, Carnegie Robotics has worked with the U.S. Army and DARPA, co-created Uber’s Advanced Technologies Group (later sold to Aurora Innovation), and developed an autonomous floor-cleaning robot with Nilfisk. The Pittsburgh-based company has also developed a number of stereo cameras, the QAC-2 rugged computer for mobile indoor robots, the Cardshark military-grade wearable device, and the Stallion autonomous off-road vehicle.




Carnegie Robotics supports off-road autonomy

“Since 2017, we’ve focused on off-road autonomy,” explained Embrescia. “First, we help autonomize a thing. We also design and build products that power autonomy: cameras, pose filters, and the localization hardware and software stack.”

“We design, test, produce, and calibrate, all on-site,” he told The Robot Report. “We’re an ISO 9001-certified facility, ensuring quality.”

Carnegie Robotics said its systems have surpassed 5 million hours of autonomous robot use. The company said its acquisition of Duro will expand its intellectual property and advance the safety and efficiency of autonomous and semi-autonomous machinery.

Precise positioning using global navigation satellite system (GNSS) technology “is vital for ensuring the accurate operation of heavy equipment, particularly in challenging environments,” said Carnegie Robotics. It added that the integration of Duro’s receivers into its existing product line will enable centimeter-level accuracy.

Carnegie Robotics said it will continue the quality of products, service, and support that customers expect from the Duro product line.

Swift Navigation shifts to software program as a service

Carnegie Robotics and Swift Navigation said they have worked together to respond to customers’ needs in developing the Duro product line. They also said the acquisition will support Swift’s shift to a software-only service model with hardware integrations.

Duro integrates with Swift’s Skylark Precise Positioning Service, a cloud-based service that the company claims improves GNSS accuracy by up to 100x, supporting SAE Level 3 autonomous vehicles.
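For a sense of scale, here is a rough, illustrative calculation (the 3 m baseline is an assumed typical standalone GNSS error, not a figure from either company):

```python
# Illustrative arithmetic only: standalone GNSS fixes are typically
# accurate to a few meters; a 100x improvement from a corrections
# service would bring that down to the centimeter level.
baseline_error_m = 3.0      # assumed typical standalone GNSS error
improvement_factor = 100    # "up to 100x", per Swift's claim
corrected_error_m = baseline_error_m / improvement_factor
print(f"{corrected_error_m * 100:.0f} cm")  # prints "3 cm"
```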

“Swift is committed to advancing location-based products that improve safety and efficiency across industries,” said Brad Sherrard, executive vice president and general manager, industrial, at Swift Navigation.

“Carnegie Robotics’ acquisition of the Duro line underscores the quality of the products and the success of our collaboration,” he added. “We look forward to continuing our strategic partnership with Carnegie Robotics.”

Founded in 2012, Swift Navigation said its positioning systems enable precise mapping, tracking, and navigation for autonomous vehicles, industrial automation, outdoor robots, drones, logistics applications, and mobile innovations. Last month, the San Francisco-based company said its technology has supported more than 10 million advanced driver-assistance systems (ADAS) worldwide.


Swift has tested its Skylark positioning system for robotic mowers with the RLM Rover platform. Source: Swift Navigation

Environmental case for vertical farming stacks up, claims study

Growing lettuce on stacked shelves in high-tech greenhouses could be as good for the environment as growing it in fields, and could save 8,000 hectares of land in the UK, according to a new study from the University of Surrey and the University of Aberdeen.

Researchers studied a vertical lettuce farm in the UK. They found it produced the equivalent of 740 g of carbon dioxide (CO2) per kilo of lettuce. This was comparable to growing in a field, but used far less land.

Dr Zoe M Harris, co-author of the study and a Senior Lecturer at Surrey’s Centre for Environment and Sustainability, said:

“Vertical farms can help reduce the climate impact of farming, especially if their electricity comes from renewable sources.

“Vertical farming uses about 28 times less land than traditional farming methods. If all lettuce fields were replaced with vertical farms, we could save 8,000 hectares of land in the UK.

“That would free up land to grow other crops. Vertical farms can also be built in cities, significantly reducing the impact of transporting the crop to the people who eat it.

“Our study is an important first step towards demonstrating that the impacts of vertical farming are greener than first thought, despite only having a limited data range available.”

In vertical farms, shelves of crops like lettuce or herbs are stacked on top of one another in a controlled environment. Crops can grow without soil, drip-fed with nutrient-rich water or even with mist sprayed onto their exposed roots.

In the lettuce farm studied, electricity use made up almost 40% of its total climate change impact. As such, the climate impact of vertical farming depends heavily on how that electricity is generated.
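The reported figures can be combined with some simple, illustrative arithmetic (the 740 g/kg footprint and the 40% and 18% shares come from the study as reported here; the breakdown below is just those numbers multiplied out):

```python
# Back-of-envelope breakdown of the reported lettuce footprint.
co2_per_kg_g = 740          # reported footprint, g CO2e per kg of lettuce
electricity_share = 0.40    # electricity: ~40% of total climate impact
jute_plug_share = 0.18      # jute growing plugs: ~18% of climate impact

electricity_g = co2_per_kg_g * electricity_share
jute_g = co2_per_kg_g * jute_plug_share
print(f"electricity: ~{electricity_g:.0f} g/kg, jute plugs: ~{jute_g:.0f} g/kg")
# prints "electricity: ~296 g/kg, jute plugs: ~133 g/kg"
```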

The researchers also studied other environmental impacts like land use, water use, and water pollution.

Michael Gargaro, a researcher at Surrey’s Centre for Environment and Sustainability, said:

“One of the biggest environmental impacts came from the jute plugs the lettuce seeds are grown in. They made up 18% of the climate change impact, as well as the lion’s share of the water pollution and land use too.

“Using another material could make a vertical farm even more sustainable. Future research should consider alternatives like coconut fibre, hemp or perlite.

“We hope this study inspires further research into the sustainability of the food sector.”

A link to the study can be found here.

Siri Is Cooking for WWDC 2024


For years, Siri felt more like a halfhearted attempt at a digital assistant than a truly helpful AI companion. Plagued by struggles to understand context and integrate with third-party apps, Apple’s iconic assistant seemed likely to be left behind as rivals like Alexa and Google Assistant advanced at a rapid pace.

That all changes with iOS 18, iPadOS 18, and macOS Sequoia. Apple has given Siri a massive shot of intelligence with the introduction of two key components: the App Intents framework and Apple Intelligence. This powerful combination transforms Siri from a parlor trick into a deeply integrated, context-aware assistant capable of tapping into the data models and functionality of your favorite apps.

At the heart of this reinvention is the App Intents framework, an API that allows developers to define “assistant schemas”: models that describe specific app actions and data types. By building with these schemas, apps can express their capabilities in a language that Apple’s latest AI models can deeply comprehend.

App Intents are just the entry point. The real magic comes from Apple Intelligence, a brand-new system announced at this year’s WWDC that infuses advanced generative AI directly into Apple’s core operating systems. Combining App Intents with this new AI engine gives Siri the ability to intelligently operate on apps’ structured data models, understand natural language in context, make intelligent suggestions, and even generate content, all while protecting users’ privacy.

To illustrate the potential, this article explores how this might play out in the kitchen by imagining a hypothetical cooking app called Chef Cooks. This app adopts a number of Apple’s new assistant schemas.

Data Modeling With App Entities

Before Siri can understand the cooking domain, the cooking app must define its data entities so Apple Intelligence can comprehend them. This is done by creating custom structs conforming to the @AssistantEntity schema macros:

@AssistantEntity(schema: .cookbook.recipe)
struct RecipeEntity: IndexedEntity {
  let id: String
  let recipe: Recipe

  @Property(title: "Title")
  var title: String

  @Property(title: "Description")
  var description: String?

  @Property(title: "Cuisine")
  var cuisine: CuisineType?

  var ingredients: [IngredientEntity]
  var instructions: [InstructionEntity]

  var displayRepresentation: DisplayRepresentation {
    DisplayRepresentation(title: title,
      subtitle: cuisine?.displayRepresentation)
  }
}

@AssistantEntity(schema: .cookbook.ingredient)
struct IngredientEntity: ObjectEntity {
  let id = UUID()
  let ingredient: Ingredient

  @Property(title: "Ingredient")
  var name: String

  @Property(title: "Amount")
  var amount: String?

  var displayRepresentation: DisplayRepresentation {
    DisplayRepresentation(title: name, subtitle: amount)
  }
}

Adopting the .cookbook.recipe and .cookbook.ingredient schemas ensures the app’s recipe and ingredient data models adhere to the specifications that Apple Intelligence expects for the cooking domain. Note the use of the @Property property wrappers to define titles for key attributes. With the data groundwork laid, the app can start defining specific app intents that operate on this data using the @AssistantIntent macro.

Finding Recipes

One of the core experiences in a cooking app is searching for recipes. The cooking app can enable this for Siri using the .cookbook.findRecipes schema.

@AssistantIntent(schema: .cookbook.findRecipes)
struct FindRecipesIntent: FindIntent {
  @Property(title: "Search Query")
  var searchQuery: String?

  @Dependency
  var recipeStore: RecipeStore

  @MainActor
  func perform() async throws -> some ReturnsValue<[RecipeEntity]> {
    let results = try await recipeStore.findRecipes(matching: searchQuery)
    return .result(results)
  }
}

This intent accepts a searchQuery parameter and uses the app’s RecipeStore to find matching recipes from the database. Siri can then integrate this app functionality in a variety of intelligent ways. For example:

“Hey Siri, find vegetarian recipes in the Chef Cooks app.”

*Siri displays a list of matching vegetarian recipes.*

Crucially, Siri can understand the domain context and even make suggestions without the user explicitly naming the app.

Viewing Recipe Details

With the ability to find recipes, users will likely want to view the full details of a particular dish. The cooking app can support this by adopting the .cookbook.openRecipe schema:

@AssistantIntent(schema: .cookbook.openRecipe)
struct OpenRecipeIntent: OpenIntent {
  var target: RecipeEntity

  @Dependency
  var navigation: NavigationManager

  @MainActor
  func perform() async throws -> some IntentResult {
    navigation.openRecipe(target.recipe)
    return .result()
  }
}

This intent simply accepts a RecipeEntity and instructs the app’s NavigationManager to open the corresponding full recipe detail view. It enables experiences like:

“Hey Siri, show me the recipe for chicken Parmesan.”

  • App opens to the chicken Parmesan recipe.
  • The user sees an appetizing photo of Margherita pizza in Siri suggestions.

“Open that recipe in Chef Cooks.”

  • App launches directly to the pizza recipe.

But where Apple Intelligence and App Intents really shine is in more advanced intelligent experiences …

Intelligent Meal Planning

By modeling its data using assistant schemas, Chef Cooks can tap into Apple Intelligence’s powerful language model to enable seamless, multi-part queries:

“Hey Siri, I want to make chicken enchiladas for dinner this week.”

Rather than just searching for and opening a chicken enchilada recipe, Siri understands the full context of this request. It first searches Chef Cooks’s data for a suitable enchilada recipe, then:

  1. Checks whether all ingredients are in stock based on a semantic understanding of the user’s kitchen inventory.
  2. Adds any missing ingredients to a shopping list.
  3. Adds the recipe to a new meal plan for the upcoming week.
  4. Provides a time estimate for prepping and cooking the meal.

All of this happens without leaving the conversational Siri interface, thanks to the app adopting additional schemas like .shoppingList.addItems and .mealPlanner.createPlan. App Intents open the door to highly intelligent, multifaceted app experiences in which Siri acts as a true collaborative assistant, understanding your intent and orchestrating multiple actions across various data models.

Interactive Widgets With WidgetKit

Of course, not every interaction must happen by voice. Chef Cooks can use its App Intents implementation to power intelligent interactive widgets as well, using WidgetKit.

One example of using interactive widgets is integrating Chef Cooks’ .cookbook.findRecipes intent with the Safari Web Widget to provide a focused recipe search experience without leaving the browser:

struct RecipeSearchEntry: TimelineEntry {
  let date = Date()
  var searchQuery = ""

  @OpenInAppIntent(schema: .cookbook.findRecipes)   
  var findRecipesIntent: FindRecipesIntent? {
    FindRecipesIntent(searchQuery: searchQuery)
  }
}

This widget entry combines the @OpenInAppIntent property wrapper with Chef Cooks’ FindRecipesIntent implementation to allow users to enter a search query and instantly view filtered recipe results, all in the Web Widget UI. Chef Cooks could even construct more advanced WidgetKit experiences by combining multiple intents into rich, interactive widgets that drive custom flows, such as planning a meal by first finding recipes and then adding ingredients to a shopping list, or displaying complementary recipes and instruction videos based on past cooking sessions.

With App Intents providing the structured data modeling, WidgetKit can transform these intelligent interactions into immersive, ambient experiences across Apple’s platforms.

SBOM – A Tool To Reverse Engineer And Inspect The RPM And APT Databases To List All The Packages Along With Executables, Services And Versions


This is a simple SBOM utility which aims to provide an insider view of which packages are getting executed.

The process and objective are simple: we can get a clear perspective on the packages installed by APT (currently working on implementing this for RPM and other package managers). This is mainly needed to check which packages are actually being executed.

Installation

The packages needed are listed in the requirements.txt file and can be installed using pip:

```bash
pip3 install -r requirements.txt
```

Usage

  • First, install the packages.
  • Second, you need to set up environment variables such as:
    • Mount the image: Currently I’m still working on a mechanism to automatically define a mount point and mount different types of images and volumes, but it’s still quite a task for me.
  • Finally, run the tool to list all the packages.
| Argument | Description |
| --- | --- |
| --analysis-mode | Specifies the mode of operation. Default is static. Choices are static and chroot. |
| --static-type | Specifies the type of analysis for static mode. Required for static mode only. Choices are info and service. |
| --volume-path | Specifies the path to the mounted volume. Default is /mnt. |
| --save-file | Specifies the output file for JSON output. |
| --info-graphic | Specifies whether to generate visual plots for chroot analysis. Default is True. |
| --pkg-mgr | Manually specify the package manager, or omit this option for automatic detection. |
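These options map naturally onto Python's argparse; a hypothetical sketch of how such a CLI could be declared (the real main.py may differ):

```python
import argparse

def build_parser():
    """Build a CLI mirroring the documented SBOM options."""
    p = argparse.ArgumentParser(
        description="List packages from the RPM and APT databases")
    p.add_argument("--analysis-mode", choices=["static", "chroot"],
                   default="static", help="mode of operation")
    p.add_argument("--static-type", choices=["info", "service"],
                   help="type of analysis; required for static mode only")
    p.add_argument("--volume-path", default="/mnt",
                   help="path to the mounted volume")
    p.add_argument("--save-file", help="output file for JSON output")
    p.add_argument("--info-graphic", default="True",
                   help="generate visual plots for chroot analysis")
    p.add_argument("--pkg-mgr", choices=["apt", "rpm"],
                   help="package manager; omit for automatic detection")
    return p

args = build_parser().parse_args(
    ["--pkg-mgr", "apt", "--analysis-mode", "static", "--static-type", "info"])
print(args.pkg_mgr, args.volume_path)  # prints "apt /mnt"
```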
APT:

Static Info Analysis:
– This command runs the program in static analysis mode, specifically using the info directory analysis method.
– It analyzes the packages installed on the mounted volume located at /mnt.
– It saves the output in a JSON file named output.json.
– It generates visual plots for chroot analysis.
```bash
python3 main.py --pkg-mgr apt --analysis-mode static --static-type info --volume-path /mnt --save-file output.json
```
  • Static Service Analysis:

    • This command runs the program in static analysis mode, specifically using the service file analysis method.
    • It analyzes the packages installed on the mounted volume located at /custom_mount.
    • It saves the output in a JSON file named output.json.
    • It does not generate visual plots for chroot analysis.

```bash
python3 main.py --pkg-mgr apt --analysis-mode static --static-type service --volume-path /custom_mount --save-file output.json --info-graphic False
```

  • Chroot analysis with or without graphic output:

    • This command runs the program in chroot analysis mode.
    • It analyzes the packages installed on the mounted volume located at /mnt.
    • It saves the output in a JSON file named output.json.
    • For graphical output, keep --info-graphic as True, else False.

```bash
python3 main.py --pkg-mgr apt --analysis-mode chroot --volume-path /mnt --save-file output.json --info-graphic True/False
```

RPM:

Static Analysis:
– Similar to how it’s done on APT, but there is only one type of static scan available for now.
```bash
python3 main.py --pkg-mgr rpm --analysis-mode static --volume-path /mnt --save-file output.json
```

  • Chroot analysis with or without graphic output:
    • Exactly as it’s done on APT.
```bash
python3 main.py --pkg-mgr rpm --analysis-mode chroot --volume-path /mnt --save-file output.json --info-graphic True/False
```

Supported Images

Currently the tool works on Debian- and Red Hat-based images. I can guarantee the Debian outputs, but the Red Hat ones still need work; they’re not perfect.

I’m working on the pacman side of things; I’m trying to find a reliable way of accessing the pacman db for static analysis.

Graphical Output Images (Chroot)

APT Chroot

RPM Chroot


Internal Workings

For the workings and process-related documentation, please read the wiki page: Link

TODO

  • [x] Support for RPM
  • [x] Support for APT
  • [x] Support for chroot analysis
  • [x] Support for versions
  • [x] Support for chroot graphical output
  • [x] Support for organized graphical output
  • [ ] Support for Pacman

Ideas and Discussions

Ideas regarding this topic are welcome on the discussions page.