
Mars Might Be Hiding an Ocean of Liquid Water Beneath Its Surface



Evidence is mounting that a secret lies beneath the dusty red plains of Mars, one that could redefine our view of the Red Planet: a vast reservoir of liquid water, locked deep within the crust.

Mars is covered in traces of ancient bodies of water. But the puzzle of exactly where it all went when the planet turned cold and dry has long intrigued scientists.

Our new study may offer an answer. Using seismic data from NASA's InSight mission, we uncovered evidence that seismic waves slow down in a layer between 5.4 and 8 kilometers below the surface, which could be due to the presence of liquid water at those depths.

The Mystery of the Missing Water

Mars wasn't always the barren desert we see today. Billions of years ago, during the Noachian and Hesperian periods (4.1 billion to 3 billion years ago), rivers carved valleys and lakes shimmered.

As Mars' magnetic field faded and its atmosphere thinned, most surface water vanished. Some escaped to space, some froze in polar caps, and some was trapped in minerals, where it remains today.

Graphic showing Mars covered in diminishing amounts of water at times from 4 billion years ago to today.

Four billion years ago (top left), Mars may have hosted a vast ocean. But the surface water has slowly disappeared, leaving only frozen remnants near the poles today. Image Credit: NASA

But evaporation, freezing, and rocks can't quite account for all the water that must have covered Mars in the distant past. Calculations suggest the "missing" water is enough to cover the planet in an ocean at least 700 meters deep, and perhaps up to 900 meters deep.

One hypothesis has been that the missing water seeped into the crust. Mars was heavily bombarded by meteorites during the Noachian period, which may have formed fractures that channelled water underground.

Deep beneath the surface, warmer temperatures would keep the water in a liquid state, unlike the frozen layers closer to the surface.

A Seismic Snapshot of Mars’ Crust

In 2018, NASA's InSight lander touched down on Mars to listen to the planet's interior with a super-sensitive seismometer.

By studying a particular type of vibration called "shear waves," we found a significant underground anomaly: a layer between 5.4 and 8 kilometers down where these vibrations move more slowly.

This "low-velocity layer" is most likely highly porous rock filled with liquid water, like a saturated sponge. Something like Earth's aquifers, where groundwater seeps into rock pores.

We calculated the "aquifer layer" on Mars could hold enough water to cover the planet in a global ocean 520–780 meters deep, several times as much water as is held in Antarctica's ice sheet.

This volume is compatible with estimates of Mars' "missing" water (710–920 meters), after accounting for losses to space, water bound in minerals, and modern ice caps.

Meteorites and Marsquakes

We made our discovery thanks to two meteorite impacts in 2021 (named S1000a and S1094b) and a marsquake in 2022 (dubbed S1222a). These events sent seismic waves rippling through the crust, like dropping a stone into a pond and watching the ripples spread.

A satellite photo of a crater in red ground.

The crater caused by meteorite impact S1094b, as seen from NASA's Mars Reconnaissance Orbiter. Image Credit: NASA/JPL-Caltech/University of Arizona

InSight's seismometer captured these vibrations. We used the high-frequency signals from the events (think of tuning into a crisp, high-definition radio station) to map the crust's hidden layers.

We calculated "receiver functions," which are signatures of these waves as they bounce and reverberate between layers in the crust, like echoes mapping a cave. These signatures let us pinpoint boundaries where the rock changes, revealing the water-soaked layer 5.4 to 8 kilometers deep.

Why It Matters

Liquid water is essential for life as we know it. On Earth, microbes thrive in deep, water-filled rock.

Could similar life, perhaps relics of ancient Martian ecosystems, persist in these reservoirs? There's only one way to find out.

The water may be a lifeline for more complex organisms, too, such as future human explorers. Purified, it could provide drinking water, oxygen, or fuel for rockets.

Of course, drilling kilometers deep on a distant planet is a daunting challenge. Still, our data, collected near Mars' equator, also hints at the possibility of other water-rich zones, such as the icy mud reservoir of Utopia Planitia.

What's Next for Mars Exploration?

Our seismic data covers only a slice of Mars. New missions with seismometers are needed to map potential water layers across the rest of the planet.

Future rovers or drills could one day tap these reservoirs, analyzing their chemistry for traces of life. These water zones will also require protection from Earthly microbes, as they may harbor native Martian biology.

For now, the water invites us to keep listening to Mars' seismic heartbeat, decoding the secrets of a world perhaps more like Earth than we thought.

This article is republished from The Conversation under a Creative Commons license. Read the original article.

A New Frontier for Network Engineers


When you first hear about MCP (Model Context Protocol), it sounds like something built for hardcore AI researchers. But here's the reality: network engineers and automation engineers are going to be some of its biggest users.

If you're wondering why: MCP is how you make Large Language Models (LLMs) understand your network, your topology, your standards, your world.

Without it? You're just getting generic ChatGPT answers.

With it? You're creating agentic AI that can configure, troubleshoot, and design networks with you.

I've been talking to you (you! ...yes, you!) about network automation and adopting automation in your network engineering for years now. All in all, it's time to add another brick in *your* wall (of tech tools). In this AI Break, we'll explore an example that demonstrates the value of using MCP to master automation in today's AI world.

Okay, so what’s MCP?

At its heart, Model Context Protocol is about injecting structured knowledge into an LLM at runtime, automatically and programmatically.

Instead of manually pasting network diagrams or config templates into a chat window, MCP lets your tools tell the model:

  • What devices are on the network
  • What standards you use
  • What technologies you prefer (OSPF over EIGRP, EVPN over VXLAN, whatever)
  • What change control processes exist

All that context flows into the model, making its responses smarter, more aligned, and more useful to your environment.

Let's start with a basic, real-world example

Let's say you're building an LLM-based network assistant that helps generate configs. You don't want it suggesting RIP when your entire network runs OSPF and BGP.

With MCP, before you even ask the model for a config, you provide the AI with the following context:

Look familiar? Yup, it's JSON.

{
  "network_standards": {
    "routing_protocols": ["OSPF", "BGP"],
    "preferred_encapsulation": "VXLAN",
    "security_policies": {
      "ssh_required": true,
      "telnet_disabled": true
    }
  },
  "topology": {
    "core_devices": ["core-sw1", "core-sw2"],
    "edge_devices": ["edge-fw1", "edge-fw2"],
    "site_layout": "hub and spoke"
  }
}

Your assistant automatically sends this context to the LLM using MCP, and then asks, "Generate a config to onboard a new site."

The model now answers in a way that matches your environment, not some random textbook response.
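Under the hood, the flow really is just "context first, question second." Here's a minimal sketch in plain Python, without the MCP SDK; the `build_prompt` helper and the hard-coded context dict are illustrative assumptions, not part of any official API:

```python
import json

# Illustrative context -- mirrors the JSON standards shown above
CONTEXT = {
    "network_standards": {
        "routing_protocols": ["OSPF", "BGP"],
        "preferred_encapsulation": "VXLAN",
    },
    "topology": {"site_layout": "hub and spoke"},
}

def build_prompt(question: str, context: dict) -> str:
    """Prepend structured context so the model answers within our standards."""
    return (
        "You are a network assistant. Follow these standards strictly:\n"
        + json.dumps(context, indent=2)
        + "\n\nTask: " + question
    )

print(build_prompt("Generate a config to onboard a new site.", CONTEXT))
```

The point of MCP is that this packaging happens programmatically on every request, instead of you pasting the JSON into a chat window by hand.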

So, what skills do you need to use MCP?

Honestly, a lot of you already have most of what's needed:

  • API fundamentals. You'll be sending structured context (usually JSON) over API calls, just like RESTCONF, NETCONF, Catalyst Center, or Meraki APIs.
  • Understanding your network metadata. You need to know what matters: routing, VLANs, security, device types, and how to represent all of that as structured data.
  • Python scripting. You'll probably use Python to collect this data dynamically (via Nornir, Netmiko, or native APIs) and then package it into MCP calls.
  • LLM basics. You need to understand how prompts and context windows work, and how better context equals smarter outputs.
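To make the Python point concrete, here's a small sketch of the "collect and package" step. The inventory records below are hard-coded stand-ins for what you might pull live via Nornir, Netmiko, or a controller API, and the field names are hypothetical:

```python
import json

# Stand-in inventory records; in production these would come from a live source
DEVICES = [
    {"name": "core-sw1", "role": "core", "protocols": ["OSPF", "BGP"]},
    {"name": "core-sw2", "role": "core", "protocols": ["OSPF", "BGP"]},
    {"name": "edge-fw1", "role": "edge", "protocols": ["BGP"]},
]

def package_context(devices: list[dict]) -> str:
    """Bundle device metadata into the JSON context an MCP call would carry."""
    context = {
        "topology": {
            "core_devices": [d["name"] for d in devices if d["role"] == "core"],
            "edge_devices": [d["name"] for d in devices if d["role"] == "edge"],
        },
        "network_standards": {
            # Deduplicate protocols across devices, sorted for stable output
            "routing_protocols": sorted({p for d in devices for p in d["protocols"]}),
        },
    }
    return json.dumps(context, indent=2)

print(package_context(DEVICES))
```

Swap the hard-coded list for a live collection step and you have the "Python scripting" skill applied end to end.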

The bottom line

MCP isn't some "maybe later" thing for networkers.

It's becoming the bridge between your real-world network knowledge and AI's ability to help you work faster, better, and more accurately.

Engineers who know how to feed real context into LLMs will dominate network design, troubleshooting, security auditing, and even full-stack automation.

Start now

  • Map your network standards.
  • Package them as JSON.
  • Play with sending that context into small AI workflows.

The best AI agents are built by engineers who know their network, and who know how to teach it to their AI. Next, let's get hands-on with MCP!

Try it

For fully working code and instructions on getting started, check out my project on GitHub.

Create a real Model Context Protocol (MCP) server designed for network engineers.

This MCP app does the following:

  • Serves your network standards (routing protocols, security policies, and so on)
  • Responds with device health
  • Connects to Claude Desktop, making your AI assistant aware of your real network environment

And it's as simple as:

  1. Import the MCP Python SDK (plus `Any` for the type hints below)
    from typing import Any
    from mcp.server.fastmcp import FastMCP
  2. Initialize the FastMCP server with a unique name
    mcp = FastMCP("network-assistant")
  3. Define tools.
    Tools are a powerful primitive in the Model Context Protocol (MCP). They let your server expose real actions, so the model can query systems, run logic, or kick off workflows. In our use case, we need to define 'network-standards' and 'device status' functions:
    @mcp.tool()
    async def get_network_standards() -> dict[str, Any]:
        """Returns standard routing protocols, encapsulation, and security policies."""
        return NETWORK_STANDARDS
  4. Run the server, and you're set!
    if __name__ == "__main__":
        mcp.run(transport="stdio")

And if we look at it, this is what the LLM knows about your network before you contextualized it:

And this is after connecting the LLM to our network:

Where network automation and AI truly collide

You're not scripting for the sake of scripting. And you don't just use AI for the sake of buzzwords. When you can combine live network state with LLM intelligence, you're building systems that think, adapt, and assist with you, not just for you.

Start simple. Build one flow.
Make your AI agent truly know your network. Because the future belongs to engineers who don't just automate; they contextualize.

Welcome to the new frontier of agentic AI!

Get started with AI

Learning paths, courses, free tutorials, and more. Unlock the future of technology with artificial intelligence training in Cisco U. Explore AI learning and start building your skills today.

Sign up for Cisco U. | Join the Cisco Learning Network today for free.

Follow Cisco Learning & Certifications

X | Threads | Facebook | LinkedIn | Instagram | YouTube

Use #CiscoU and #CiscoCert to join the conversation.

Adaptability: The Must-Have Skill for Network Engineers in the AI Era

MCP for DevOps, NetOps, and SecOps: Real-World Use Cases and Future Insights


Evolving from Bots to Brainpower: The Ascendancy of Agentic AI



What truly separates us from machines? Free will, creativity, and intelligence? But think about it. Our brains aren't singular, monolithic processors. The magic isn't in a single "thinking part," but rather in countless specialized agents (neurons) that synchronize perfectly. Some neurons catalog facts, others process logic or govern emotion; still more retrieve memories, orchestrate movement, or interpret visual signals. Individually, they perform simple tasks, yet together they produce the complexity we call human intelligence.

Now, imagine replicating this orchestration digitally. Traditional AI was always narrow: specialized, isolated bots designed to automate mundane tasks. But the new frontier is agentic AI: systems built from specialized, autonomous agents that interact, reason, and cooperate, mirroring the interplay within our brains. Large language models (LLMs) form the linguistic neurons, extracting meaning and context. Specialized task agents execute distinct functions like retrieving data, analyzing trends, and even predicting outcomes. Emotion-like agents gauge user sentiment, while decision-making agents synthesize inputs and execute actions.

The result is digital intelligence and agency. But do we need machines to mimic human intelligence and autonomy?

Every domain has a choke point, and agentic AI unblocks them all

Ask the hospital chief who's trying to fill a growing roster of vacant roles. The World Health Organization predicts a global shortfall of 10 million healthcare workers by 2030. Doctors and nurses pull 16-hour shifts like it's the norm. Claims processors grind through endless policy reviews, while lab technicians wade through a forest of paperwork before they can even test a single sample. In a well-orchestrated agentic AI world, these professionals get some relief. Claim-processing bots can read policies, assess coverage, and even detect anomalies in minutes: tasks that would normally take hours of mind-numbing, error-prone work. Lab automation agents could receive patient data directly from electronic health records, run preliminary tests, and auto-generate reports, freeing up technicians for the more delicate tasks that truly need human skill.

The same dynamic plays out across industries. Take banking, where anti-money laundering (AML) and know-your-customer (KYC) processes remain the biggest administrative headaches. Corporate KYC demands endless verification steps, complex cross-checks, and reams of paperwork. An agentic system can orchestrate real-time data retrieval, conduct nuanced risk assessment, and streamline compliance so that staff can focus on actual client relationships rather than wrestling with forms.

Insurance claims, telecom contract reviews, logistics scheduling: the list is endless. Every domain has repetitive tasks that bog down talented people.

Yes, agentic AI is the flashlight in a dark basement: shining a bright light on hidden inefficiencies, letting specialized agents tackle the grunt work in parallel, and giving teams the bandwidth to focus on strategy, innovation, and building deeper connections with customers.

But the true power of agentic AI lies in its ability to solve not just for one department's efficiency but to scale seamlessly across multiple functions, even multiple geographies. That is an improvement at 100x scale.

  • Scalability: Agentic AI is modular at its core, allowing you to start small (like a single FAQ chatbot), then seamlessly expand. Need real-time order tracking or predictive analytics later? Add an agent without disrupting the rest. Each agent handles a specific slice of work, cutting development overhead and letting you deploy new capabilities without ripping apart your existing setup.
  • Anti-fragility: In a multi-agent system, one glitch won't topple everything. If a diagnostic agent in healthcare goes offline, other agents, like patient records or scheduling, keep working. Failures stay contained within their respective agents, ensuring continuous service. That means your entire platform won't crash because one piece needs a fix or an upgrade.
  • Adaptability: When regulations or consumer expectations shift, you can modify or replace individual agents, like a compliance bot, without forcing a system-wide overhaul. This piecemeal approach is akin to upgrading an app on your phone rather than reinstalling the entire operating system. The result? A future-proof framework that evolves alongside your business, eliminating massive downtimes or risky reboots.

You can't predict the next AI craze, but you can be ready for it

Generative AI was the breakout star a few years ago; agentic AI is grabbing the spotlight now. Tomorrow, something else will emerge, because innovation never rests. How, then, do we future-proof our architecture so each wave of new technology doesn't trigger an IT apocalypse? According to a recent Forrester study, 70% of leaders who invested over 100 million dollars in digital initiatives credit one strategy for success: a platform approach.

Instead of ripping out and replacing old infrastructure every time a new AI paradigm hits, a platform integrates these emerging capabilities as specialized building blocks. When agentic AI arrives, you don't toss your entire stack; you simply plug in the latest agent modules. This approach means fewer project overruns, quicker deployments, and more consistent outcomes.

Even better, a robust platform offers end-to-end visibility into each agent's actions, so you can optimize costs and keep a tighter grip on compute usage. Low-code/no-code interfaces also lower the entry barrier for business users to create and deploy agents, while prebuilt tool and agent libraries accelerate cross-functional workflows, whether in HR, marketing, or any other department. Platforms that support PolyAI architectures and a variety of orchestration frameworks allow you to swap different models, manage prompts, and layer new capabilities without rewriting everything from scratch. Being cloud-agnostic, they also eliminate vendor lock-in, letting you tap the best AI services from any provider. In essence, a platform-based approach is your key to orchestrating multi-agent reasoning at scale, without drowning in technical debt or losing agility.

So, what are the core components of this platform approach?

  1. Data: Plugged into a common layer
    Whether you're implementing LLMs or agentic frameworks, your platform's data layer remains the cornerstone. If it's unified, each new AI agent can tap into a curated knowledge base without messy retrofitting.
  2. Models: Swappable brains
    A flexible platform lets you pick specialized models for each use case (financial risk assessment, customer service, healthcare diagnoses), then update or replace them without nuking everything else.
  3. Agents: Modular workflows
    Agents thrive as independent yet orchestrated mini-services. If you need a new marketing agent or a compliance agent, you spin it up alongside existing ones, leaving the rest of the system stable.
  4. Governance: Guardrails at scale
    When your governance structure is baked into the platform (covering bias checks, audit trails, and regulatory compliance), you stay proactive, not reactive, no matter which AI "new kid on the block" you adopt next.

A platform approach is your strategic hedge against technology's ceaseless evolution, ensuring that no matter which AI trend takes center stage, you're ready to integrate, iterate, and innovate.

Start small and orchestrate your way up

Agentic AI isn't entirely new: Tesla's self-driving cars employ multiple autonomous modules. The difference is that new orchestration frameworks make such multi-agent intelligence widely accessible. No longer confined to specialized hardware or industries, agentic AI can now be applied to everything from finance to healthcare, fueling renewed mainstream interest and momentum.

Design for platform-based readiness. Start with a single agent addressing a concrete pain point and expand iteratively. Treat data as a strategic asset, select your models methodically, and bake in clear governance. That way, each new AI wave integrates seamlessly into your existing infrastructure, boosting agility without constant overhauls.

BayLISA and Cisco DevNet: Tech meetups in person and online


What happens when the sysadmin community meets a modern developer movement? BayLISA and DevNet, that's what 😎!

What’s BayLISA?

BayLISA stands for Bay Area Large Installation System Administrators and is a user group of system and network administrators from the San Francisco Bay Area. The user group was founded in the very early 1990s, after the fourth Large Installation System Administration (LISA) conference. BayLISA meets monthly to discuss topics of interest to administrators and managers of sites supporting more than 100 users and/or computers. The idea was to provide a forum for sysadmin professionals in the San Francisco Bay Area to get together and exchange ideas, hear speakers address topics of interest, and, most importantly, socialize. The meetings are free and open to the public. BayLISA has supported and educated systems, network, storage, virtualization, and other technology professionals in the Bay Area for over 30 years. You can find more details about the user group at https://baylisa.org and https://www.meetup.com/baylisa.

BayLISA meets Cisco DevNet

Alright, where does Cisco DevNet come into the picture? The story of the collaboration between BayLISA and Cisco DevNet begins in the second half of 2023. As people were starting to return to in-person events, I had the privilege of being a speaker at the DeveloperWeek CloudX 2023 conference in San Mateo, California. There I met Ron Pagani, one of the BayLISA organizers. After about a year's hiatus from running the monthly meetups, Ron was looking to get back to in-person events. At Cisco DevNet we were looking at growing our member community at the time, so Ron and I thought we could collaborate in a way that benefits everyone. This collaboration would mean, first of all, finding space on the Cisco campus in San Jose where we could run the monthly meetup. As a follow-up to that, we were also able to suggest speakers and become members of the BayLISA board.

We had a successful 2024, running several meetups at the Cisco campus in San Jose, but for this year we decided to change location. For the foreseeable future we'll run the meetup at 3350 Scott Blvd, Building 54, Santa Clara, CA 95054, in the Ding Ding Studio. You should double-check the location of the meetup at https://www.meetup.com/baylisa just in case, and also please RSVP at this link so that we know you plan on joining us and can better organize the logistics for the event. Besides topics of interest and fantastic people you can interact and chat with, there's also pizza and refreshments for everyone. We also try to help people who are looking for jobs, or who have positions open on their teams or in their company, by offering a space to make those important connections and share information.

Live streaming

While people living in the Bay Area are more than welcome to join our monthly meetup, we know not everyone can attend in person. We also want to expose the knowledge shared in our meetup to a larger online audience. So we've decided to live stream the BayLISA monthly meetups on the Cisco DevNet social media platforms, including YouTube, LinkedIn, X/Twitter, Twitch, and Facebook.

The meetup takes place on the third Thursday of the month, so make sure to save that in your calendars. We'll also have the live streaming sessions scheduled in advance, so you can see the topic of each session as well as the speakers and timing. Since the sessions are recorded, you can also view the recordings online. For YouTube, I've created a playlist at this link: https://www.youtube.com/playlist?list=PL2k86RlAekM_r-_3uDdU4P2o4kLRF6VnB. You can already find there the recording of the session for the month of April: Madhupriya Ravishankar and Tianhong Zhang from Oracle presenting Converged Database: JSON Meets Relational for the Modern Era.

Whether you're a Bay Area local or tuning in from afar, follow DevNet on your favorite platform to stay plugged into systems and network administration.

Look for us on social for the latest on NetOps, DevOps, SecOps, Automation, and AI:

DevNet on YouTube
DevNet on LinkedIn
DevNet on X (Twitter)
DevNet on BlueSky
DevNet on Twitch
DevNet on Instagram
DevNet on Facebook
DevNet on TikTok




Cisco joins AI infrastructure alliance



"The addition of Cisco reinforces AIP's commitment to an open-architecture platform and fostering a broad ecosystem that supports a diverse range of partners on a non-exclusive basis, all working together to build a new kind of AI infrastructure," the group said in a statement.

Separately, Cisco announced AI initiatives centered in the Middle East region. Last week, Cisco CEO Chuck Robbins visited Saudi Arabia, the UAE, Qatar, and Bahrain. This week, Jeetu Patel, executive vice president and chief product officer, is in Saudi Arabia, where he is participating in President Trump's state visit to the region, according to Cisco. Related new projects include:

  • An initiative with HUMAIN, Saudi Arabia's new AI enterprise, to help build an open, scalable, resilient, and cost-efficient AI infrastructure: "This landmark collaboration will set a new standard for how AI infrastructure is designed, secured and delivered – combining Cisco's global expertise with the Kingdom's bold AI ambitions. The multi-year initiative aims to position the country as a global leader in digital innovation," Cisco stated.
  • A collaboration with the UAE-based G42 to co-develop a secure AI portfolio and AI-native services: Cisco and G42 will work together to assess the potential to co-develop and jointly deploy AI-powered cybersecurity packages, as well as a reference architecture that integrates Cisco's networking, security, and infrastructure solutions specifically designed for high-performance computing. This collaboration aims to help customers build and secure AI-ready data centers and develop AI workloads effectively, according to the companies.
  • Interest in Qatar's digital transformation: Qatar's Ministry of Interior and Cisco signed a letter of intent to collaborate on Qatar's digital transformation, AI, infrastructure development, and cybersecurity.