
Threat Actors Exploit Microsoft Sway to Host QR Code Phishing Campaigns


A new report from cybersecurity firm Netskope reveals details about attack campaigns that abuse Microsoft Sway and Cloudflare Turnstile and leverage QR codes to trick users into handing their Microsoft Office credentials to a phishing platform.

These campaigns have targeted victims in Asia and North America across several sectors, led by technology, manufacturing, and finance.

What is quishing?

QR codes are a convenient way to browse websites or access information without having to type a URL on a smartphone. But there is a risk in using QR codes: cybercriminals might abuse them to steer victims to malicious content.

This technique, known as "quishing," involves redirecting victims to malicious websites or prompting them to download harmful content by scanning a QR code. Once on the site, cybercriminals work to steal personal and financial information. By design, QR codes make it impossible for users to know where a code will take them before scanning it.

Thomas Damonneville, head of anti-phishing firm StalkPhish, told TechRepublic that quishing "is a growing trend" that "is very easy to use and makes it harder to check whether the content is legitimate."

Quishing attacks via Microsoft Sway

In July 2024, Netskope Threat Labs observed a 2,000-fold increase in traffic to phishing pages hosted on Microsoft Sway. The majority of the malicious pages used QR codes.

Unique Microsoft Sway phishing page. Image: Netskope

Microsoft Sway is a free online app in the Microsoft Office family that lets users easily create presentations and other web-based content. Being free of charge makes the app an attractive tool for cybercriminals.

In the attack campaigns uncovered by Netskope researcher Jan Michael Alcantara, victims are targeted with Microsoft Sway pages that lead to phishing attempts for Microsoft Office credentials.

An example of a Sway page containing a malicious QR code leading to a phishing URL. Image: Netskope

Netskope's research does not indicate how the fraudulent links were delivered to victims. However, such links can be spread via email, social networks, SMS, or instant messaging software.

The final payload looks like the legitimate Microsoft Office login page, as shown in a May 2024 publication from the same researcher.

The final payload shows a fake Microsoft Office login page. Image: Netskope

Stealthier attacks using Cloudflare Turnstile

Cloudflare's Turnstile is a free tool that replaces CAPTCHAs, and it has been exploited in the reported attack campaigns. This legitimate service lets website owners easily add the necessary Turnstile code to their content, so that users simply click a verification checkbox instead of solving a CAPTCHA.

Cloudflare Turnstile snippet. Image: Cloudflare

From an attacker's perspective, this free tool is appealing because it requires users to click through a Cloudflare Turnstile check before being redirected to the phishing page. This adds a layer of protection against detection, as the final phishing payload is concealed from online URL scanners.

Attacker-in-the-middle phishing technique

Traditional phishing methods typically collect credentials before displaying an error page or redirecting the user to the legitimate login page. This approach makes users believe they entered incorrect credentials, likely leaving them unaware of the fraud.

The attacker-in-the-middle phishing technique is more discreet. The user's credentials are collected and immediately used to log into the legitimate service. This method, also known as transparent phishing, lets the user log in successfully after the fraudulent credential theft, making the attack much less noticeable.

Why malicious QR codes are difficult to detect

"Nobody can read a QR code with his own eyes," Damonneville said. "You can only scan it with the appropriate device, a smartphone. Some links can be so long that you can't check the whole link, if you check it … But who checks links?"

Text-only detections are also ineffective against QR codes because they are images. There is also no common standard for verifying the authenticity of a QR code. Security mechanisms such as digital signatures for QR codes are not commonly implemented, making it difficult to verify the source or integrity of the content.

How can you prevent QR code phishing?

Many QR code readers provide a preview of the decoded URL, enabling users to see it before opening the link. Any suspicion about the URL should deter the user from following the QR code. Additionally:

  • QR codes leading to actions such as logging in or providing information should raise suspicion and be carefully analyzed.
  • Security solutions can also help, as they can detect phishing URLs. URLs should always be scanned by such a tool.
  • Payments should not be made via QR code unless you are confident the request is legitimate.
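The URL-preview advice above can be sketched as a simple client-side check. The heuristics below are invented for this example (the keyword list and rules are assumptions, not a vetted detection method); real protection should rely on reputation feeds and URL-scanning services:

```python
from urllib.parse import urlparse

# Illustrative red-flag keywords only; a real product would combine such
# checks with reputation feeds and online URL-scanning services.
SUSPICIOUS_HINTS = ("login", "verify", "account", "secure")

def looks_suspicious(url: str) -> bool:
    """Flag a decoded QR-code URL that deserves a closer look before visiting."""
    parsed = urlparse(url)
    host = parsed.hostname or ""
    if parsed.scheme != "https":
        return True                    # plain HTTP is an easy red flag
    if host.replace(".", "").isdigit():
        return True                    # raw IP address instead of a domain name
    if any(hint in url.lower() for hint in SUSPICIOUS_HINTS):
        return True                    # credential-bait keywords in the URL
    return False

print(looks_suspicious("http://198.51.100.7/update"))   # True: not HTTPS
print(looks_suspicious("https://example.com/docs"))     # False
```

A check like this can run in a QR scanner app before the link is opened, turning the "preview the URL" advice into an automatic warning.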

Microsoft Sway is not the only legitimate product that can be abused by cybercriminals to host phishing pages.

"We regularly observe legitimate sites or applications being used to host quishing or phishing, including GitHub, GitBook, or Google Docs, for example, on a daily basis," Damonneville said. "Not to mention all the URL shorteners on the market, or free hosting sites, widely used to easily hide a URL."

This once again reinforces the idea that user awareness needs to be raised and employees must be trained to distinguish a suspicious URL from a legitimate one.

Disclosure: I work for Trend Micro, but the views expressed in this article are mine.

6 Hard Problems Scaling Vector Search



You've decided to use vector search in your application, product, or business. You've done the research on how and why embeddings and vector search make a problem solvable or can enable new features. You've dipped your toes into the hot, emerging area of approximate nearest neighbor algorithms and vector databases.

Almost immediately upon productionizing vector search applications, you'll start to run into very hard and potentially unanticipated difficulties. This blog attempts to arm you with some knowledge of your future: the problems you'll face and the questions you may not yet know you need to ask.

1. Vector search ≠ vector database

Vector search and all the associated clever algorithms are the central intelligence of any system trying to leverage vectors. However, all of the associated infrastructure to make it maximally useful and production ready is enormous and very, very easy to underestimate.

To put this as strongly as I can: a production-ready vector database will solve many, many more "database" problems than "vector" problems. By no means is vector search, itself, an "easy" problem (and we'll cover many of the hard sub-problems below), but the mountain of traditional database problems that a vector database needs to solve certainly remains the "hard part."

Databases solve a host of very real and very well studied problems, from atomicity and transactions, consistency, performance and query optimization, durability, backups, access control, multi-tenancy, to scaling and sharding and much more. Vector databases will require answers in all of these dimensions for any product, business, or enterprise.

Be very wary of home-rolled "vector-search infra." It's not that hard to download a state-of-the-art vector search library and start approximate-nearest-neighboring your way toward an interesting prototype. Continuing down this path, however, is a path to accidentally reinventing your own database. That's probably a choice you want to make consciously.

2. Incremental indexing of vectors

Due to the nature of the most modern ANN vector search algorithms, incrementally updating a vector index is a massive challenge. This is a well-known "hard problem." The issue is that these indexes are carefully organized for fast lookups, and any attempt to incrementally update them with new vectors will rapidly degrade those fast-lookup properties. As such, in order to maintain fast lookups as vectors are added, these indexes need to be periodically rebuilt from scratch.

Any application hoping to stream new vectors continuously, with requirements that both the vectors show up in the index quickly and the queries stay fast, will need serious support for the "incremental indexing" problem. This is a crucial area for you to understand about your database, and a good place to ask a number of hard questions.

There are many potential approaches a database might take to help solve this problem for you. A proper survey of these approaches would fill many blog posts of this size. It's important to understand some of the technical details of your database's approach, because it may have unexpected tradeoffs or consequences for your application. For example, if a database chooses to do a full reindex at some frequency, that can cause high CPU load and therefore periodically affect query latencies.
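As one concrete illustration of these tradeoffs, here is a toy sketch in Python of a common pattern (an assumed design for illustration, not any particular database's implementation): a periodically rebuilt "main" index paired with a small brute-force buffer that keeps fresh vectors searchable between rebuilds.

```python
import numpy as np

class BufferedIndex:
    """Sketch of one incremental-indexing pattern: a periodically rebuilt
    main index plus a brute-force buffer that absorbs fresh vectors."""

    def __init__(self, dim: int, rebuild_threshold: int = 1000):
        self.main = np.empty((0, dim))        # stands in for a real ANN index
        self.buffer = np.empty((0, dim))      # fresh, not-yet-indexed vectors
        self.rebuild_threshold = rebuild_threshold

    def add(self, vec: np.ndarray) -> None:
        self.buffer = np.vstack([self.buffer, vec])
        if len(self.buffer) >= self.rebuild_threshold:
            # In a real system this is the expensive, CPU-heavy rebuild step.
            self.main = np.vstack([self.main, self.buffer])
            self.buffer = np.empty((0, self.main.shape[1]))

    def nearest(self, query: np.ndarray) -> np.ndarray:
        # Search both structures and merge: the buffer keeps fresh vectors
        # visible immediately, at brute-force cost proportional to its size.
        candidates = np.vstack([self.main, self.buffer])
        dists = np.linalg.norm(candidates - query, axis=1)
        return candidates[np.argmin(dists)]
```

The buffer keeps data latency low at the cost of extra per-query work, while the rebuild step concentrates the CPU cost into bursts, which is exactly the kind of behavior worth asking your vendor about.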

You should understand your application's need for incremental indexing, and the capabilities of the system you're relying on to serve you.

3. Data latency for both vectors and metadata

Every application should understand its need and tolerance for data latency. Vector-based indexes have, at least by other database standards, relatively high indexing costs. There is a significant tradeoff between cost and data latency.

How long after you "create" a vector do you need it to be searchable in your index? If the answer is "soon," vector latency is a major design point in these systems.

The same applies to the metadata of your system. As a general rule, metadata mutates fairly often (e.g., a user toggles between online and offline), so it's typically important that metadata-filtered queries react quickly to metadata updates. Taking the example above, it's not useful if your vector search returns a match for someone who has recently gone offline!

If you need to stream vectors into the system continuously, or continuously update the metadata of those vectors, you'll need a different underlying database architecture than if it's acceptable for your use case to, e.g., rebuild the full index every night for use the next day.

4. Metadata filtering

I'll state this point strongly: I think in almost all circumstances, the product experience will be better if the underlying vector search infrastructure can be augmented by metadata filtering (or hybrid search).

Show me all the restaurants I might like (a vector search) that are located within 10 miles and are low to medium priced (metadata filter).

The second part of this query is a traditional SQL-like WHERE clause intersected with, in the first part, a vector search result. Because of the nature of these large, relatively static, relatively monolithic vector indexes, it's very difficult to do joint vector + metadata search efficiently. This is another of the well-known "hard problems" that vector databases need to address on your behalf.

There are many technical approaches that databases might take to solve this problem for you. You can "pre-filter," which means applying the filter first and then doing the vector lookup. This approach suffers from not being able to effectively leverage the pre-built vector index. You can "post-filter" the results after doing a full vector search. This works great unless your filter is very selective, in which case you spend enormous amounts of time finding vectors you later toss out because they don't meet the required criteria. Sometimes, as is the case in Rockset, you can do "single-stage" filtering, which attempts to merge the metadata filtering stage with the vector lookup stage in a way that preserves the best of both worlds.
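A brute-force toy sketch of the first two strategies (illustrative only: real systems use ANN indexes rather than exact scans, and the corpus and metadata here are randomly generated):

```python
import numpy as np

rng = np.random.default_rng(0)
vectors = rng.normal(size=(1000, 8))          # toy embedding corpus
price_tier = rng.integers(0, 3, size=1000)    # metadata: 0=low, 1=mid, 2=high
query = rng.normal(size=8)

def top_k(cands: np.ndarray, ids: np.ndarray, q: np.ndarray, k: int) -> np.ndarray:
    """Exact nearest-neighbor scan, standing in for an ANN index lookup."""
    order = np.argsort(np.linalg.norm(cands - q, axis=1))
    return ids[order[:k]]

# Pre-filter: apply the metadata predicate first, then search only the
# matching vectors. Exact, but cannot reuse a prebuilt index over the corpus.
mask = price_tier <= 1
pre = top_k(vectors[mask], np.flatnonzero(mask), query, k=5)

# Post-filter: vector search first (over-fetching), then drop rows that fail
# the predicate. Cheap for loose filters, wasteful for very selective ones.
overfetched = top_k(vectors, np.arange(len(vectors)), query, k=50)
post = np.array([i for i in overfetched if price_tier[i] <= 1])[:5]
```

Note how the post-filter path has to guess an over-fetch factor (here 50 for 5 results); if the filter were far more selective, even that might not yield enough survivors.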

If you believe that metadata filtering will be critical to your application (and I posit above that it almost always will be), the metadata filtering tradeoffs and functionality are something you'll want to examine very carefully.

5. Metadata query language

If I'm right, and metadata filtering is crucial to the application you're building, congratulations: you have yet another problem. You need a way to specify filters over this metadata. This is a query language.

Coming from a database angle, and as this is a Rockset blog, you can probably anticipate where I'm going with this. SQL is the industry-standard way to express these kinds of statements. "Metadata filters" in vector-search language are simply "the WHERE clause" to a traditional database. SQL has the advantage of also being relatively easy to port between different systems.

Furthermore, these filters are queries, and queries can be optimized. The sophistication of the query optimizer can have a huge effect on query performance. For example, sophisticated optimizers will try to apply the most selective of the metadata filters first, because that minimizes the work later stages of filtering have to do, resulting in a large performance win.
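That selectivity-first effect is easy to demonstrate with a contrived conjunctive filter (the rows and predicates below are invented for illustration): Python's short-circuiting `and` stands in for a filter pipeline, and predicate evaluations are counted under both orderings.

```python
# Rows and predicates are made up purely to count predicate evaluations.
rows = [{"price": p, "open_now": p % 2 == 0} for p in range(10_000)]

calls = {"n": 0}

def counted(pred):
    """Wrap a predicate so every evaluation is tallied."""
    def wrapper(row):
        calls["n"] += 1
        return pred(row)
    return wrapper

selective = counted(lambda r: r["price"] < 100)   # matches ~1% of rows
loose = counted(lambda r: r["open_now"])          # matches ~50% of rows

# Most selective predicate first: the loose one only sees ~100 survivors.
calls["n"] = 0
out_a = [r for r in rows if selective(r) and loose(r)]
evals_selective_first = calls["n"]

# Loose predicate first: the selective one still runs on ~5,000 survivors.
calls["n"] = 0
out_b = [r for r in rows if loose(r) and selective(r)]
evals_loose_first = calls["n"]

print(evals_selective_first < evals_loose_first)   # True: order matters
```

Both orderings return the same rows; only the amount of work differs, which is exactly the lever a query optimizer pulls.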

If you plan on writing non-trivial applications using vector search and metadata filters, it's important to understand and be comfortable with the query language, both its ergonomics and its implementation, that you're signing up to use, write, and maintain.

6. Vector lifecycle management

Alright, you've made it this far. You've got a vector database that has all the right database fundamentals you require, has the right incremental indexing strategy for your use case, has a good story around your metadata filtering needs, and will keep its index up-to-date with latencies you can tolerate. Awesome.

Your ML team (or maybe OpenAI) releases a new version of its embedding model. You have a gigantic database full of old vectors that now need to be updated. Now what? Where are you going to run this massive batch-ML job? How are you going to store the intermediate results? How are you going to execute the switchover to the new version? How do you plan to do all of this in a way that doesn't affect your production workload?

Ask the Hard Questions

Vector search is a rapidly emerging area, and we're seeing lots of users beginning to bring applications to production. My goal for this post was to arm you with some of the crucial hard questions you might not yet know to ask. And you'll benefit greatly from having them answered sooner rather than later.

What I didn't cover in this post is how Rockset has solved, and is working to solve, all of these problems, and why some of our solutions are ground-breaking and better than most other attempts at the state of the art. Covering that would require many blog posts of this size, which is, I think, precisely what we'll do. Stay tuned for more.



selenium webdriver – Unable to run test on GitLab pipeline due to "DevToolsActivePort file doesn't exist"


I created a sample Selenium WebDriver test using Java and Maven and pushed it to my GitLab repository. Then I created a new pipeline to run the automated tests. This is my first time running a test on a pipeline, so I followed some instructions online. The issue I'm having is that the test fails because of this error: (unknown error: DevToolsActivePort file doesn't exist)

I saw a few questions about this too and tried to add those solutions, but I'm still getting the error. I used this link, from which I added these arguments:
WebDriverException: unknown error: DevToolsActivePort file doesn't exist while trying to initiate Chrome Browser

Not sure what I need to do to fix this error, or how to run the test as a non-root user, which might be a workaround.

This is my yml file for the pipeline

# calling the docker image where chrome, maven, and jdk are available to run the tests
image: markhobson/maven-chrome:jdk-11

# building the maven project
build:
  stage: build
  script:
    - mvn compile

# running the tests
test:
  stage: test
  script:
    - mvn clean test

BaseTest class

@BeforeSuite
    public void beforeSuite() {
        WebDriverManager.chromedriver().setup();
        ChromeOptions options = new ChromeOptions();
        String runTime = null;

        try {
            InputStream input = BaseTest.class.getClassLoader().getResourceAsStream("runSetup.properties");
            Properties properties = new Properties();
            properties.load(input);
            runTime = properties.getProperty("runTimeOnLocal");
        } catch (IOException e) {
            e.printStackTrace();
        }

        // if it's true, it's running locally on my machine and opens the web browser
        // if it's false, it's running on GitLab headless
        if (runTime.equalsIgnoreCase("TRUE")) {
            options.addArguments("start-maximized");
            options.addArguments("enable-automation");
            options.addArguments("--no-sandbox");
            options.addArguments("--disable-infobars");
            options.addArguments("--disable-dev-shm-usage");
            options.addArguments("--disable-browser-side-navigation");
            options.addArguments("--disable-gpu");
        }
        else if (runTime.equalsIgnoreCase("FALSE")) {
            options.addArguments("--disable-dev-shm-usage");
            options.addArguments("--no-sandbox");
            options.addArguments("--disable-gpu");
            options.setHeadless(true);
        }

        driver.set(new ChromeDriver(options));
        Log.info("Opening web application");
        driver.get().get("https://demo.opencart.com/");
    }

I also put the link to my repo, as this is a sample for me to review Selenium (you'll need to switch to the master branch)

SeleniumReview

A Few Security Technologies, a Big Difference in Premiums


When the BlackCat ransomware gang compromised healthcare-billing services firm Change Healthcare in February, several security controls failed: The company didn't adequately protect its Citrix remote-access portal, didn't require employees to use multifactor authentication (MFA), and failed to implement a robust backup strategy.

The subsidiary of UnitedHealth also had no cyber insurance, meaning its parent company had to foot the bill, at least $872 million, and, in hindsight perhaps just as important, missed the benefit of a cyber insurer's focus on which measures can minimize claims. Both insurers and "insursec" firms, which combine insurance and security services, are awash in data on the current threat landscape and the technologies that appear to make the most difference, among them backups, MFA, and protecting remote-access systems.

Finding the right security technologies for the business is increasingly important, because ransomware incidents have accelerated over the past few years, says Jason Rebholz, CISO at Corvus Insurance, a cyber insurer. Attackers posted the names of at least 1,248 victims to leak sites in the second quarter of 2024, the highest quarterly volume to date, according to the firm.

"Without a doubt, attacks are increasing in terms of frequency and severity; the data is pointing to that," he says. "We also see that when you focus on specific security controls, you can have a meaningful impact both on preventing these incidents and on simply recovering from an incident [with fewer costs]."

Cyber insurance has become a security best practice, with the overwhelming majority of security-mature companies (84%) holding a cyber-insurance policy and another 9% in the process of obtaining one, according to a recent survey of 400 security decision makers by insursec firm At-Bay and analyst firm Omdia, a sister company to Dark Reading. Overall, 72% of all companies consider cyber insurance to be critical or important to their organization, the survey found.

Three (or Five) Defenses Every Company Needs

More than 60% of insurance claims involve a ransomware incident, while email-based fraud accounts for another 20% of claims, according to At-Bay. Because most successful attacks exploit weak or misconfigured remote-access points or compromise a user's system via email, improving security on these two vectors is paramount, says Roman Itskovich, chief risk officer and co-founder at At-Bay.

The insurer charges less to customers who use email systems with better security, such as Google Workspace, and more for on-premises email systems, because Google users have filed fewer claims. The insursec firm also found that companies who use self-managed virtual private networks are 3.7 times more likely to file a ransomware claim.

"We take VPNs very seriously in how we price [our policies] and what recommendations we give to our companies … and that is largely related to ransomware," says Itskovich.

For these reasons, businesses should take a hard look at their VPN security and email security if they want to better secure their environments and, by extension, reduce their policy costs. Because an attacker will eventually find a way to compromise most companies, having a way to detect and respond to threats is vitally important, making managed detection and response (MDR) another technology that can eventually pay for itself, he says.

"How do you catch someone who has just made the beachhead before they access your database, or before they get to your accounting system?" Itskovich says. "For that, we find that EDRs are very, very effective; more specifically, EDRs that are managed."

Backup, But Verify

For smaller companies, email security, cybersecurity-awareness training, and multifactor authentication are critical, says Matthieu Chan Tsin, vice president of cybersecurity services for Cowbell. In addition, secure data storage can help get a company back up and running quickly, minimizing the business impact of a ransomware attack, he says.

"We look at encryption and how we help our policyholders better store the data," Tsin says. "Having good backups, having some cloud backups, some in-house backups [is critical], because that is really the one thing that can get them back to business as quickly as possible."

Companies with robust backups are about 2.4 times less likely to have to pay a ransom, according to Corvus Insurance. The cyber insurer recommends a "3-2-1 policy," where the business makes three different backups to at least two different types of media, with at least one backup stored offsite. The company found that policyholders with robust backup strategies claimed 72% lower damages than businesses that did not maintain robust backups, according to its Q2 2024 Cyber Threat Report.

The strategy is effective enough that attackers have moved to double-extortion tactics, where they not only encrypt data to make it unusable but also steal the data to extort the business. In 2024, nearly all ransomware incidents (93%) involved data theft, a sharp increase from 2022, when fewer than half of incidents involved data theft.

"Backups can have a pretty meaningful impact as a sort of last line of defense if you're getting attacked via ransomware," Corvus' Rebholz says.

The Dark Horse: Disruption Risk From Third Parties

Attackers also appear to be focused on compromising aggregators, third-party companies that have some sort of privileged access to multiple other companies: firms such as network-monitoring service SolarWinds, healthcare billing provider Change Healthcare, and auto dealership services firm CDK Global. In the second quarter of 2024, third-party breach events accounted for about 40% of all claims processed, up from 20% in the last quarter of 2023, according to Corvus.

"We call out IT services as one of the industries that are getting hit, and that is one of those reasons; it's just kind of a one-to-many [relationship], right?" Corvus's Rebholz says. "What we can see from this year, specifically the first half of the year, is that there are some big names out there that were third parties that got hit, and we can see a subsequent increase in frequency because of that."

Major destructive attacks, such as WannaCry and SolarWinds, can lead to significant costs for cyber insurers, and in some ways they are analogous to natural catastrophes. However, determining the right risk scores for such events is harder, because the causes, and the likelihood of occurrence, are far from simple, says At-Bay's Itskovich.

"[SolarWinds] was a threat actor delivering malicious software through the update mechanism; CrowdStrike was a software error in the update; CDK Global was a ransomware attack on the company; WannaCry was a widespread vulnerability," he says. "If you [think about] natural catastrophes, you deal with hurricanes and earthquakes and maybe a couple of other secondary perils; it's much simpler."



Demystifying AI in the Water Industry | by Davar Ardalan | Aug, 2024


Participants and organizers of the TriCon AI Workshop: (L-R) Travis Wagner (Trinnex), Alana Gildner (BV), Yudu (Sonia) Wu (WSP), Madeleine Driscoll (Hazen and Sawyer), Craig Daley (City of Baltimore), John Smith (Haley Ward), Brian Ball (VA Engineering), David Gisborn (DC Water), and Davar Ardalan (TulipAI). Brandon O'Daniel of Xylem, one of the speakers, was not present in the photograph.

Water industry professionals explored the intersection of artificial intelligence (AI) and machine learning (ML) during a pre-conference workshop in Ocean City, Maryland yesterday, discovering that while AI's roots go back to 1948, today's generative AI has the potential to completely upend their industry.

Designed to make AI technologies accessible and relevant, the sessions emphasized the critical role of data and the importance of data governance, sparking excitement and curiosity among participants, all leading up to the Chesapeake Tri-Association Conference (TriCon), the water industry's premier event.

Craig Daly, Chief of the Water Facilities Division, City of Baltimore DPW, on the fundamentals of AI and ML

Professionals from the City of Rockville, WSSC, City of Baltimore, DC Water, and regional engineering firms gathered to explore how AI can be effectively applied to their field. Presented by the CWEA and CSAWWA Asset Management Committee, the session featured Craig Daly from the City of Baltimore, Travis Wagner of Trinnex, Brandon O'Daniel of Xylem, John Smith of Haley Ward, and Davar Ardalan of TulipAI.

The workshop focused on practical, actionable steps, showing participants how these tools can improve accuracy, save time, and optimize their water systems. Breakout sessions also introduced participants to real-world applications of generative AI tools.

Travis Wagner, Vice President at Trinnex, presented on the economics of AI in treatment processes

John Smith of Haley Ward and Davar Ardalan of TulipAI led a special segment titled "Responsible AI Adventures: Innovating Environmental Engineering," which highlighted the importance of ethical considerations when using AI in the water industry.

Smith and Ardalan introduced the beta version of John Smith GPT, an AI assistant designed to help John and his team of environmental engineers with tasks like proposal writing, cost estimating, and marketing strategies. They emphasized two critical points: first, never share proprietary information with an open AI tool; and second, always be transparent about using AI, just as you would be with a bibliography or by naming your sources. This transparency is essential for maintaining trust and integrity in how AI is integrated into professional practice.

Try the beta version of John Smith GPT here. The custom AI:

  • Leverages decades of civil engineering knowledge from veteran civil engineer John Oliver Smith.
  • Provides information on materials relevant to grant projects, enhancing proposal detail.
  • Offers knowledge of eco-friendly materials and methods, supporting sustainability goals.

John Smith also underscored the importance of not sharing confidential information with AI systems. He likened AI to a powerful tool that, like any other, must be used responsibly. He also encouraged attendees to pilot AI tools with their teams before full-scale implementation, allowing for collaborative input and refinement of the technology to suit specific needs. Their message was clear: AI can transform the industry, but it must be used thoughtfully and with full awareness of its ethical implications.

As AI tools continue to evolve, sessions like this one at TriCon are essential for staying informed and prepared. They equip water professionals with the tools and understanding they need to harness new technologies effectively and responsibly.

This content was crafted with the assistance of artificial intelligence, which contributed to structuring the narrative, ensuring grammatical accuracy, summarizing key points, and enhancing the readability and coherence of the material.
