Tag Archives: blocking-ddos

DD4BC Shifts Focus to Businesses, Continues DDoS Attack

Cybercriminals and extortionists demanding ransom in Bitcoin are on the rise. Because of the ease of transfer and the pseudonymity associated with Bitcoin transactions, it has become their currency of choice. We have been hearing about ransomware, hacking incidents in which sensitive data is stolen from computers, and even extortion backed by threats of physical harm; the common factor in all these cases is the ransom, to be paid in Bitcoin.

One such cybercriminal group, DD4BC, has made a regular habit of launching Distributed Denial of Service (DDoS) attacks on websites belonging to Scandinavian companies. After launching an initial DDoS attack, the group blackmails these companies into paying about 40 bitcoins to avoid further attacks on their IT infrastructure. In most cases, the group sends emails to the targeted firm within hours of launching the first DDoS attack. These emails, demanding ransom in bitcoins, also promise the victims that it is a one-time demand and that if they pay, DD4BC will not attack them again. DD4BC even claims in the mail that, although they do bad things, they keep their word.

It is surprising that a group which until now had been targeting European banks and financial institutions has suddenly shifted its focus to businesses in Scandinavia. Recently, DD4BC allegedly tried to extort money from the Bitalo Bitcoin exchange – 1 BTC in exchange for information on how to prevent the DDoS attacks. The plan seemed to backfire when the exchange's CEO, Martin Albert, announced a bounty of 100 BTC for information about the person or people behind DD4BC. The list of Bitcoin sites targeted by DD4BC also includes CEX.io and the Bitcoin sportsbook Nitrogen Sports.

Separately, an Australian company was recently hacked by unidentified perpetrators who allegedly stole sensitive data and demanded a ransom. They also threatened to harm family members of one of the company's top officials.

Source: http://www.livebitcoinnews.com/dd4bc-shifts-focus-to-businesses-continues-ddos-attack/

South Africa a target for DDoS

South Africa is the most targeted country in Africa when it comes to distributed denial-of-service (DDoS) attacks. This was revealed by Vernon Fryer, chief technology security officer at Vodacom, in a keynote address at ITWeb Security Summit 2015 in Midrand this morning.

In computing, a DDoS attack is an attempt to make a machine or network resource unavailable to its intended users. Such an attack generally consists of efforts to temporarily or indefinitely interrupt or suspend the services of a host connected to the Internet.

Fryer was speaking with reference to statistics from the Vodacom Cyber Intelligence Centre, which the company established eight years ago to analyse the threat landscape on the African continent. He revealed that over the past 18 months there has been a marked increase in DDoS attacks on the continent, with a typical attack averaging 9Gbps. "There has been about a 150% increase in the number of DDoS [attacks] in the last 18 months in Africa," he said. In terms of the number of attacks, Kenya, Uganda, Algeria, Nigeria and Tanzania respectively come after SA, said Fryer, pointing to analysis done by the Vodacom Cyber Intelligence Centre last Thursday.

According to Fryer, the majority of inbound attack traffic to SA emanated from China, Germany, Brazil, Vietnam, Russia, Cyprus, Turkey, Switzerland, Canada and the US. However, he noted, it was surprising to see Switzerland and Canada on the list this year, something never witnessed previously. Another unexpected trend showed traffic coming from Swaziland, he added, pointing out that the growing number of Chinese communities in that country could be a reason for the spike.

Describing some of the attack vectors cybercriminals were making use of in the region, Fryer pointed to scareware, ransomware and fake anti-virus, as well as the TDSS rootkit, among others. The trending malware included the KINS Trojan, Skypot, VirRansom, the SpyEye Trojan and the Chameleon botnet. With regard to ransomware attacks in Africa, Tanzania is the most attacked country on the continent, Fryer said. He also noted the trending hacker groups in Africa include Anonymous, the Lizard Squad, the Syrian Electronic Army and the Yemen Cyber Army.

Faced with the rise in the level and sophistication of attacks, Fryer said organisations need to constantly monitor the behaviour of their firewalls. Typically, he said, organisations go about five years without monitoring their firewalls. "We need to understand if our firewalls are capable of handling today's threats. Thus, the performance of firewalls needs to be constantly monitored," he concluded.

Source: http://www.itweb.co.za/index.php?option=com_content&view=article&id=143446:SA-a-target-for-DDOS&catid=234

Community college targeted by ongoing DDoS attack

Walla Walla Community College is under cyberattack this week by what are believed to be foreign computers that have jammed the college's Internet systems. Bill Storms, technology director, described it as akin to having too many cars on a freeway, causing delays and disruption for those wanting to connect to the college's website.

The type of attack is a distributed denial of service, or DDoS. Such attacks are often the work of hundreds or even thousands of computers outside the U.S. that are programmed with viruses to continually connect to and overload targeted servers. Storms said bandwidth monitors noticed the first spike of attacks on Sunday. To stop the attacks, college officials have had to periodically shut down the Web connection while providing alternative working Internet links to students and staff. The fix, so far, has only been temporary, as the problem often returns the next day. "We think we have it under control in the afternoon. And we have a quiet period," Storms said. "And then around 9 a.m. it all comes in again."

Walla Walla Community College may not be the only victim of the DDoS attack. Storms said he was informed that as many as 39 other state agencies have been the target of similar DDoS attacks. As for the reason for the attack, none was given to college officials. Storms noted campus operators did receive a number of unusual phone calls in which the callers said they were in control of the Internet, but no demands were made. "Some bizarre phone calls came in, and I don't know whether to take them serious or not," Storms said.

State officials have been contacted and are aiding the college with the problem. Storms said they have no idea how long the DDoS attack will last.

Source: http://union-bulletin.com/news/2015/apr/30/community-college-targeted-ongoing-cyberattack/

Featured article: How to use a CDN properly and make your website faster

It's one of the biggest mysteries I have seen in my 15+ years of Internet hosting and cloud-based services: why do people use a Content Delivery Network for their website yet never fully optimize their site to take advantage of the speed and volume capabilities of the CDN? Just because you use a CDN doesn't mean your site is automatically faster or able to dish out mass amounts of content in the blink of an eye. At DOSarrest I have seen the same mystery continue, which is why I have put together this piece on using a CDN, in the hope of helping those who wish to take full advantage of one. Most of this information is general and can be applied to any CDN, but I'll also throw in some specifics that relate to DOSarrest.

Some common misconceptions about using a CDN:

1. As soon as I'm configured to use a CDN, my site will be faster and able to handle a large number of web visitors on demand.
2. Website developers create websites that are already optimized, and a CDN won't really change much.
3. There's really nothing I can do to make my website run faster once it's on a CDN.
4. All CDNs are pretty much the same.

Here's what I have to say about the misconceptions noted above:

1. In most cases the answer to this is... NO! If the CDN is not caching your content, your site won't be faster; in fact it will probably be a little slower, as every request will have to go from the visitor to the CDN, which will in turn fetch it from your server and then turn around and send the response back to the visitor.
2. In my opinion and experience, website developers in general do not optimize websites to use a CDN. In fact, most websites don't even take full advantage of a browser's caching capability. As the Internet has become ubiquitously faster, this fine art has been left by the wayside. Another reason, I think, is that websites are huge and complex, a lot of content is dynamically generated, and servers are very fast with large amounts of memory. Why spend time optimizing caching when a fast server will overcome the overhead?
3. Oh yes you can, and that's why I have written this piece... see below.
4. No, they aren't. Many CDNs don't want you to know how things are really working from every node that is broadcasting your content. You have to subscribe to a third-party monitoring service; if you have to get one, do it. It can be fairly expensive but well worth it. How else will you know how your site is performing from other geographic regions?

A good CDN should let you know the following in real time, but many don't:

a. Number of connections/requests between the CDN and visitors.
b. Number of connections/requests between the CDN and your server (origin). You want the number of requests to your origin to be lower than the number of requests from the CDN to your visitors. *Tip: Use HTTP 1.1 on both "a" and "b" above and try to extend the keep-alive time on the origin-to-CDN side.
c. Bandwidth between the CDN and Internet visitors.
d. Bandwidth between the CDN and your server (origin). *Tip: If the bandwidth of "c" and "d" is about the same, news flash... you can make things better.
e. Cache status of your content (how many requests are being served by the CDN). *Tip: This is the best metric for really knowing whether you are using your CDN properly.
f. Performance metrics from outside of the CDN, in the same geographic regions as its nodes. *Tip: Once you have performance metrics from several different geographic regions you can compare the differences once you are on a CDN; if you're caching properly, your site should load noticeably faster than before, especially in regions far from your origin server.

For the record, DOSarrest provides all of the above in real time, and it's these tools I'll use to explain how to take full advantage of any CDN; without metrics there's no scientific way to know you're on the right track to making your site super fast.

There are five main groups of cache-control tags that will affect how and what is cached:

Expires: When attempting to retrieve a resource, a browser will usually check whether it already has a copy available for reuse. If the expires date has passed, the browser will download the resource again.

Cache-Control: Introduced in HTTP 1.1, this expands on the functionality offered by Expires. There are several options available for the Cache-Control header:
– public: This resource is cacheable. In the absence of any contradicting directive this is assumed.
– private: This resource is cacheable by the end user only. All intermediate caching devices will treat this resource as no-cache.
– no-cache: Do not cache this resource.
– no-store: Do not cache, do not store the request; I was never here – we never spoke. Capiche?
– must-revalidate: Do not use stale copies of this resource.
– proxy-revalidate: The end user may use stale copies, but intermediate caches must revalidate.
– max-age: The length of time (in seconds) before a resource is considered stale.
A response may include any combination of these directives, for example: private, max-age=3600, must-revalidate.

X-Accel-Expires: This functions just like the Expires header, but is only intended for proxy services. It is intended to be ignored by browsers, and when the response traverses a proxy this header should be stripped out.

Set-Cookie: While not explicitly a cache directive, cookies are generally designed to hold user- and/or session-specific information. Caching such resources would have a negative impact on the desired site functionality.

Vary: Lists the headers that should determine distinct copies of the resource. A cache will need to keep a separate copy of the resource for each distinct set of values in the headers indicated by Vary. A Vary response of "*" indicates that each request is unique.

Given that most websites, in my opinion, are not fully taking advantage of caching by a browser or a CDN, there is still a way around this without reviewing and adjusting every cache-control header on your website: any CDN worth its cost, as well as any cloud-based DDoS protection service, should be able to override most website cache-control headers.
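To make the header discussion concrete, here is a minimal sketch (not from the original article) that fetches a URL with Python's requests library and prints whichever cache-related response headers it finds. The URL is a placeholder and requests must be installed separately.

```python
# Minimal sketch: inspect the cache-related response headers of a resource.
# Assumes the third-party "requests" library is installed (pip install requests).
import requests

CACHE_HEADERS = [
    "Cache-Control", "Expires", "Age", "Vary",
    "X-Accel-Expires", "Set-Cookie", "ETag", "Last-Modified",
]

def show_cache_headers(url: str) -> None:
    """Fetch a URL and print the headers that control how it may be cached."""
    response = requests.get(url, timeout=10)
    print(f"{url} -> HTTP {response.status_code}")
    for name in CACHE_HEADERS:
        if name in response.headers:
            print(f"  {name}: {response.headers[name]}")

if __name__ == "__main__":
    # Hypothetical URL; substitute a static resource from your own site.
    show_cache_headers("https://www.example.com/static/logo.png")
```

Running this against a handful of your own pages and static assets quickly shows whether they carry directives a CDN can actually cache on, or whether everything is marked private/no-cache.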
For demonstration purposes we used our own live website, DOSarrest.com, and ran a traffic generator to stress the server a little alongside our regular visitor traffic. The demonstration shows what goes on when traffic passes through a CDN, both between the CDN and the Internet visitor and between the CDN and the customer's origin server on the back end. At approximately 16:30 we enabled a feature on DOSarrest's service that we call "Forced Caching". What this does is override – in other words, ignore – some of the origin server's cache-control headers.

These are the results. Notice that bandwidth between the CDN and the origin (second graph) has fallen by over 90%; this saves resources on the origin server and makes things faster for the visitor. This is the best graphic illustration to let you know that you're on the right track: cache hits go way up, not-cached requests go down, and expired responses and misses are negligible. The graph below shows that requests to the origin have dropped by 90%, telling you the CDN is doing the heavy lifting. Last but not least, the fruit of your labor can be seen by eight sensors in four geographic regions from our customer "DEMS" portal: the site is running 10 times faster in every location, even under load!
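If you want to sanity-check this kind of result from the outside without a monitoring portal, a rough client-side probe can repeat a request and watch the response time together with common cache-status headers. This is an illustrative sketch under the assumption that your CDN exposes a header such as X-Cache, CF-Cache-Status or Age; the header names and URL are placeholders, not something prescribed by DOSarrest.

```python
# Minimal sketch: estimate whether repeated requests for the same resource are
# served from a CDN's cache, using response time and common cache-status headers.
# Header names such as X-Cache vary by provider, so treat them as hints only.
import time
import requests

def probe(url: str, attempts: int = 5) -> None:
    for i in range(attempts):
        start = time.perf_counter()
        response = requests.get(url, timeout=10)
        elapsed_ms = (time.perf_counter() - start) * 1000
        status = response.headers.get("X-Cache") or response.headers.get("CF-Cache-Status")
        age = response.headers.get("Age", "-")
        print(f"request {i + 1}: {elapsed_ms:6.1f} ms  cache={status or 'unknown'}  Age={age}")

if __name__ == "__main__":
    # Hypothetical URL; the first request often misses, later ones should hit
    # (and return faster) if the resource is cacheable and the CDN honours it.
    probe("https://www.example.com/static/logo.png")
```

On a properly cached resource the first request is often a miss, and subsequent requests should come back faster with a growing Age value.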

Banks Lose Up to $100K/Hour to Shorter, More Intense DDoS Attacks

Distributed denial of service attacks have morphed from a nuisance into something more sinister. In a DDoS attack, heavy volumes of traffic are hurled at a website to halt normal activity or inflict damage, typically freezing up the site for several hours. Such exploits achieved notoriety in the fall of 2012 when large banks were hit by a cyberterrorist group. But the Operation Ababil attacks were simply meant to stop banks' websites from functioning. They caused a great deal of consternation among bank customers and the press, but little serious harm.

Since then, the attacks have become more nuanced and targeted, several recent reports show. "DDoS is a growing problem, the types of attack are getting more sophisticated, and the market is attracting new entrants," said Rik Turner, a senior analyst at Ovum, a research and consulting firm. For example, "we're seeing lots of small attacks with intervals that allow the attackers to determine how efficient the victims' mitigation infrastructure is and how quickly it kicks in," he said. This goes for banks as much as for nonbanking entities.

Verisign's report on DDoS attacks carried out in the fourth quarter of 2014 found that the number of attacks against the financial industry doubled, accounting for 15% of all offensives. DDoS activity historically increases during the holiday season each year. "Cybercriminals typically target financial institutions during the fourth quarter because it's a peak revenue and customer interaction season," said Ramakant Pandrangi, vice president of technology at Verisign. "As hackers have become more aware of this, we anticipate the financial industry will continue to see an increase in DDoS activity during the holiday season year over year."

In a related trend, bank victims are getting hit repeatedly. "If you have an organization that's getting hit multiple times, often that's an indicator of a very targeted attack," said Margee Abrams, director of security services at Neustar, an information services company. According to a report Neustar commissioned and released this week, 43% of bank targets in the financial services industry were hit more than six times during 2014. Neustar worked with a survey sampling company that gathered responses from 510 IT directors in the financial services, retail and IT services sectors, with strong representation in financial services. (The respondents are not Neustar customers.)

The average bandwidth consumed by a DDoS attack increased to 7.39 gigabits per second, according to Verisign's analysis of DDoS attacks in the fourth quarter of 2014. This is a 245% increase from the last quarter of 2013, and it's more than the incoming bandwidth most small and medium-sized businesses, such as community banks, can provision. At the same time, DDoS attacks are getting shorter, as banks have become relatively adept at handling them. Most banks (88%) detect attacks in less than two hours (versus 77% for companies in general), according to Neustar's research, and 72% of banks respond to attacks in that timeframe.

Some recent DDoS attacks on banks have been politically motivated. Last year, a hacker group called the European Cyber Army claimed responsibility for DDoS attacks against websites run by Bank of America, JPMorgan Chase and Fidelity Bank. Little is known about the group, but it has aligned itself with Anonymous on some attacks and seems interested in undermining U.S. institutions, including the court system as well as large banks.
But while attacks from nation-states and hacktivists tend to grab headlines, it's the stealthy, unannounced DDoS attacks, such as those against Web applications, that are more likely to gum up the works for bank websites for short periods, and they are in fact more numerous, Turner noted. They're meant to test the strength of defenses or to distract the target from another type of attack. For example, a DDoS attack may be used as a smokescreen for online banking fraud or some other type of financially motivated fraud. In Neustar's study, 30% of U.S. financial services industry respondents said they suffered malware or virus installation and theft as a result of a DDoS attack.

"What I hear from our clients is that DDoS is sometimes used as a method to divert security staff so that financial fraud can get through," said Avivah Litan, vice president at Gartner. "But these occurrences seem to be infrequent." Her colleague Lawrence Orans, a research vice president for network security at Gartner, sounded skeptical about the frequency of DDoS-as-decoy schemes. "I think there is some fear-mongering associated with linking DDoS attacks with bank fraud," he said. However, "the FBI has issued warnings about this in the past, so there is some validity to the issue of attackers using DDoS attacks as a smokescreen to distract a bank's security team while the attacker executes fraudulent transactions."

According to Verisign's iDefense team, DDoS cybercriminals are also stepping up their attacks on point-of-sale systems and ATMs. "We believe this trend will continue throughout 2015 for financial institutions," Pandrangi said. "Additionally, using an outdated operating system invites malware developers and other cybercriminals to exploit an organization's networks. What's worse is that thousands of ATMs owned by the financial sector in the U.S. are running on the outdated Windows XP operating system, making them vulnerable to being compromised."

Six-Figure Price Tag

DDoS attacks are unwelcome at any cost. Neustar's study puts a price tag on the harm banks suffer during such attacks: $100,000 an hour for most banks that were able to quantify it. More than a third of the financial services firms surveyed reported costs higher than that. "Those losses represent what companies stand to lose during peak hours of transactions on their websites," said Abrams. "That doesn't even begin to cover the losses in terms of expenses going out. For example, many attacks require six to ten professionals to mitigate the attack once it's under way. That's a lot of salaries going out that also represent losses for the company." Survey respondents also complained about the damage to their brand and customer trust during and after DDoS attacks. "That gets more difficult to quantify in terms of losses to an overall brand, but it's a significant concern," Abrams said.

To some, the $100,000 figure seems high. "Banks have other channels for their customers — mainly branch, ATM and phone — so I don't see that much revenue being lost," said Litan. Other recent studies have also attempted to quantify the cost of a DDoS attack. A study commissioned by Incapsula surveyed IT managers from 270 North American organizations and found that the average cost of an attack was $40,000 an hour; 15% of respondents put the cost at under $5,000 an hour, while 15% said it was more than $100,000. There's no question banks have had to spend millions in aggregate to mitigate DDoS risks.
“They created more headroom by buying more bandwidth and by scaling the capacity of their web infrastructure — for example, by buying more powerful web servers,” said Orans. “And they continue to spend millions on DDoS mitigation services. That’s where the real pain has been — the attackers forced the banks to spend a lot of money on DDoS mitigation.” Source: http://www.americanbanker.com/news/bank-technology/banks-lose-up-to-100khour-to-shorter-more-intense-ddos-attacks-1073966-1.html?zkPrintable=1&nopagination=1

The rise and rise of bad bots – little DDoS

Many will be familiar with the term bot, short for web-robot. Bots are essential for the effective operation of the web: web crawlers are a type of bot, automatically trawling sites looking for updates and making sure search engines know about new content. To this end, website owners need to allow access to bots, but they can (and should) lay down rules. The standard here is to have a file associated with any web server, called robots.txt, that the owners of good bots should read and adhere to. However, not all bots are good; bad bots can simply ignore the rules! (A minimal sketch of how a well-behaved bot consults robots.txt appears at the end of this post.)

Most will also have heard of botnets: arrays of compromised user devices and/or servers that have illicit background tasks running to send spam or generate high volumes of traffic that can bring web servers to their knees through DDoS (distributed denial of service) attacks. A Quocirca research report, Online Domain Maturity, published in 2014 and sponsored by Neustar (a provider of DDoS mitigation and website protection/performance services), shows that the majority of organisations say they have either permanent or emergency DDoS protection in place, especially if they rely on websites to interact with consumers. However, Neustar's own March 2015 EMEA DDoS Attacks and Protection Report shows that in many cases organisations are still relying on intrusion prevention systems (IPS) or firewalls rather than custom DDoS protection. The report, which is based on interviews with 250 IT managers, shows that 7-10% of organisations believe they are being attacked at least once a week.

Other research suggests the situation may actually be much worse than this, but IT managers are simply not aware of it. Corero (another DDoS protection vendor) shows in its Q4 2014 DDoS Trends and Analysis report, which uses actual data on observed attacks, that 73% last less than five minutes. Corero says these are specifically designed to be short-lived and go unnoticed. This is a fine-tuning of the so-called distraction attack. Arbor (yet another DDoS protection vendor) finds distraction to be the motivation for about 19-20% of attacks in its 2014 Worldwide Infrastructure Security Report. However, as with Neustar, this is based on what IT managers know, not what they do not know.

The low-level, sub-saturation DDoS attacks reported by Corero are designed to go unnoticed but disrupt IPS and firewalls for just long enough to perpetrate a more insidious targeted attack before anything has been noticed. Typically it takes an IT security team many minutes to observe and respond to a DDoS attack, especially if they are relying on an IPS. That might sound fast, but in network time it is eons; attackers can easily insert their actual attack during the short minutes of the distraction. So there is plenty of reason to put DDoS protection in place (other vendors include Akamai/Prolexic, Radware and DOSarrest).

However, that is not the end of the bot story. Cybercriminals are increasingly using bots to perpetrate a whole other series of attacks. That story starts with another, sometimes legitimate and positive, activity of bots – web scraping; it is the subject of a follow-on blog, The rise and rise of bad bots – part 2 – beyond web scraping.

Source: http://www.computerweekly.com/blogs/quocirca-insights/2015/04/the-rise-and-rise-of-bad-bots.html
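For readers unfamiliar with how a "good" bot actually consults robots.txt, here is a minimal sketch using Python's standard urllib.robotparser module. The site URL, path and user-agent string are placeholders; a bad bot is simply one that skips this check.

```python
# Minimal sketch of what a well-behaved bot does: read robots.txt and respect it.
# Bad bots simply skip this step. URLs and the user-agent string are placeholders.
from urllib.robotparser import RobotFileParser

USER_AGENT = "ExampleCrawler/1.0"

def allowed_to_fetch(site: str, path: str) -> bool:
    """Return True if robots.txt permits USER_AGENT to fetch the given path."""
    parser = RobotFileParser()
    parser.set_url(f"{site}/robots.txt")
    parser.read()                      # downloads and parses robots.txt
    return parser.can_fetch(USER_AGENT, f"{site}{path}")

if __name__ == "__main__":
    if allowed_to_fetch("https://www.example.com", "/private/reports"):
        print("robots.txt allows crawling this path")
    else:
        print("robots.txt disallows crawling this path; a good bot stops here")
```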

DDoS attack temporarily blocks seattletimes.com

A denial-of-service attack, in which perpetrators flood a targeted website with requests that overwhelm the site's servers, is believed to have caused Monday morning's outage.

A cyberattack took down The Seattle Times website for about 90 minutes Monday morning. Seattletimes.com was unavailable from about 8 a.m. to 9:30 a.m. as a result of a denial-of-service attack, company spokeswoman Jill Mackie said. "The Seattle Times website experienced technical problems Monday morning due to an external attack that appears to have targeted other sites," Mackie said in a statement. "We continue to monitor the situation and apologize for any inconvenience this caused readers."

Denial-of-service attacks are designed to flood a website with requests, essentially overwhelming the site's servers and preventing it from responding to other users. The result is a site that grinds to a halt or runs so slowly that it becomes unusable. Such attacks on their own aren't designed to damage a target's computer systems or steal files. The attacks, a fixture of Internet security threats for decades, have been blamed on culprits ranging from political operatives to young, tech-savvy hackers connected by social media. The ease with which such attacks could be orchestrated was illustrated in 2000, when a 15-year-old Canadian boy working under the alias "Mafiaboy" was able to temporarily bring down the websites of Yahoo, CNN and Amazon.com, among others.

Mackie said The Seattle Times' information technology staff believes Monday's attack on the website was carried out by a cyberattack group that calls itself Vikingdom2015. The group is said to have targeted several government and media websites, including those of the Indiana state government and the Bangor (Maine) Daily News, with denial-of-service attacks. IBM security researchers said the group was formed from former members of the Team Cyber Poison hacker group and began attacking websites this month.

Source: http://www.seattletimes.com/business/technology/cyberattack-temporarily-blocks-seattletimescom/

GitHub Still Battling DDoS Attack

San Francisco-based GitHub was hit by a distributed denial of service attack Wednesday. Scripts tied to traffic from Beijing-based search giant Baidu directed a flood of requests at a GitHub page operated by GreatFire and a page hosting Chinese translations of The New York Times. As is the aim of DDoS attacks, GitHub's availability was knocked out by the resulting traffic.

In morning tweets during the attack, GitHub informed followers that the attack was still ongoing and getting worse, but that the company was on top of dealing with it. As of two hours ago, GitHub said it was still working to mitigate the DDoS attack.

Meanwhile, Baidu has said that it did not intentionally have anything to do with the attack. The Chinese search engine also says it is working with security specialists to find out what happened, and it made certain to state that its own security had not been compromised during the attack on GitHub.

Speculation in tech and security circles is that the attack was a means of strengthening China's web censorship by taking out sites that could allow users to get around it. Baidu was simply used as a means of amplifying the attack, due to how sizable it is and the amount of traffic it can produce.

Source: http://kabirnews.com/github-still-battling-ddos-attack/8495/

DDoS attacks enabled via vulnerable Google Maps plugin

An industry warning has been issued to businesses and Software-as-a-Service providers advising that attackers are currently exploiting a vulnerable Google Maps plugin installed on Joomla servers to launch distributed denial of service (DDoS) attacks.

"Vulnerabilities in web applications hosted by Software-as-a-Service providers continue to provide ammunition for criminal entrepreneurs. Now they are preying on a vulnerable Joomla plugin for which they've invented a new DDoS attack and DDoS-for-hire tools," said Stuart Scholly, senior vice president and general manager of the Security Business Unit at Akamai Technologies. "This is one more web application vulnerability in a sea of vulnerabilities."

The vulnerability in the Google Maps plugin for Joomla allows the platform to act as a proxy: attackers push forged requests through it so that the responses are reflected onto a targeted victim in the form of a DDoS attack. The true source of the attack remains hidden, as the attack traffic appears to come from the Joomla servers. Figures released in February 2014 showed that Joomla, the second most frequently used online content management system after WordPress, had been downloaded over 50 million times.

Working with PhishLabs R.A.I.D., Akamai's Prolexic Security Engineering and Research Team (PLXsert) was able to match the DDoS signature traffic coming from a number of Joomla sites, suggesting that the vulnerable plugins are currently being used to execute a large number of reflected GET flood DDoS attacks. The research also found that the attack vector is being advertised on popular DDoS-for-hire websites. PLXsert identified over 15,000 suspected Joomla reflectors online. Despite many of the vulnerable plugins having been patched, removed or reconfigured, many servers remain open to attack.

Reflection techniques are extremely common in DDoS attacks, with 39% of all DDoS traffic employing reflection to bounce attack traffic off third-party servers and hide the attackers' identity.

Source: http://thestack.com/ddos-attacks-vulnerable-google-maps-plugin-020315
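The mechanism above relies on the plugin fetching whatever URL an attacker supplies, so the flood appears to come from the Joomla server itself. One rough way for a site owner to check whether their own server is being abused as a reflector is to scan access logs for requests whose query strings embed full external URLs. The sketch below is an assumption-laden illustration (it presumes an Apache/nginx-style combined access log at a placeholder path), not tooling described by Akamai or PLXsert.

```python
# Minimal sketch: flag access-log entries whose query string embeds a full
# external URL, a pattern consistent with proxy/reflection abuse of a plugin.
# Assumes a combined-format access log; the log path is a placeholder.
import re
from collections import Counter

REQUEST_RE = re.compile(r'"(?:GET|POST) (\S+) HTTP/[\d.]+"')
EMBEDDED_URL_RE = re.compile(r'[?&][^=]+=https?(?::|%3A)', re.IGNORECASE)

def suspicious_requests(log_path: str, top: int = 10) -> None:
    hits = Counter()
    with open(log_path, encoding="utf-8", errors="replace") as log:
        for line in log:
            match = REQUEST_RE.search(line)
            if match and EMBEDDED_URL_RE.search(match.group(1)):
                hits[match.group(1).split("?")[0]] += 1
    for path, count in hits.most_common(top):
        print(f"{count:6d}  {path}")

if __name__ == "__main__":
    suspicious_requests("/var/log/apache2/access.log")  # hypothetical path
```

A large count against a single plugin endpoint would be a cue to patch, remove or lock down that component.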

Massive DDoS Brute-Force Campaign Targets Linux, Installs Rare Rootkit

A brute-force campaign looking to set up a distributed denial of service (DDoS) botnet using a rare piece of Linux rootkit malware has been launched, emanating from the servers of a Hong Kong-based company called Hee Thai Limited. The malware, known as XOR.DDoS, was first spotted in September by security research firm Malware Must Die, but security firm FireEye says that new variants have been making their way into the wild as recently as Jan. 20.

XOR.DDoS is installed on targeted systems via SSH (Secure Shell) brute-force attacks that target both servers and network devices. These are carried out using complex attack scripts that serve the malware through a sophisticated distribution scheme, allowing the attackers to compile and deliver tailored rootkits on demand and to infect x86 and mobile ARM systems alike. Once infected, the hosts are enlisted to launch DDoS attacks. "While typical DDoS bots are straightforward in operation and often programmed in a high-level script such as PHP or Perl, the XOR.DDoS family is programmed in C/C++ and incorporates multiple persistence mechanisms including a rare Linux rootkit," FireEye researchers noted in an analysis.

What's notable about the Hee Thai attack is the sheer scale of the operation. Within 24 hours of the first sighting back in November, FireEye had observed well over 20,000 SSH login attempts per server. By the end of January, each server had seen nearly 1 million login attempts. During this time period, traffic from 103.41.124.0/24 accounted for 63% of all observed port 22 traffic. "Someone with a lot of bandwidth and resources really wanted to get into our servers," FireEye researchers noted.

They also said that the campaign has been evolving. At the beginning, each IP address would attempt more than 20,000 passwords before moving on. It then dropped to attempting a few thousand passwords before cycling to the next IP, and repeat attacks also began to occur. Now, a new stage of the Hee Thai campaign is more chaotic than the previous two. "The attacks now occur en masse and at random, frequently with multiple IPs simultaneously targeting the same server," FireEye explained.

The Hee Thai campaign also features an on-demand malware build system. Using a sophisticated set of build systems, the operation harvests kernel headers and version strings from victims and delivers customized malware, compiled on demand, to install XOR.DDoS on the target machine. This strategy makes hash-signature-based detection systems ineffective for detecting XOR.DDoS.

"Brute force attacks are one of the oldest types of attacks," FireEye researchers said. "Due to its ubiquity, there are numerous solutions available for defending against them. However a great many systems are vulnerable. Even in enterprise settings, network devices and servers in forgotten branch offices could be exposed to this threat."

Source: http://www.infosecurity-magazine.com/news/massive-ddos-bruteforce-targets/
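The scale described here shows up directly in SSH authentication logs. As an illustration only (not part of the FireEye analysis), the following sketch counts failed SSH password attempts per source IP from a typical Linux auth log; the log path and message format are assumptions and vary by distribution.

```python
# Minimal sketch: count failed SSH password attempts per source IP from an
# auth log, to surface brute-force campaigns like the one described above.
# The log path and message format are assumptions (they vary by distribution).
import re
from collections import Counter

FAILED_RE = re.compile(r"Failed password for (?:invalid user )?\S+ from (\d+\.\d+\.\d+\.\d+)")

def top_offenders(log_path: str, top: int = 10) -> None:
    attempts = Counter()
    with open(log_path, encoding="utf-8", errors="replace") as log:
        for line in log:
            match = FAILED_RE.search(line)
            if match:
                attempts[match.group(1)] += 1
    for ip, count in attempts.most_common(top):
        print(f"{count:7d} failed attempts from {ip}")

if __name__ == "__main__":
    top_offenders("/var/log/auth.log")  # hypothetical path; e.g. /var/log/secure on RHEL
```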
