Tag Archives: dos attacks

DDoS attacks threaten New Zealand organisations

The New Zealand Internet Task Force (NZITF) advises that an unknown international group has this week begun threatening New Zealand organisations with Distributed Denial of Service (DDoS) attacks. DDoS attacks are attempts to make an organisation's Internet links or network unavailable to its users for an extended length of time. This latest threat arrives as an email threatening to take down an organisation's Internet links unless substantial payments in the digital currency Bitcoin are made.

NZITF Chair Barry Brailey warns the threat is not an idle one and should be taken extremely seriously, as the networks of some New Zealand organisations have already been targeted. "The networks of at least four New Zealand organisations that NZITF knows of have been affected so far. A number of Australian organisations have also been affected," he says. "This unknown group of criminals have been sending emails to a number of addresses within an organisation. Sometimes these are support or helpdesk addresses; other times they are directed at individuals."

The emails contain statements threatening DDoS, such as: "Your site is going under attack unless you pay 25 Bitcoin.", "We are aware that you probably don't have 25 BTC at the moment, so we are giving you 24 hours." or "IMPORTANT: You don't even have to reply. Just pay 25 BTC to [bitcoin address] – we will know it's you and you will never hear from us again." The emails may also provide links to news articles about other attacks the group has conducted.

NZITF urges New Zealand firms and organisations to be on the alert. It also suggests that targeted entities not pay: even if paying stops a current attack, it marks your organisation as a likely target for future extortion because you have a history of making payments. It is also advisable that staff be educated and on the lookout for any emails matching the descriptions above. Have them alert the appropriate security personnel within the organisation as soon as possible. Source: http://www.geekzone.co.nz/content.asp?contentid=18336
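As a rough illustration of the staff-education advice above, an inbound-mail check could flag messages matching the extortion wording NZITF describes. This is a hypothetical sketch, not an NZITF tool; the phrase list and function name are our own:

```python
import re

# Illustrative phrases drawn from the ransom emails quoted above.
RANSOM_PATTERNS = [
    r"\bgoing under attack\b",
    r"\bpay \d+ (?:btc|bitcoin)\b",
    r"\bwe will know it'?s you\b",
]

def looks_like_ddos_ransom(body: str) -> bool:
    """Return True if the message body matches any known extortion phrase."""
    text = body.lower()
    return any(re.search(pattern, text) for pattern in RANSOM_PATTERNS)
```

A real deployment would feed matches to security staff for review rather than act on them automatically, since keyword filters of this kind produce false positives.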

See the original post:
DDoS attacks threaten New Zealand organisations

MTN suffers a DDoS attack

Connectivity at MTN's Gallo Manor data centre has been fully restored after the Johannesburg site was hit by a distributed denial of service (DDoS) attack earlier this afternoon. MTN alerted clients just after 3pm today that it had suffered a DDoS attack, which resulted in packet loss and disruption to clients' cloud services. At the time, the company said MTN Business' network operations centre was working on resolving the problem to avoid any further attacks. This comes less than two days after a power outage at the same data centre caused a loss of connectivity.

MTN chief technology officer Eben Albertyn says that, while the DDoS attack today hampered the company's ability to provide connectivity services, engineers worked "fervently" to fully restore services and avert further attacks, and connectivity was restored soon after. "The interruption lasted only a few minutes and is completely unrelated to the outage experienced on Monday. MTN wishes to apologise profusely to its customers for any inconvenience caused."

On Sunday evening just after 6pm, MTN's Gallo Manor data centre went offline, causing major disruptions to clients' services, including Afrihost's. MTN attributed that incident to a power outage. The problem persisted until the next day, with services being restored around 11am on Monday.

Digital Attack Map defines a DDoS attack as: "An attempt to make an online service unavailable by overwhelming it with traffic from multiple sources." The live data site notes these attacks can target a wide variety of important resources, from banks to news websites, and present a major challenge to making sure people can publish and access important information. Source: http://www.itweb.co.za/index.php?option=com_content&view=article&id=142968:MTN-weathers-DDOS-attack

View original post here:
MTN suffers a DDoS attack

Staffordshire school hit by suspected DDoS attack

A secondary school near Burton-on-Trent in East Staffordshire has admitted that its website was knocked offline at various points by hackers in recent weeks. The attack appears to have been a denial-of-service (DoS) attack, with the hacktivist group Anonymous reportedly taking responsibility. Burton Mail reports that John Taylor High School's website suffered "significant periods of downtime during the past few weeks", adding that a member of Anonymous had contacted the newspaper directly to claim responsibility.

"It has come to our immediate attention that the security used for school systems is not up to scratch," the member said when interviewed. "This is putting pupils at risk of being targeted by paedophiles who have acquired the skill to access data which could lead them to being able to collect information and stalk pupils." The member continued: "We give every school in this country one month to fix their pitiful security systems. If, after that time, we can still achieve penetration at a reasonable level of attack we will personally disable their systems. We do not expect them to be able to stop us at an advanced level, however the level of hack used on the John Taylor systems highlighted a very serious problem."

Mike Donoghue, head teacher at John Taylor, which has around 1,500 students, confirmed that the school was still investigating the incident, and added that the systems are now fully functional. Speaking to SCMagazineUK.com earlier today, Donoghue drilled down into some of the details, confirming that the outage related specifically to The Vault, a virtual learning environment – developed by FROG but hosted on the school's servers – which is used to host teaching materials, former test papers and other revision guidance. The school, a specialist 'science and leadership academy', was first alerted to the downtime by students on Twitter who were trying to access the system, with IT technicians subsequently blocking all IP addresses so no one could access the service.

The second outage lasted a "couple of days" over the Easter Bank Holiday weekend. Donoghue was keen to stress that there was "no breach" of sensitive student data, and said that the school continues to work with providers RM and FROG to monitor the issue and harden its existing security controls. Students were informed of the issue during assembly, and parents have also been made aware. The head teacher said that the effect of the incident was "largely mitigated" because the downtime wasn't overly long, and most of the materials could still be accessed with "just a few more clicks on Google". He also doubted that Anonymous was behind the attack, noting that the outages stopped after students were alerted to the situation. Source: http://www.scmagazineuk.com/staffordshire-school-hit-by-suspected-denial-of-service-attack/article/412236/

Read this article:
Staffordshire school hit by suspected DDoS attack

Community college targeted in ongoing DDoS attack

Walla Walla Community College is under cyberattack this week by what are believed to be foreign computers that have jammed the college's Internet systems. Bill Storms, technology director, described it as akin to having too many cars on a freeway, causing delays and disruption for those wanting to connect to the college's website. The type of attack is a distributed denial of service, or DDoS. Such attacks are often the result of hundreds or even thousands of computers outside the U.S. that are programmed with viruses that continually connect to and overload targeted servers.

Storms said bandwidth monitors noticed the first spike of attacks on Sunday. To stop the attacks, college officials have had to periodically shut down the Web connection while providing alternative working Internet links to students and staff. The fix, so far, has only been temporary, as the problem often returns the next day. "We think we have it under control in the afternoon. And we have a quiet period," Storms said. "And then around 9 a.m. it all comes in again."

Walla Walla Community College may not be the only victim of the DDoS attack. Storms said he was informed that as many as 39 other state agencies have been the target of similar DDoS attacks. As for the reason for the attack, none was given to college officials. Storms noted campus operators did receive a number of unusual phone calls in which the callers said they were in control of the Internet, but no demands were made. "Some bizarre phone calls came in, and I don't know whether to take them serious or not," Storms said. State officials have been contacted and are aiding the college with the problem. Storms said they have no idea how long the DDoS attack will last. Source: http://union-bulletin.com/news/2015/apr/30/community-college-targeted-ongoing-cyberattack/
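The bandwidth monitors mentioned in the article typically work by comparing the latest traffic sample against a recent baseline. Here is a minimal sketch of that kind of check; the function name, thresholds, and sample units are hypothetical, not taken from the college's actual tooling:

```python
from statistics import mean, stdev

def is_traffic_spike(samples_mbps, latest_mbps, k=3.0):
    """Flag `latest_mbps` as anomalous when it exceeds the baseline
    window by more than k standard deviations."""
    if len(samples_mbps) < 2:
        return False  # not enough history to judge
    baseline = mean(samples_mbps)
    spread = stdev(samples_mbps)
    # max(...) guards against a zero spread when all samples are equal
    return latest_mbps > baseline + k * max(spread, 1e-9)
```

Real monitors also account for daily cycles (the "quiet period" Storms describes), so a production check would compare against the same hour on previous days rather than a flat window.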

Continued here:
Community college targeted in ongoing DDoS attack

FBI investigating Rutgers University in DDoS attack

The FBI is working with Rutgers University to identify the source of a series of distributed denial-of-service (DDoS) attacks that have plagued the school this week. The assault began Monday morning and took down internet service across the campus, according to NJ.com. Some professors had to cancel classes, and students were unable to enroll, submit assignments or take finals, since Wi-Fi service and email were affected, as was an online resource called Sakai. This is the second DDoS attack on the university this month and the third since November. Authorities and the Rutgers Office of Information and Technology (OIT) haven't released any details thus far about the possible source of the attacks. Currently, only certain parts of the university have internet service. The school will post frequent updates to the Rutgers website on its progress in restoring service. Source: http://www.scmagazine.com/the-fbi-is-helpign-rutger-inveigate-a-series-of-ddos-attack/article/412149/

See the original post:
FBI investigating Rutgers University in DDoS attack

One fifth of DDoS attacks last over a day

Some 20 per cent of DDoS attacks take a site down for 24 hours or more, according to research by Kaspersky. In fact, almost a tenth of the companies surveyed said their systems were down for several weeks or longer, while less than a third said the disruption lasted under an hour. The investigation revealed that the majority of attacks (65 per cent) caused severe delays or complete disruption, while only a third caused no disruption at all.

Evgeny Vigovsky, head of Kaspersky DDoS Protection, said: "For companies, losing a service completely for a short time, or suffering constant delays in accessing it over several days, can be equally serious problems. Both situations can impact customer satisfaction and their willingness to use the same service in the future. Using reliable security solutions to protect against DDoS attacks enables companies to give their customers uninterrupted access to online services, regardless of whether they are facing a powerful short-term assault or a weaker but persistent long-running campaign."

The company highlighted an attack on GitHub at the end of March, when Chinese hackers brought the site down. That attack lasted 118 hours and demonstrated that even large communities are at risk. Last month, another study by Kaspersky revealed that only 37 per cent of companies were prepared for a DDoS attack, despite 26 per cent of them being concerned that the problems caused by such attacks were long-term, meaning they could lose current or prospective clients as a result. Source: http://www.itpro.co.uk/security/24514/one-fifth-of-ddos-attacks-last-over-a-day

More:
One fifth of DDoS attacks last over a day

Featured article: How to use a CDN properly and make your website faster

It's one of the biggest mysteries I have seen in my 15+ years of Internet hosting and cloud-based services: why do people use a Content Delivery Network for their website yet never fully optimize their site to take advantage of the speed and volume capabilities of the CDN? Just because you use a CDN doesn't mean your site is automatically faster or even able to take advantage of its ability to dish out mass amounts of content in the blink of an eye. At DOSarrest I have seen the same mystery continue, which is why I have put together this piece on using a CDN, to help those who wish to take full advantage of one. Most of this information is general and can be applied to any CDN, but I'll also throw in some specifics that relate to DOSarrest.

Some common misconceptions about using a CDN:

1. As soon as I'm configured to use a CDN, my site will be faster and able to handle a large number of web visitors on demand.
2. Website developers create websites that are already optimized, and a CDN won't really change much.
3. There's really nothing I can do to make my website run faster once it's on a CDN.
4. All CDNs are pretty much the same.

Here's what I have to say about the misconceptions noted above:

1. In most cases the answer to this is… NO!! If the CDN is not caching your content, your site won't be faster; in fact it will probably be a little slower, as every request will have to go from the visitor to the CDN, which will in turn fetch it from your server and then turn around and send the response back to the visitor.
2. In my opinion and experience, website developers in general do not optimize websites to use a CDN. In fact most websites don't even take full advantage of a browser's caching capability. As the Internet has become ubiquitously faster, this fine art has been left by the wayside in most cases. Another reason I think this has happened is that websites are huge and complex, a lot of content is dynamically generated, and servers are very fast with large amounts of memory. Why spend time on optimizing caching when a fast server will overcome this overhead?
3. Oh yes you can, and that's why I have written this piece… see below.
4. No, they aren't. Many CDNs don't want you to know how things are really working on every node that is broadcasting your content. You have to go out and subscribe to a third-party monitoring service; if you have to get one, do it. It can be fairly expensive but is well worth it. How else will you know how your site is performing from other geographic regions?

A good CDN should let you know the following in real time, but many don't:

a. Number of connections/requests between the CDN and visitors.
b. Number of connections/requests between the CDN and your server (origin). You want the number of requests to your origin to be lower than the number of requests from the CDN to your visitors. *Tip: use HTTP 1.1 on both "a" and "b" above, and try to extend the keep-alive time on the origin-to-CDN side.
c. Bandwidth between the CDN and Internet visitors.
d. Bandwidth between the CDN and your server (origin). *Tip: if the bandwidth of "c" and "d" are about the same, news flash… you can make things better.
e. Cache status of your content (how many requests are being served by the CDN). *Tip: this is the best metric to really know if you are using your CDN properly.
f. Performance metrics from outside of the CDN but in the same geographic region. *Tip: once you have performance metrics from several different geographic regions, you can compare the differences once you are on a CDN. Your site should load faster the further away the region is located from your origin server, if you're caching properly.
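The request-count comparison described above (edge requests versus origin requests) boils down to a single offload ratio. A minimal sketch, with hypothetical counter names since every CDN reports these under different labels:

```python
def cache_offload_ratio(edge_requests: int, origin_requests: int) -> float:
    """Fraction of visitor requests served from CDN cache rather than
    passed through to the origin server."""
    if edge_requests == 0:
        return 0.0
    return 1.0 - (origin_requests / edge_requests)

# If the CDN answered 10,000 visitor requests but made only 900 origin
# fetches, roughly 91% of traffic never touched the origin server.
```

A ratio near zero means the CDN is acting as a pure pass-through proxy, which is exactly the "not actually faster" failure mode the article warns about.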
For the record, DOSarrest provides all of the above in real time, and it's these tools I'll use to explain how to take full advantage of any CDN. Without metrics there's no scientific way to know you're on the right track to making your site super fast.

There are five main groups of cache-control tags that will affect how and what is cached:

- Expires: When attempting to retrieve a resource, a browser will usually check to see if it already has a copy available for reuse. If the expires date has passed, the browser will download the resource again.
- Cache-control: Introduced in HTTP 1.1, this expands on the functionality offered by Expires. There are several options available for the cache-control header:
  - Public: This resource is cacheable. In the absence of any contradicting directive, this is assumed.
  - Private: This resource is cacheable by the end user only. All intermediate caching devices will treat this resource as no-cache.
  - No-cache: Do not cache this resource.
  - No-store: Do not cache, do not store the request; I was never here – we never spoke. Capiche?
  - Must-revalidate: Do not use stale copies of this resource.
  - Proxy-revalidate: The end user may use stale copies, but intermediate caches must revalidate.
  - Max-age: The length of time (in seconds) before a resource is considered stale.
  A response may include any combination of these, for example: private, max-age=3600, must-revalidate.
- X-Accel-Expires: This functions just like the Expires header, but is only intended for proxy services. This header is intended to be ignored by browsers, and when the response traverses a proxy this header should be stripped out.
- Set-Cookie: While not explicitly specifying a cache directive, cookies are generally designed to hold user- and/or session-specific information. Caching such resources would have a negative impact on the desired site functionality.
- Vary: Lists the headers that should determine distinct copies of the resource. A cache will need to keep a separate copy of this resource for each distinct set of values in the headers indicated by Vary. A Vary response of "*" indicates that each request is unique.

Given that most websites, in my opinion, are not fully taking advantage of caching by a browser or a CDN, there is still a way around this without reviewing and adjusting every cache-control header on your website. Any CDN worth its cost, as well as any cloud-based DDoS protection service, should be able to override most website cache-control headers.

For demonstration purposes we used our own live website, DOSarrest.com, and ran a traffic generator so as to stress the server a little along with our regular visitor traffic. This demonstration shows what's going on when passing through a CDN, with respect to activity between the CDN and the Internet visitor and between the CDN and the customer's server on the back end. At approximately 16:30 we enabled a feature on DOSarrest's service we call "Forced Caching", which overrides, in other words ignores, some of the origin server's cache-control headers. These are the results:

Notice that bandwidth between the CDN and the origin has fallen by over 90%. This saves resources on the origin server and makes things faster for the visitor, and it is the best graphic illustration that you're on the right track: cache hits go way up, not-cached goes down, and expired and misses are negligible. Requests to the origin have likewise dropped by 90%, telling you the CDN is doing the heavy lifting. Last but not least, the fruit of your labor, as seen by eight sensors in four geographic regions from our customer "DEMS" portal: the site is running 10 times faster in every location, even under load!
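To make the directive combinations above concrete, here is a small parser that splits a Cache-Control header value into its directives. This is an illustrative sketch; production proxies and CDNs use hardened parsers that also handle quoted strings and malformed input:

```python
def parse_cache_control(header: str) -> dict:
    """Split a Cache-Control header value into a {directive: value} map.
    The value is None for boolean directives such as `public` or
    `must-revalidate`."""
    directives = {}
    for part in header.split(","):
        part = part.strip()
        if not part:
            continue
        # "max-age=3600" -> ("max-age", "=", "3600"); "private" -> ("private", "", "")
        name, _, value = part.partition("=")
        directives[name.lower()] = value if value else None
    return directives
```

Running this on the example from the list above, `private, max-age=3600, must-revalidate`, yields `max-age` mapped to `3600` and the other two directives as value-less flags.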

Follow this link:
Featured article: How to use a CDN properly and make your website faster

Thirty Meter Telescope website falls over in hacktivist DDoS attack

Hacktivists have launched a distributed denial-of-service attack against the website of TMT (Thirty Meter Telescope), which is planned to be the Northern hemisphere's largest, most advanced optical telescope. For at least two hours yesterday, the TMT website at www.tmt.org was inaccessible to internet users. Sandra Dawson, a spokesperson for the TMT project, confirmed to the Associated Press that the site had come under attack: "TMT today was the victim of an unscrupulous denial of service attack, apparently launched by Anonymous. The incident is being investigated."

You might think that a website about a telescope is a strange target for hackers wielding the blunt weapon of a DDoS attack, who might typically be more interested in attacking government websites for political reasons or taking down an unpopular multinational corporation. Why would hackers want to launch such a disruptive attack against a telescope website? Surely the only people who don't like telescopes are the aliens in outer space who might be having their laundry peeped at from Earth?

It turns out there's a simple reason why the Thirty Meter Telescope is stirring emotions so strongly: it hasn't been built yet. The construction of the proposed TMT is controversial because it is planned to be constructed on Mauna Kea, a dormant 13,796-foot-high volcano in Hawaii. This has incurred the wrath of environmentalists and native Hawaiians who consider the land to be sacred. There has been considerable opposition to the building of the telescope on Mauna Kea, as this news report from last year makes clear. Now it appears the protest about TMT has spilt over onto the internet in the form of a denial-of-service attack. Operation Green Rights, an Anonymous-affiliated group which also campaigns against controversial corporations such as Monsanto, claimed on its Twitter account and website that it was responsible for the DDoS attack.
The hacktivists additionally claimed credit for taking down Aloha State’s official website. It is clear that denial-of-service attacks are being deployed more and more, as perpetrators attempt to use the anonymity of the internet to hide their identity and stage the digital version of a “sit down protest” or blockade to disrupt organisations. Tempting as it may be to participate in a DDoS attack, it’s important that everyone remembers that if the authorities determine you were involved you can end up going to jail as a result. Peaceful, law-abiding protests are always preferable. Source: http://www.welivesecurity.com/2015/04/27/tmt-website-ddos/

Continue Reading:
Thirty Meter Telescope website falls over in hacktivist DDoS attack

DDoS attack brings down TRAI’s website

Indian telecom regulator TRAI's official website was on Monday brought down by a hacker group called Anonymous India, following the public release of email IDs from which the government body received responses regarding net neutrality. The group also warned TRAI that it would be hacked soon. "TRAI down! Fuck you http://trai.gov.in  for releasing email IDs publicly and helping spammers. You   will be hacked soon," AnonOpsIndia tweeted. The group claimed to have launched a DDoS (distributed denial-of-service) attack on the website to make it inaccessible.

Slamming the government portal, the group posted: "#TRAI is so incompetent lol They have any clue how to tackle a DDoS?" "But just an alarm for whole #India. You trust incompetent #TRAI who don't know how to deal with DDoS? Seriously sorry guys. Goodluck!," it added. Taking a dig at the personnel at TRAI, it tweeted: "Somebody call 'brilliant minds' at TRAI and tell them to stop eating samosas and get back to work coz DDoS attack has stopped from here."

In a response to a Twitter user about the attack, Anonymous India said it was "just preventing spammers from accessing those Email IDs posted by Trai publicly" and called TRAI incompetent in dealing with the internet. "So those who still think that #TRAi can "handle" the Internet, we just proved you wrong.They just got trolled by bunch of kids.#Incompetence," the hacker group tweeted.

After tweets suggesting the hacker group stop its actions, Anonymous India did so, though it complained that no action was taken on the email IDs that had been revealed. "Guys http://trai.gov.in  is back online and they still haven't done anything about those Email IDs. You guys told us to stop. We did," it tweeted. "So if you guys still think you can have a chat with incompetent #TRAi, go ahead. But WE ARE WATCHING!," the group posted. Source: http://indiablooms.com/ibns_new/news-details/N/10099/hacker-group-brings-down-trai-s-website.html

View article:
DDoS attack brings down TRAI’s website

A Javascript-based DDoS Attack as seen by Safe Browsing

To protect users from malicious content, Safe Browsing's infrastructure analyzes web pages with web browsers running in virtual machines. This allows us to determine if a page contains malicious content, such as Javascript meant to exploit user machines. While machine learning algorithms select which web pages to inspect, we analyze millions of web pages every day and achieve good coverage of the web in general.

In the middle of March, several sources reported a large Distributed Denial-of-Service attack against the censorship monitoring organization GreatFire. Researchers have extensively analyzed this DoS attack and found it novel because it was conducted by a network operator that intercepted benign web content to inject malicious Javascript. In this particular case, Javascript and HTML resources hosted on baidu.com were replaced with Javascript that would repeatedly request resources from the attacked domains. While Safe Browsing does not observe traffic at the network level, it affords good visibility at the HTTP protocol level. As such, our infrastructure picked up this attack, too. Using Safe Browsing data, we can provide a more complete timeline of the attack and shed light on which injections occurred when. For this blog post, we analyzed data from March 1st to April 15th, 2015.

Safe Browsing first noticed injected content against baidu.com domains on March 3rd, 2015. The last time we observed injections during our measurement period was on April 7th, 2015. This is visible in the graph below, which plots the number of injections over time as a percentage of all injections observed. We noticed that the attack was carried out in multiple phases. The first phase appeared to be a testing stage and was conducted from March 3rd to March 6th. The initial test target was 114.113.156.119:56789 and the number of requests was artificially limited. From March 4th to March 6th, the request limitations were removed.
The next phase was conducted between March 10th and 13th and at first targeted the following IP address: 203.90.242.126. Passive DNS places hosts under the sinajs.cn domain at this IP address. On March 13th, the attack was extended to include d1gztyvw1gvkdq.cloudfront.net. At first, requests were made over HTTP and then upgraded to use HTTPS. On March 14th, the attack started for real and targeted d3rkfw22xppori.cloudfront.net both via HTTP as well as HTTPS. Attacks against this specific host were carried out until March 17th.

On March 18th, the number of hosts under attack was increased to include the following: d117ucqx7my6vj.cloudfront.net, d14qqseh1jha6e.cloudfront.net, d18yee9du95yb4.cloudfront.net, d19r410x06nzy6.cloudfront.net, d1blw6ybvy6vm2.cloudfront.net. This is also the first time we found truncated injections in which the Javascript is cut off and non-functional. At some point during this phase of the attack, the cloudfront hosts started serving 302 redirects to greatfire.org as well as other domains.

Substitution of Javascript ceased completely on March 20th, but injections into HTML pages continued. Whereas Javascript replacement breaks the functionality of the original content, injection into HTML does not. Here HTML is modified to include both a reference to the original content as well as the attack Javascript, as shown below: [… regular attack Javascript …] In this technique, the web browser fetches the same HTML page twice, but due to the presence of the query parameter t, no injection happens on the second request. The attacked domains also changed and now consisted of: dyzem5oho3umy.cloudfront.net, d25wg9b8djob8m.cloudfront.net and d28d0hakfq6b4n.cloudfront.net. About 10 hours after this new phase started, we saw 302 redirects to a different domain served from the targeted servers. The attack against the cloudfront hosts stopped on March 25th. Instead, resources hosted on github.com were now under attack.
The first new target was github.com/greatfire/wiki/wiki/nyt/ and was quickly followed by github.com/greatfire/ as well as github.com/greatfire/wiki/wiki/dw/. On March 26th, a packed and obfuscated attack Javascript replaced the plain version and started targeting the following resources: github.com/greatfire/ and github.com/cn-nytimes/. Here we also observed some truncated injections. The attack against github seems to have stopped on April 7th, 2015 and marks the last time we saw injections during our measurement period. From the beginning of March until the attacks stopped in April, we saw 19 unique Javascript replacement payloads as represented by their MD5 sum in the pie chart below. For the HTML injections, the payloads were unique due to the injected URL so we are not showing their respective MD5 sums. However, the injected Javascript was very similar to the payloads referenced above. Our systems saw injected content on the following eight baidu.com domains and corresponding IP addresses: cbjs.baidu.com (123.125.65.120) eclick.baidu.com (123.125.115.164) hm.baidu.com (61.135.185.140) pos.baidu.com (115.239.210.141) cpro.baidu.com (115.239.211.17) bdimg.share.baidu.com (211.90.25.48) pan.baidu.com (180.149.132.99) wapbaike.baidu.com (123.125.114.15) The sizes of the injected Javascript payloads ranged from 995 to 1325 bytes. We hope this report helps to round out the overall facts known about this attack. It also demonstrates that collectively there is a lot of visibility into what happens on the web. At the HTTP level seen by Safe Browsing, we cannot confidently attribute this attack to anyone. However, it makes it clear that hiding such attacks from detailed analysis after the fact is difficult. Had the entire web already moved to encrypted traffic via TLS, such an injection attack would not have been possible. This provides further motivation for transitioning the web to encrypted and integrity-protected communication. 
Unfortunately, defending against such an attack is not easy for website operators. In this case, the attack Javascript requests web resources sequentially and slowing down responses might have helped with reducing the overall attack traffic. Another hope is that the external visibility of this attack will serve as a deterrent in the future. Source: http://googleonlinesecurity.blogspot.ca/2015/04/a-javascript-based-ddos-attack-as-seen.html
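The injection detection described in this post ultimately reduces to noticing that a fetched resource no longer matches what the publisher served. A minimal sketch of that idea, comparing a fetched copy against a known-good digest (the sample script content and digest here are illustrative, not from the actual attack):

```python
import hashlib

def is_tampered(fetched_bytes: bytes, known_good_sha256: str) -> bool:
    """Return True when the fetched copy does not match the publisher's
    expected SHA-256 digest."""
    digest = hashlib.sha256(fetched_bytes).hexdigest()
    return digest != known_good_sha256

# Digest of the resource as the publisher serves it (illustrative content).
expected = hashlib.sha256(b"console.log('benign');").hexdigest()
```

Browsers offer a built-in form of this check via Subresource Integrity (the `integrity` attribute on script tags), though as the post notes, only TLS prevents an on-path operator from rewriting the page that carries the check in the first place.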

Originally posted here:
A Javascript-based DDoS Attack as seen by Safe Browsing