
Small Business Advertisers Will Lose $15K This Year To Fraud

By Yuval Haimov, CEO, ClickCease

In a nightmare year for small businesses, reliance on digital marketing grows

While 2020 is quickly turning out to be the worst economic year since the end of World War II, the small business sector appears to have taken one of the hardest hits. Restaurants, bars, hairdressers, real-estate agents, and other businesses have completely shut down as a result of the global COVID-19 pandemic. Many of the small businesses that have managed to stay afloat are those that rely more on digital channels than on physical, brick-and-mortar locations. These are businesses that sell online and promote themselves primarily through online advertising on paid search and paid social channels. Facebook, for example, recognizing the growing reliance on digital channels, started a $100 million relief fund for small businesses, including significant ad-buying credit. Google, for its part, announced $340 million worth of ad credit for SMB advertisers to battle the effects of COVID-19.

But new data shows US small businesses will lose 11% of their ad spend to click fraud 

As small businesses divert more funds to digital marketing channels, their exposure to the risks of online advertising grows too. Small businesses in particular rely less on top-of-the-funnel brand awareness campaigns and tend to focus their limited funds on bottom-of-the-funnel performance marketing, typically on PPC (pay-per-click) buying channels. But a recent study by Professor Roberto Cavazos of the University of Baltimore showed that PPC channels are extremely susceptible to click fraud and fake bot traffic, costing the global ad economy almost $24 billion in wasted ad spend. Now ClickCease has released new data showing that US small businesses will lose an average of 11% of their ad spend in 2020 to click fraud. That amounts to $15K in annual ad spend losses for a single US business. These are astronomical losses for SMBs to absorb, especially in the current economic climate.

What is click fraud and why is it hurting small business advertisers?

Click fraud is the generation of fraudulent ad clicks. These clicks can be generated by individuals, by click farms, or even by sophisticated networks of bots. Why do people generate fake clicks? For a multitude of reasons. The culprits could be other small businesses trying to deplete a competitor’s paid search ad spend so that they can win the bid and be the top result on Google. They could be an affiliate site that is paid to drive traffic to a small business and hires a click farm to click on ads so that it gets paid more. They could be malicious bots that visit a small business site, get retargeted with ads for months on end, and rack up wasted spend along the way. They could even be crawler bots scraping sites and landing pages for data collection purposes. All of these accumulate and end up costing advertisers dearly.

Why does this affect small businesses in particular? One obvious reason is that in times of crisis, the competitive landscape becomes so cutthroat and ad spend so scarce that many people are compelled to click on their competitors’ ads to give themselves an edge. The very fact that small businesses are moving to more online channels also attracts more fraudulent activity looking to take a bite out of that ad spend. When Google and Facebook announce hundreds of millions of dollars of ad credit for small business, you can be sure the fraudsters are looking to pounce.

Local service providers hit hardest and COVID-19 is making it worse

Small businesses in the US are seeing more than one in ten clicks (11%) on their paid search advertising campaigns rendered invalid as a result of deliberate competitor sabotage or bot traffic. The most cutthroat click fraud occurs in the local service provider sector, where locksmith ads (71% invalid clicks), pest control ads (53%), and on-demand repair ads (44%) are hit the hardest. The main reason, as the ClickCease study shows, is persistent competitor clicking on the high-priced keywords designed to capture real customers.

What can small businesses do about this?

Small business advertisers should first be aware of the click fraud phenomenon and the extent to which it’s hurting their business. Second, they should look more closely at their campaign analytics to monitor for suspicious activity and anomalies in activity, engagement, and conversion rates. This, of course, can become a complex and dizzying task, since it requires monitoring the IP addresses of the users who are clicking ads, analyzing those IPs to verify their sources, comparing conversion data from different campaigns to detect anomalies and suspicious spikes in traffic, and, of course, blocking those IPs in the ad account. Businesses that want this complex work done automatically and professionally should consider adopting a click fraud prevention solution. When choosing such a solution, make sure not to pick a vendor at random, but to look for the solutions with the best track record and reviews.
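The monitoring workflow described above can be pictured with a minimal sketch. The log format, threshold, and field names below are invented for illustration; real detection would also weigh geography, device fingerprints, and conversion rates rather than raw click counts alone.

```python
from collections import Counter

# Hypothetical: each record is (ip, campaign_id) parsed from an ad-click log.
CLICK_THRESHOLD = 10  # clicks per IP per day before we treat the IP as suspicious

def suspicious_ips(click_log, threshold=CLICK_THRESHOLD):
    """Return the set of IPs whose click count exceeds the threshold."""
    counts = Counter(ip for ip, _campaign in click_log)
    return {ip for ip, n in counts.items() if n > threshold}

clicks = [("203.0.113.7", "search-1")] * 25 + [("198.51.100.2", "search-1")] * 3
block_list = suspicious_ips(clicks)
print(block_list)  # {'203.0.113.7'} – a candidate for IP exclusion in the ad account
```

Flagged IPs would then be added to the ad platform’s IP exclusion list, which is exactly the manual step the article notes is tedious to do by hand.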

Bio

Yuval Haimov is the CEO and co-founder of ClickCease and has a strong technological background from his years serving in the IDF and the Israeli Ministry of Defense. He has been fighting fraud and helping SMBs achieve fraud-free campaigns for the past decade. Before launching ClickCease in 2015, Yuval spent five years as a .NET developer and team leader for Matrix, and before that was a project manager and systems analyst for IT firm High Skills and More Ltd.

Look in the Mirror and Foresee the Future of Telecommunications

By Martin Laesch, Chief Technology Officer, Neural Technologies

The adoption of 5G will unleash the full potential of augmented and virtual reality, Smart Cities, and the Internet of Things (IoT); this will present opportunities for Communications Service Providers (CSPs) to strengthen current revenue sources or create entirely new revenue streams. Consumers continue to display an insatiable appetite for data and with the consumption of data-hungry applications securing a place in consumers’ daily lives, data usage is set to continue increasing exponentially into the future.

CSPs currently face the ever-increasing challenges of leveraging 5G networks and offering customers new types of services. To overcome these challenges, new digital technologies are required to automate complex business processes to provide customers with the personalized service they have come to expect in a fast-evolving, digital world.

By 2025, CSPs should already be leveraging 5G networks to offer new types of services to various customer segments. The challenges of this endeavor will lie in the ability to scale telecom platforms, automate lifecycle management of network slices, and incorporate predictive demand and maintenance – all while ensuring operational efficiency and a behind-the-scenes workforce to support platform optimization.

Using automation to improve customer services

To address these challenges, Digital Twins technology was developed using an Analytical Data Model (the Artificial Intelligence [AI] Data Model) and Machine Learning (ML), and tested as part of the 2019 TM Forum Digital Twins catalyst project. The technology serves as a virtual representation of a real-world entity or system, acting as a mirror that makes it possible to simulate, predict, and forecast behavior in the digital world. As part of the catalyst project, the Digital Twins technology was applied to various use cases, such as networks, individuals, organizations, and processes, to determine its effectiveness for telecom industry applications in addressing the challenges predicted for 2025.

For the Digital Twins technology to be possible, a common data model is essential. All data needs to be classified and structured in the same way for the digital technology to perform. Digital Integration is the first step to making this possible.

One example of a Digital Twin is that of a customer. A customer’s Digital Twin will be represented in a heatmap with icons to help visualize aspects of their digital lifestyle, such as whether they spend a lot of time gaming, have high mobile usage, or are physically inactive. This twin can then be used by the CSP to tailor messages to that individual. For example, the Digital Twin may show that the customer has a low step count, which could trigger a notification to the individual to be more active.
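A customer twin of the kind described above can be sketched as a small data structure plus a rule that derives the next best action. Everything here is illustrative: the field names, thresholds, and messages are invented and are not part of the TM Forum catalyst project’s actual data model.

```python
from dataclasses import dataclass

# Hypothetical digital-lifestyle attributes for a customer twin.
@dataclass
class CustomerTwin:
    customer_id: str
    daily_steps: int
    gaming_hours: float
    mobile_data_gb: float

def next_best_action(twin: CustomerTwin) -> str:
    """Derive a tailored message from the twin's attributes."""
    if twin.daily_steps < 3000:          # low step count ...
        return "nudge: activity reminder"  # ... triggers the activity nudge
    if twin.gaming_hours > 4:
        return "offer: low-latency gaming plan"
    return "no action"

twin = CustomerTwin("cust-42", daily_steps=1200, gaming_hours=0.5, mobile_data_gb=18.0)
print(next_best_action(twin))  # nudge: activity reminder
```

In a real deployment these attributes would be populated continuously from usage data through the common data model, rather than set by hand.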

Using a Digital Twin, operators can also determine where there will be a significant increase in latency within the network, and then share that information with the customer’s Digital Twin to find out what is going to be affected and determine the next best action.

The Digital Twin can also speed up product development cycles, save time and money, and create new business models based on intelligent outcomes. This allows enterprises to personalize the customer experience and meet their precise demands, thereby enabling the enterprises to grow and improve their customer base through targeted campaigns, tailored services, and promotions. In turn, this generates greater customer loyalty and retention as well as customer spending through personalization with timely, individually tailored offers.

Proven methods for the future

The TM Forum Digital Twins catalyst project proved that Digital Twins not only work for the manufacturing industry, but for the telecommunications space as well. As part of the project, Neural Technologies successfully created a Customer Twin alongside the collaborative development of a Mobile Network Twin and an Enterprise IP Network Twin, all originating from the core AI Data Model.

In addition, the catalyst project demonstrated real-time communication between the twins. Using the proposed TM Forum Open Application Program Interfaces (APIs), Neural Technologies was able to share simulated, forecasted, and predicted outcomes so that each individual twin could recommend a more informed action, rather than acting on a siloed view.

Ultimately, by using Digital Twin technologies in the telecommunications industry, a more holistic view across the whole of the operator’s network becomes achievable, making it possible not only to recommend more informed actions, but also to make equally fast decisions. As a result, “what if” scenarios can now be run in the virtual world without affecting the real world.

Beyond the challenges the telecommunications industry will face with ever-growing volumes of usage data, software vendors like Neural Technologies need to provide solutions that can exchange data with any kind of connected system. Information exchange between systems will be key, and the usage of real-time APIs will grow. Industry standards for these APIs, like those specified through the TM Forum Open APIs, will help standardize the exchange of information, and Neural Technologies fully supports them already today.

With more data becoming available through the Internet of Things and 5G, operators need to prepare themselves to leverage it. Data is every operator’s asset, and using AI and ML, these assets can be mobilized to enable CSPs to strengthen current revenue sources and create entirely new revenue streams. Neural Technologies’ state-of-the-art digital transformation and analytical technologies are ready to help CSPs leverage this data and create those new revenue streams.

Bio

Martin Laesch joined Neural Technologies in October 2015, through its acquisition of Enterest GmbH, which he co-founded in 2003, as Senior Vice President of Professional Services and is now the Chief Technology Officer. Martin is responsible for global strategy, product and solutions development, and consultancy services to customers. He has more than 20 years’ experience in telecommunications services and the software industry, filling roles from Project Manager to Managing Director, and holds a Master of Computer Science degree.


IoT’s Impact on the Data Center and the Role of Intelligent Power

By Marc Cram, Director of New Market Development, Server Technology

Once dubbed the next Industrial Revolution, the Internet of Things (IoT) has proven to be the movement that will drive the evolution of network, IT, and data center design into the future. To sum up the net impact of all of the new devices situated at the edge of all of the networks, consider this: there will be some 24 billion Internet of Things devices online by the end of 2020, which is actually more than double the 10 billion devices that will be used directly by people. Intelligent PDUs will play a critical role in the management of networks that support that traffic.

In fact, IoT has had a number of impacts on data center infrastructure, as well as data center services. Not only has IoT driven the creation of more robust networks and IT systems, it has also pushed the boundaries of what was previously understood as cloud and edge computing, and the networks that support those systems.

Lean and mean

When we look at the impact of IoT on data center infrastructure, the greatest tangible effect has been on data center networks. Most facilities have had to adapt in order to keep up with IoT—especially 5G IoT. This has meant an increase in the number of connections and in the overall speed of networks in most deployments, even ones that lean heavily on edge computing. Those edge devices still need to push data back to a central hub for more detailed computing and analysis.

Because of this, the majority of data centers are upping their networking and connectivity game. Another key impact IoT brings to data centers is a different type of capacity demand. IoT devices are continually running and delivering data, meaning that many data centers now have a much smaller window than before to take a network offline or make adjustments. Traditional maintenance windows are now closed, and network architectures have to be adapted to support uptime. The impact on data center infrastructure? It needs to be equally flexible.

More secure

An unexpected impact of IoT on data centers has been the need for an increased security presence at the edge. Every new IoT touchpoint and endpoint is another passenger on the network, and each one widens the attack surface that must be secured.

This increase in the number of devices has presented a unique challenge for those in charge of their company’s networks. The proliferation of traffic has meant that companies are investing in new tools to monitor and manage traffic on their networks. While these tools are mostly in the form of software and IT appliances, there has also been an increase in the adoption of network PDUs.

Everything needs power

While they may seem like an unlikely player in new IoT data center infrastructures, intelligent PDUs are serving a key role in securing networks, supporting uptime, monitoring traffic, and managing systems.

Switched PDUs are the gatekeepers of all the power that is fed to the rack. After all, everything needs power, right? Not only is the rack PDU the bridge between the data center’s entire electrical infrastructure and the devices that run the network, it also provides the nearest touchpoint to monitor and manage that power. Talk about up close and personal!

Monitoring the edge

IoT computing demands more sophisticated monitoring solutions at the rack and PDU level. By definition, edge compute sites are not adjacent to the core data center facility. Lack of proximity means that there is an increased reliance on the ability to monitor power and cooling conditions remotely, as well as the ability to remotely control and reboot single outlets. As IoT has pushed monitoring to the distant reaches of the network, intelligent PDUs have likewise been deployed to provide feedback and control.

Monitoring the core

Intelligent PDUs arguably play a more critical role at the core, thanks to IoT. They provide information about equipment operation by metering the input and output power at the PDU. They also provide remote control operations that allow you to turn power on and off to individual receptacles. Having a network connection allows the data center manager to enable or disable outlets from a remote location or within the facility itself. As IoT has required more flexibility and fewer maintenance windows, intelligent PDUs have stepped in to assist with controlling the computing environment.

Monitoring to manage

Increased data traffic and shifting workloads increase the complexity of the data center manager’s power and cooling resources within the facility. By using intelligent PDUs, you can access real-time usage data and environmental alerts. All power usage data is easily tracked, stored, and exported into reports using intelligent PDUs and DCIM software. By analyzing accurate power usage information at the cabinet level, data center managers are now able to more accurately shift power resources within the white space.
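The cabinet-level rollup described above can be sketched in a few lines. The readings and rack names below are invented; a real deployment would pull per-outlet measurements via SNMP or the PDU vendor’s management interface and feed them into DCIM reporting.

```python
from collections import defaultdict

# Hypothetical samples: (cabinet, outlet, watts) as an intelligent PDU might report.
readings = [
    ("rack-A1", 1, 210.0), ("rack-A1", 2, 185.5), ("rack-A1", 3, 0.0),
    ("rack-B2", 1, 320.0), ("rack-B2", 2, 295.0),
]

def cabinet_totals(samples):
    """Roll per-outlet watt readings up to cabinet-level totals."""
    totals = defaultdict(float)
    for cabinet, _outlet, watts in samples:
        totals[cabinet] += watts
    return dict(totals)

print(cabinet_totals(readings))  # {'rack-A1': 395.5, 'rack-B2': 615.0}
```

Totals like these, tracked over time, are what let a data center manager spot stranded capacity and shift power resources within the white space.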

In short, an intelligent PDU can be the control your data center infrastructure needs to support IoT applications. This is increasingly important as this infrastructure is being pushed closer to the edge with even less time for maintenance. Higher device demand comes with higher power demands, which means more challenges to the network. PDUs help you meet them and anticipate the next IoT evolution.

Marc Cram is Director of New Market Development for Server Technology, a brand of Legrand (@Legrand). A technology evangelist, he is driven by a passion to deliver a positive power experience for the data center owner/operator. He earned a bachelor’s degree in electrical engineering from Rice University and has more than 30 years of experience in the field of electronics. Follow him on LinkedIn or @ServerTechInc on Twitter.

Ad Fraud – The Fastest Growing Cyber-Crime You’ve Never Heard Of

By Daniel Avital, Chief Strategy Officer, CHEQ

When we think about industries most in need of cybersecurity protection, we tend to think of government, financial services, transportation, and healthcare. All of these are natural targets for data theft, espionage, financial fraud, and even terrorism. So naturally, when people hear that we work in cybersecurity for online advertising, they’re taken by surprise: “Cybersecurity for online advertising? Huh, I didn’t realize that was a thing…”

Online ad-fraud | One of the fastest growing cyber-crimes out there

When we first took CHEQ to market, it was in response to what was quickly becoming one of the fastest-growing criminal enterprises in the world: ad fraud. What is ad fraud? Broadly speaking, it’s the attempt to generate fake online ad views, clicks, and engagement for monetary gain. These can be the result of small-time, one-man operations running simplistic bots, using proxy servers, or contracting click farms – activity generally categorized as GIVT (General Invalid Traffic).

However, over the years we’ve witnessed a rapid growth in SIVT (Sophisticated Invalid Traffic), leveraging large-scale, complex bot-nets. These bots are designed specifically around online advertising ecosystems and infrastructure, looking to exploit weaknesses in the programmatic supply chain, publisher sites, adtech intermediary platforms, paid-social, and paid search platforms.

To understand how quickly this criminal enterprise is growing, consider that in 2017, aggregated industry data showed ad fraud to be costing advertisers approximately $6.5 billion in wasted ad spend. Last year, new data suggested that this figure had risen to as high as $30 billion – nearly a fivefold increase in just two years. Industry projections for 2021 now point to a figure north of $50 billion. That is astonishing growth by any benchmark. Recent research into click fraud by fraud expert Professor Roberto Cavazos of the University of Baltimore suggests even greater exposure to fake traffic on PPC advertising channels, which now rival programmatic channels for fraud.

Why is ad fraud growing so rapidly? Because it’s easy to do and hard to catch.

When you think about what encourages or discourages someone from engaging in criminal activity, one key factor is risk versus reward. Now consider ad fraud. The reward is extremely high, as there is little-to-no overhead required; any hacker can do this from the comfort of their bedroom. The risk, on the other hand, is extremely low. The activity might be detected and blocked at some point, but the chances of the perpetrator actually being named and caught are remarkably slim – and even if they were caught, the ability to prosecute such a person is very unclear, with questions of jurisdiction and varying internet laws among different countries.

But why is it more appealing than other cyber-crimes? Because up until recently, there’s been little cyber-driven ad security.

It is of course true that a good balance of risk and reward exists not only in ad fraud, but across most major cyber-criminal enterprises. And yet, ad fraud still stands out for its remarkable growth. The best explanation is that until recently, online ad security was in its infancy, mitigated primarily by adtech companies often referred to as “ad-verification” vendors.

These ad-verification vendors sprang up from within the adtech scene with capabilities better suited to measuring clicks and impressions, attributing users, and monitoring campaign performance. Lacking a cybersecurity background, they relied primarily on IP blacklists to filter fake traffic, a methodology widely regarded as ineffective by anyone in the bot-mitigation industry. These IP blacklists are purchased from third-party vendors, age quickly, and cover a very small portion of the fake traffic out there. So, if you’re a cyber-criminal, where would you rather turn your efforts? Toward governments and financial institutions deploying the most sophisticated, real-time, deterministic security? Or toward an industry that is largely unprotected and relies on dated methodologies? It seems many hackers and fraudsters are opting for the latter and choosing the easiest target available.
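A toy comparison makes the blacklist weakness concrete. The IPs, sessions, and thresholds below are invented: the point is only that a bot rotating through fresh proxy IPs is invisible to a stale list, while even a crude behavioral signal (an implausible click rate) still flags it.

```python
# A static blacklist bought months ago – the rotating-proxy bot's fresh IP isn't on it.
STALE_BLACKLIST = {"203.0.113.7", "203.0.113.8"}

sessions = [
    {"ip": "198.51.100.20", "clicks_per_minute": 95},  # rotating-proxy bot
    {"ip": "192.0.2.14", "clicks_per_minute": 2},      # human visitor
]

def flagged_by_blacklist(session):
    return session["ip"] in STALE_BLACKLIST

def flagged_by_behavior(session, max_cpm=30):
    # No human sustains dozens of ad clicks per minute.
    return session["clicks_per_minute"] > max_cpm

bot = sessions[0]
print(flagged_by_blacklist(bot), flagged_by_behavior(bot))  # False True
```

Real bot-mitigation stacks go much further (browser challenges, device fingerprinting, deterministic checks), but even this sketch shows why list lookups alone lag behind the threat.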

Advertisers turn to cybersecurity for a solution

With advertisers suffering severe losses and spending millions of dollars on ad-verification vendors incapable of mitigating the problem, large brands and media agencies have turned to the cybersecurity industry for solutions. This is how we at CHEQ, a cybersecurity bot-mitigation company, unexpectedly found ourselves working with large-scale online advertisers and brands. When we first went to market just over three years ago, we were surprised to find how sophisticated ad fraud had become and how savvy the fraudsters were at exploiting the inherent weaknesses of the adtech ecosystem.

The ecosystem is so convoluted, so riddled with intermediaries, dodgy exchanges, murky networks, and shady business models, that it’s no wonder it became a prime target. What was great to see, however, was just how receptive the adtech and online-advertising community was to adopting cybersecurity solutions, implementing JavaScript code to run browser challenges, and getting smart about ad fraud and click fraud prevention.

Today, the world’s leading advertisers are already deploying solutions like CHEQ across all their different channels, from programmatic display and video, to paid search, paid social, OTT, publisher sites, content recommendation, and even 3D console gaming. The category of online ad security is now booming, as big-brand advertisers race to protect their billions of ad-spend dollars from the growing threat of ad fraud and click fraud. This makes online ad security a truly exciting new category, with a huge and largely untapped addressable market.

Bio

Daniel serves as Chief Strategy Officer at CHEQ, leading the company’s positioning and marketing efforts. Prior to CHEQ, Daniel served as Senior Director of Strategy at WPP’s Grey, leading brand strategy for some of the world’s leading advertisers including P&G, GSK, and Fiat Group.

Customizable Cloud-Computing Ensures Successful Commercial Drone Missions

By Barry Alexander, Founder and CEO, Aquiline Drones

Although awareness of and appreciation for commercial drone systems is growing, many businesses remain unaware of the opportunities drones offer to achieve better business results, help streamline business solutions, and elevate profitability. Drones are unique aerial vehicles and are ideal for providing crucial aerial perspectives to assess emergency situations like the recent Australian wildfires, and for delivering critical medical supplies to those in need. Drones are even being used to deliver information to the public, as in the current coronavirus pandemic.

However, most businesses do not realize the intrinsic benefit of integrating drones into their day-to-day operations, whether it be for asset inspection and management, perimeter security, precision farming, aerial ranching, video production, or surveying and mapping. The list continues! But a point of note is this: A drone is just mechanical hardware unless used optimally to gather information. Such reconnaissance activity allows users to capture, analyze, manage, model, and share data insights – usually in real-time. This level of application calls for a robust computing platform that supports complex drone operations and the footage they generate. This is facilitated with cloud computing technology.

According to a recent survey by RedLock, only 7% of businesses firmly believe they have decent visibility over all important company information from drone usage in a well-structured and well-secured enterprise cloud. To close this gap, companies are now seeking out unique, customizable technical platforms such as the AD Cloud, which offer everything involved in completing commercial drone operations in one centralized setting. The AD Cloud in particular provides a variety of salient features ideal for building highly customizable, large-scale solutions.

Building a Cloud from the Ground Up

Core features and services offered by some of the nation’s most notable cloud companies that have mastered and integrated artificial intelligence (AI) and the Internet-of-Things (IoT) include:

  • Modularity – Scalability for high-density drone operations across industries requires a modular cloud design, in which services can be added a la carte, allowing businesses to start small, then scale up as needed.
  • Unmanned Aerial Vehicle (UAV) Specific – It is important for cloud environments to cater to the industries for which they are being used. Specialized cloud platforms such as the AD Cloud provide algorithms for UAV operations, manufacturing, and maintenance, making the AD Cloud more valuable and more desirable for businesses that want to integrate UAVs into their operations.
  • Aviation Compliance – Drones are aircraft. Accordingly, they must operate and should be held to the same or similar standards as manned aircraft. These standards should be established and regulated by the Federal Aviation Administration (FAA) or the International Civil Aviation Organization (ICAO). A drone-specific cloud should maintain built-in compliance rules to ensure that connected devices remain safe and compliant with regulations and the law.
  • True Autonomy – Allows for autonomous UAV operations with plug-and-play mission capabilities.
  • Data Insights – Specialized algorithms can be created for flight control, traffic management, enhanced awareness, terrain modeling, and image recognition, along with specific additions for more sophisticated scenarios.
  • Full Lifecycle Governance – This includes providing connectivity and insights across the drone lifecycle – from product development, to manufacturing, to UAV operations and MRO – resulting in greater efficiencies and reduced downtime.
  • Dynamic Dashboard – A full-capability digital dashboard accessible on any device delivers a comprehensive, standardized, and flexible user experience (UX) with the power of the cloud at one’s fingertips. Users can plan, collaborate, and execute missions, livestream data and video, and obtain real-time data insights – all from within a single and customizable enterprise asset management (EAM) system.
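The “Modularity” point above – start small, then add services a la carte – can be pictured as a plugin-style registry. The class, service names, and methods below are invented for illustration and are not the AD Cloud’s actual interface.

```python
# Hypothetical catalog of a-la-carte drone-cloud services.
AVAILABLE_SERVICES = {
    "flight-control", "traffic-management", "terrain-modeling",
    "image-recognition", "fleet-mro",
}

class DroneCloudTenant:
    """A business account that enables only the modules it needs."""

    def __init__(self, name):
        self.name = name
        self.services = set()

    def enable(self, service):
        if service not in AVAILABLE_SERVICES:
            raise ValueError(f"unknown service: {service}")
        self.services.add(service)
        return self  # allow chaining: start small, scale up later

tenant = (DroneCloudTenant("precision-farm-co")
          .enable("flight-control")
          .enable("image-recognition"))
print(sorted(tenant.services))  # ['flight-control', 'image-recognition']
```

The same tenant can call `enable` again later as its operations grow, which is the scaling behavior the bullet describes.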

Further, a comprehensive cloud system such as AD Cloud can also aggregate data, which enables companies to make statistical forecasts and logical inferences for future resource planning and allocations.

A Bright and Lofty Future

Following its extreme growth over the past decade, the global cloud computing market is forecast to exceed $623 billion by 2023, as 80% of organizations – many using drone technology – migrate to the cloud by 2025.

One key projection is that cloud computing will change the hardware architecture of drones by simplifying these flying robots. With low latency, higher bandwidth, and a highly reliable connection to the cloud, a drone only needs to carry sensors, without requiring additional onboard computing power.

Drones and edge computing technology will continue to grow exponentially, allowing for more resolution, more sensor types, and more flight capabilities, while supporting demand for higher frequency and more data. In fact, drone fleets and swarms will have the ability to launch from edge computing hubs to further automate the process.

Another major highlight will be the quick creation and activation of a comprehensive cloud computing-drone infrastructure as directed and overseen by the FAA, the regulatory agency for all UAVs – ensuring safety remains paramount.

Lastly, the recent introduction of a bipartisan bill in Congress, the American Security Drone Act of 2019, would essentially ban the use of foreign drones – mainly Chinese drones – and other unmanned aerial systems purchased with federal dollars.

The drone industry continues to gain in purpose and popularity, empowering the companies that use drones with powerful, customized cloud computing capabilities. Cloud-enabled drone technology increases these companies’ operating efficiency, efficacy, safety, and ultimately, their bottom line. As more of these cloud-connected devices take to the sky, we’ll see a world that is truly interconnected within the technological atmosphere.

Bio

A veteran pilot, serial entrepreneur, and visionary leader, Barry Alexander is founder and CEO of Aquiline Drones, a full-service, US-based commercial drone company that boasts an integrated manufacturing and supply chain, world-class MRO services, and real-time data insights to improve ROI across a variety of industries. Barry’s ultimate goal is to revolutionize the entire American drone market through innovative technology and key community and governmental partnerships to create a world in which humans and drones live and operate in harmony for the betterment of society.

Five Essential Web Security Functions to Protect Small Businesses and Boost ARPU

By Michael Fowler, President of Partners and Channels, Sectigo

When cybercriminals attack, they aren’t just targeting large corporations. In fact, according to the most recent Verizon Data Breach Investigations Report (DBIR), 43% of cyberattacks target small businesses—a stat that might surprise many business owners. That same report indicated that just 14% of small businesses are adequately prepared to protect themselves from those attacks, shining a light on the growing need for cybersecurity within these organizations.

While small business data breaches may not dominate the headlines the way a major financial institution being hacked likely would, the average total cost of a data breach is now $3.92 million—a number that rises to $8.19 million when limited to US breaches. And while large corporations may be able to weather that damage comfortably, it’s an amount likely to sink many smaller companies.

As the volume of attacks targeting small businesses continues to rise, these businesses must not only understand the array of website security tools needed to protect and back up their sites, but also learn to use those tools more effectively. Small businesses seeking to protect themselves against today’s most dangerous threats should ensure that they have the following five key cybersecurity capabilities.

The Five Essential Cybersecurity Capabilities

  1. TLS/SSL Certificates. “Identity” is a critically important concept—especially online. Customers arriving at a small business website need to have confidence that they are in the right place. Web certificates (visible as a padlock in browsers) serve to indicate to customers that the site they are visiting is secure and that information they enter—personal, financial, or otherwise—is being shared with an authentic/verified business rather than a fraudulent site.

    In the past, some small businesses have resisted SSL certificates because of the hassle to maintain them—after all, we’ve seen what can happen when certificates lapse. Fortunately, that is no longer the case. The rise of automation has made it considerably easier to issue, renew, and maintain web certificates, meaning that small businesses can enjoy the benefits of identity security with minimal management.
  2. Malware and Vulnerability Detection. “Detection” means more than just alerting you when something has already gone wrong. Small business owners must be vigilant for potential website vulnerabilities and address them before they can impact their businesses. Search engines will blacklist a website with known vulnerabilities, making it critical for website owners to be proactive about detecting potential issues.

    It’s also important to be aware of potential vulnerabilities with your site’s various components. For instance, your website’s content management system (CMS) or e-commerce platform may have known vulnerabilities that require steps to address. There are simple security products available today that can help monitor your website and alert you to these potential issues.
  3. Remediation. Once a threat has been detected, the next step is remediation—removing the threat from the system. When exploring website security technology, identifying a product with the right remediation capabilities for your site is important. Look for something capable of removing active infections from your website files, MySQL database, and other important components of your website.

    It’s also important that remediation is completed without disrupting functionality. You don’t want your website being taken down for maintenance every time a potential threat is detected. Fortunately, today’s remediation products are generally mindful of this essential continuity.
  4. Patching. Once a threat has been detected and successfully eradicated, it’s time to make sure that you eliminate the vulnerability that allowed the malware into your system. Installing technology that proactively patches known vulnerabilities before they can be exploited by cybercriminals minimizes the amount of time attackers can potentially exploit vulnerabilities. Having web security tools in place that can automatically scan for new patches and ensure that they are installed quickly can go a long way toward protecting your website from outside threats.

    For example, businesses running a WordPress, Drupal, or Joomla site gain real-time threat protection by arming their website, blog, or online shop with automated CMS patching, preventing the bad guys from sneaking in between updates—and stopping zero-day attacks in their tracks.
  5. Backup and Restore. It’s important to know that even if a threat slips through the cracks and does real damage to your website, the site can be easily restored. Version control software that enables businesses to back up and restore their website with just the click of a button is now widely available to those who recognize the importance of this technology. Many tools will even automatically create backups at certain intervals, making life as simple as possible for business owners.

    It’s difficult to overstate the value of effective backup and restore tools. They ensure that even in a worst-case scenario where an attack cripples your entire website, you remain just one click away from restoring what was lost. That peace of mind enables small business owners to focus on the hundreds of other things they need to worry about, secure in the knowledge that their website is in good hands.
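To make the first of these capabilities concrete, here is a minimal sketch, using only Python’s standard library, of the kind of certificate-expiry check that automated certificate management performs behind the scenes. This is an illustration of the general technique, not the implementation of any product mentioned above; the helper names are our own.

```python
import ssl
import socket
from datetime import datetime, timezone

def days_until(not_after: str) -> float:
    """Days remaining, given a certificate's 'notAfter' string
    (e.g. 'Jun  1 12:00:00 2035 GMT')."""
    expires = ssl.cert_time_to_seconds(not_after)
    return (expires - datetime.now(timezone.utc).timestamp()) / 86400

def cert_days_remaining(hostname: str, port: int = 443) -> float:
    """Fetch a site's TLS certificate and return days until it expires."""
    ctx = ssl.create_default_context()
    with socket.create_connection((hostname, port), timeout=10) as sock:
        with ctx.wrap_socket(sock, server_hostname=hostname) as tls:
            cert = tls.getpeercert()
    return days_until(cert["notAfter"])
```

A monitoring job might call `cert_days_remaining("example.com")` daily and alert (or trigger renewal) when the result drops below, say, 30 days — exactly the lapse scenario that automated renewal is designed to prevent.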

CAPTION: Five simple, automated cybersecurity technologies enable website owners to achieve big-business web security and peace of mind, using small-business resources.

Strong Website Protections Help Small Businesses and Their Customers

As breaches become both more numerous and more costly, small businesses must avoid putting themselves in a vulnerable position. Fortunately, as the threat landscape evolves, TLS/SSL and hosting providers are evolving as well. It is now easier than ever to protect online assets from malware and data breaches with new suites of products capable of everything from automatic certificate renewal to patching and remediation.

As simplified website security tools become more widely available, small businesses are increasingly able to enjoy a level of protection on par with much larger companies. The rise of automation has put powerful detection, protection, and recovery tools in the hands of the most resource-constrained and vulnerable organizations. And while no one tool can protect against every possible threat, SMBs with effective web certificate, threat detection, remediation, patching, and backup and restore capabilities will find themselves well-positioned to face whatever threats the future may hold.

Bio

As President of Partners and Channels, Michael Fowler is responsible for developing and maintaining channel partnerships with leaders in key growth markets. Michael has more than 15 years of experience in web security and works closely with Sectigo product management, engineering, marketing and support to develop product refinement and go-to-market strategies.

How IoT Adaptation in 2020 Will Boost Manufacturing Profit, Not Destroy Jobs

By Darren Sadana, CEO, Choice IoT

IoT platform spending is expected to increase at a compound annual growth rate of 40 percent over the next few years, from $1.67 billion in 2018 to $12.44 billion in 2024, globally. Industry 4.0, the fourth industrial revolution, is just now getting underway and will transform conventional manufacturing methods like never before.

More intelligent sensors are gathering and transferring larger amounts of data at faster speeds, and are now capable of making decisions on the spot. Their agility makes them ideal substitutes for the large software-driven manufacturing execution systems currently in place.

These valuable changes, now active in manufacturing plants as part of Industry 4.0, are substantially boosting manufacturing ROI instead of putting people out of work.

Presently, the bulk of manufacturers considering the use of industrial IoT are focused on assessing what digital infrastructures need to be in place to ensure Industry 4.0 takes off smoothly.

Industry 4.0 comprises the “Internet of Things” (IoT) and smart manufacturing, marrying conventional operations of standard manufacturing with smart digital technology. The basic technologies that fall under 4.0 are artificial intelligence (AI), 3D printing (additive manufacturing) and blockchain.

The result? A better, faster ecosystem for companies to process supply chain management in real-time.

How Manufacturers Can Make the Most Out of IoT

Fundamental technologies like AI and blockchain rely on constant access to each other and to the cloud. This process depends on 24/7/365 wireless connectivity, which confers on manufacturers an essential commodity: the ability to obtain accurate, current information, including pricing and contract requirements.

With the primitive central application that determines output gone, real-time data will drive an explosive increase in data analysis, making it absolutely imperative to build actions around the data gathered.

Still, speed will not come at the cost of accuracy as the intelligence of these machines grows, and only those manufacturers who can make-to-stock, make-to-order, and assemble-to-order will win.

In addition to increased efficiency, predictive maintenance will become conventional. No longer will idle time associated with repairs be a concern, because with IoT adaptation, sensors will monitor and analyze multiple signals and alert operators to machines that require servicing.

Since most US manufacturing plants are at least 20 years old, their in-house machines are not equipped to operate in an Industry 4.0 environment (and are also much more prone to breakdowns).

These breakdowns account for up to $50 billion per year in lost manufacturing time, something IoT adaptation can mitigate.

With discrete manufacturing and predictive maintenance, IoT will contribute to worker safety by directing workers to emergency evacuation routes, safeguarding them from serious accidents. With new and improved inventory and equipment tracking, thousands of man hours will also be saved, and businesses will see an expansion in profit without inflation in data costs.

Industry 4.0 and The Age of 5G Manufacturing: Creating More Jobs

Despite saving man hours, more jobs will be created: those “saved” hours will be redirected into manufacturing new and smarter devices. With 5G’s multi-trillion-dollar rollout, the number of new jobs predicted for manufacturing alone will triple, according to the World Economic Forum via Forbes.

According to that same World Economic Forum research, “machines and algorithms in the workplace are expected to create 133 million new roles but cause 75 million jobs to be displaced by 2022.” Today, businesses are waking up to the new staffing and organizational demands of IoT.

At the industrial level, IoT will increase the use of robotics, automation, and analytics, creating a higher demand for cognitive occupations, increasing productivity, and producing a more engaging work experience.

As with every industrial revolution prior to Industry 4.0, there will be a net increase in jobs. Technology always opens new opportunities, so we must account for the shifting job roles that IoT adoption will continuously bring.

Just like the Amazons of the world emerged after the crash of the dotcom bubble, only made possible by higher internet speeds and faster data transmission, a new era of creative destruction is now on the rise—paving the way for the Amazons and Facebooks of tomorrow.

Healthcare, hospitality, transportation and numerous additional industries will be radically transformed, but only companies taking advantage of IoT will thrive, along with employees who commit to knowledge-intensive sectors.

Vending machines will be gamified so consumers can engage with interactive games to win prizes and create loyalty. Retina scanners can be installed to read customer reactions to gain insight into how they react to different packaging, messaging, colors and games.

IoT will impact additional industries such as healthcare as well, with similar gamification or new devices. Leveraging IoT will allow healthcare providers to make better, faster care decisions including, but not limited to, smart pills, robotics, and Real-Time Health Systems (RTHS).

In the security industry as well, the use of IoT will spike.

Preparing for Innovation while Controlling Costs: 5G Management

Industry 4.0 will transform plants into digital powerhouses, especially as 5G becomes a non-negotiable utility for consumers and dependency on IoT concepts increases.

5G will transmit data faster—which in turn will cause some devices, for various reasons, to work overtime. This will increase costs, and companies won’t realize this until they get the carrier bills at the end of the month and see that the devices went over their allotted MBs.

This makes having the right IoT wireless connectivity partner critical. Platforms must be able to deliver huge amounts of data down to the session level, and be capable of deploying thousands of devices at a time with error-free provisioning.

With 5G wireless connectivity rolling out right now, Choice IoT is a master agent for T-Mobile, which has already launched 5G in more than 200 cities. Choice IoT can provide the technology and guidance for solution providers to transfer or build their solutions on the network of the future.

As of now, only 14 percent of machines in current US manufacturing plants are equipped to transmit and receive real-time data. Companies that do not adapt to IoT will be outcompeted by existing solutions in the marketplace and become obsolete very quickly.

Solutions providers need to stay ahead of the curve with R&D to make sure they are competitive in the 5G landscape of tomorrow. As they develop solutions that can take advantage of 5G, they can also increase their profitability and relevance in the marketplace.

Due to intense competition among the wireless carriers, 5G costs are not predicted to increase. The real cost risk comes instead from 5G’s dramatic increase in speed, which invites data expansion, overages, and a growing number of devices utilizing the technology.

The key to controlling data costs is to get data scientists to evaluate the data and see what data solutions bring cost savings. With Edge Computing, for example, smarter devices and sensors will reduce the need for data to be brought to a central cloud to be analyzed.
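As a rough illustration of the edge-computing idea just described, the sketch below (plain Python, with hypothetical sensor data) summarizes a batch of readings locally and forwards only statistical outliers, so most raw data never has to travel to a central cloud for analysis. The threshold and data are illustrative assumptions, not figures from any vendor.

```python
import statistics

def edge_filter(readings, threshold=3.0):
    """Summarize a batch of sensor readings locally and keep only
    anomalies (beyond `threshold` standard deviations from the mean)
    for upload to the central cloud."""
    mean = statistics.fmean(readings)
    stdev = statistics.pstdev(readings)
    anomalies = [r for r in readings
                 if stdev and abs(r - mean) > threshold * stdev]
    summary = {"count": len(readings), "mean": mean, "stdev": stdev}
    return summary, anomalies

# A batch of 1,000 temperature readings shrinks to a 3-field summary
# plus a single outlier worth sending upstream.
readings = [20.0] * 998 + [20.5, 95.0]
summary, anomalies = edge_filter(readings)
```

The payload sent to the cloud here is a summary plus one anomalous reading instead of 1,000 raw values, which is the kind of data-volume reduction that keeps carrier bills under control.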

A good connectivity platform pairs data monitoring with alerts and analysis, helping businesses see where data leakage is happening and whether that data is relevant.

Most importantly, drilled-down usage by session level is key to good data analysis. This can help identify rogue software and assist solution providers in minimizing data connectivity and storage costs as 5G rapidly approaches.

For example, the vending machine, a common and universally manufactured product, is closely tied to micro markets. This automated, self-checkout technology operates unattended, keeping labor costs down. But it would not be possible without a connectivity partner offering a real-time IoT SaaS or PaaS platform that can control devices at the platform level and send notifications of rogue or stolen devices incurring roaming charges. Without such partners, heavy carrier charges could cost these micro markets the momentum they have gained.

The Revolutionary Potential

This fourth industrial revolution follows the transitions from handwork to machinery in the late 18th century, and then to computerization beginning in the 1950s. This is a very exciting time: devices can communicate with each other and make decisions without the data having to pass through a central server. This greatly increases the scope and possibilities of new solutions being developed in the marketplace, and consumers will benefit from a better quality of life.

With IoT and artificial intelligence (AI) converging to form a powerhouse of smart manufacturing, its arrival will not be paused. Experts also predict that the total global bill for the 5G rollout will exceed $2.7 trillion by the end of 2020.

Bio

Darren Sadana is CEO of Choice IoT, a master agency for T-Mobile, which has already launched 5G in over 200 cities. Sadana’s first-in-industry IoT SaaS is bringing a new era of wireless connectivity cost control to millions of Internet of Things devices in new smart cities as well as the transportation, healthcare, manufacturing, security, retail, hospitality, engineering, and energy industries in the US and globally.

Unmanned Edge Operations Are the Future

By Michael C. Skurla, Chief Technology Officer, BitBox USA

The growth of edge is an interesting phenomenon. The rise of public cloud and centralized computing paved the way to hybrid cloud and decentralized computing, and the rise of edge computing closed the resulting IT infrastructure gap with edge data center deployments. Within a distributed infrastructure, however, the IT ecosystem demands a mix of telecom and web services.

Whether on-premises or closer to end-users, edge computing complements current public cloud and colocation deployments.

The increased demand for connectivity driving data proliferation puts IoT in a critical role as an edge enabler. But adding more “client” devices to networks isn’t IoT’s only role within an edge ecosystem. The often-overlooked side is the IoT technology required to enable edge operations themselves.

While cloud computing shifted the data center to a third-party network operations center (NOC), it didn’t eliminate the on-premises data center operators who manage and respond to facility problems. Edge introduced a new challenge to network operations: autonomous management with limited access to individuals local to the equipment who can address problems or perform maintenance. The new norm no longer keeps in-house IT staff, equipment, and machines under one or several roofs; it distributes data center operations across thousands of smaller facilities, most of which are not reachable by a short drive or walk.

Describing the edge as, “the infrastructure topology that supports the IoT applications,” Jeffrey Fidacaro, Senior Analyst for 451 Research Data Centers, underscores the importance of building a “unified edge/IoT strategy” that taps into multiple infrastructure options to manage the onslaught of IoT and facility systems while dealing with the needs of constant change.

Interestingly, the platforms around IoT solutions, not the hardware itself, are the answer to this quandary. Based on IT standards, IoT sensing and monitoring hardware offers granular, a la carte-style monitoring solutions. These solutions are often easy-to-install, flexible form-factor hardware packages that equip small sites, from shelters down to small electrical enclosures. Since these devices offer a multitude of functions and data points, they make reliable and remote facility management possible.

For instance, the sensing technology of ServersCheck allows granular site data to be generated from hardware, complementing an IoT platform that monitors large numbers of sites in concert while also tying in more complex control sub-systems such as HVAC, generators, access control, and surveillance equipment. These IoT platforms expand monitoring and remote management to a global scale, allowing customized alarming, reporting, dashboarding, and more for a geographically distributed portfolio of locations.

This style of IoT management solution allows a flexible, customized design for each site. Its scalable infrastructure reduces the need for NOCs to monitor multiple separate software packages to determine conditions at each site. This facilitates rapid remote diagnostics and a triage of problems before dispatching staff to remedy issues.

Edging to Cellular Levels

Telecommunications keeps pushing further to the edge. In particular, remote monitoring is more crucial than ever, with the planned 5G rollout that ensures rapid growth of small-cell technology piggybacking on shared infrastructure such as streetlights, utility poles, and existing buildings.

As wireless transmitters and receivers, small-cell technology design allows network coverage to smaller sites and areas. Compared to the tall cell towers enabling strong network signals across vast distances, small cells are ideal for improving the cellular connectivity of end-users in densely developed areas. They play a crucial role in addressing increased data demands in centralized locations.

The rapid scalability of small cell technology can not only meet the demands of 4G networks, but can also easily adapt to 5G rollouts to expedite connectivity functions closer to the end-users. In clustered areas, small-cell technology allows for far superior connectivity, penetrating dense areas, and in-building sites.

Consider small-cell technology as the backbone of the fourth industrial revolution. By carrying even greater amounts of data at higher speeds, small-cell technology empowers IoT devices to receive and transmit far more data. It also enables 5G, given that technology’s density requirements.

Enterprises face a flood of data from IoT connectivity. In fact, Cisco estimates this data flood to reach 850 zettabytes by 2021. This is driving edge buildouts of all sizes and shapes. To accomplish this, edge operators must rethink how they manage and monitor this explosion of sites. IoT platforms have proven to have the scalability and flexibility to take on this challenge in a highly affordable way.

As Forrester research predicted, “the variety of IoT software platforms has continued to grow and evolve to complement the cloud giants’ foundation IoT capabilities rather than compete with them” and it expects the IoT market to continue to see dramatic and rapid change in coming years.

It’s time for the technology that edge is being built to support – IoT – to play a role in managing the critical infrastructure that enables it. IoT platforms can tie the knot for this marriage.

Bio

Michael C. Skurla is the Chief Technology Officer for BitBox USA, providers of the BitBox IoT platform for multi-site, distributed facilities’ operational intelligence, based in Nashville, Tennessee. Mike’s in-depth industry knowledge in control automation and IoT product design sets cutting-edge product strategy for the company’s award-winning IoT platform.

2020 Cloud Market Predictions: The Future Looks Bright

By Mark Kirstein, Vice President, Products at BitTitan

For many people, the New Year is a time for reflection on the year gone by and an opportunity for renewed commitment to progress and goals. The same is true for businesses. As we embark on a new year and a new decade, many businesses are trying to anticipate where the market is headed so they can make strategic plans that will result in success.

Many things could influence market conditions around the world this year, from the 2020 Olympics in Tokyo, to the U.S.–China trade war, to the U.S. presidential election. While the U.S. surplus in exported services is shrinking overall, this trend is not expected to have a negative impact on the cloud services sector. Read on for our top six predictions for the cloud market in 2020.

  1. SaaS growth will continue

Currently, the cloud is a $200 billion market, yet overall IT spending is in the trillions of dollars. This means that spending for on-premises (on-prem) software and services remains strong. Is this a bad sign for the cloud market? Absolutely not. We anticipate the global cloud services market for 2020 to continue to grow in excess of 20 percent. Many organizations are moving to the cloud in stages and there are several factors that will keep migration in forward motion. These include increased confidence in and reliance on cloud services, the phase-out of on-prem software like Microsoft Exchange 2010, and continued aging of hardware and infrastructure. While we expect most companies to make conservative spending decisions in 2020, decisions related to the cloud are fundamental to operations, particularly for global companies, and not as likely to be put on the back burner. We will see continued innovation of SaaS services and offerings, coupled with organizations migrating closer to an “all-in” adoption of the cloud. There is a lot of opportunity ahead for SaaS.

  2. Cloud-to-cloud migrations will continue to rise

While companies are continuing to migrate from on-prem to the cloud, we expect to see a continued uptick in cloud-to-cloud migrations as more companies devote attention to optimizing their cloud footprint. Currently, a majority of BitTitan’s business is cloud-to-cloud migrations. The historical concerns of cloud security, reliability, quality, and SaaS-feature parity have largely been addressed, but companies are continually searching for the provider that can deliver the most value for their IT dollars. Businesses want the ability to move their data while avoiding the perils of vendor lock-in. Furthermore, maintaining a multi-cloud environment allows companies to better manage business risks.

  3. The use of containers will increase

Containerization, which packages up software code and all its dependencies so the application runs quickly and reliably and can be moved from one computing environment to another, has achieved mainstream adoption and will continue to be a strong market segment in 2020. Containers offer a great deal of flexibility and reduce the risks for companies moving to the cloud. They reduce infrastructure costs, accelerate and simplify the development process, result in higher quality and reliability, and reduce complexity for deployments. Containers also aid in cloud-to-cloud migrations. Businesses that use containers can easily run them on Google Cloud today and switch to other platforms like Azure or Amazon Web Services (AWS) tomorrow without complex reconfiguration and testing. This allows businesses the freedom to shop for the right cloud environment. This is one of the reasons the container market is growing at a rate of more than 40 percent, and we expect that growth will continue.

  4. Microsoft and Google will seize market share from AWS

Of the top three public cloud providers today, AWS was first to market and has enjoyed a considerable lead in market share. AWS has been particularly appealing for companies that want to provide “born in the cloud” services. But in 2020, we expect the two other top public cloud vendors – Microsoft Azure and Google Cloud – to make significant inroads and take market share away from AWS. Part of this is simple math: With such a big slice of the market, it will be hard for AWS to maintain its rate of growth. And the competition is getting stiffer. Microsoft is doing a great job of appealing to enterprises who are grappling with legacy infrastructure. Google also is making significant investments in its cloud computing unit. Its technology is already very good and easy to use, which will make Google a force to be reckoned with. Another trend we are likely to see is that smaller public cloud vendors will drop out or choose to focus their business on the private cloud infrastructure market, where they are more likely to excel.

  5. The market will expand and consolidate

As the cloud market grows, the ecosystem will expand with new types of solutions and capabilities to manage and streamline cloud deployments, increasing the value of investments in the cloud. On average, companies using cloud technologies use five different cloud platforms. We will continue to see new and improved offerings that help companies assess, monitor, and manage their cloud footprints to reduce costs and improve security. As new, compelling cloud solutions enter the market, we are likely to see more consolidation, with Amazon, Microsoft, and Google continuing to acquire new solutions to enhance their own offerings.

  6. 5G will usher in the next level of cloud adoption globally

Recently, Ericsson Mobility predicted that there will be 1 billion 5G subscriptions by 2023, and that they’ll account for almost 20 percent of all global mobile data traffic.[1] Besides the massive increase in speed 5G technology provides, it also comes with a remarkable decrease in latency. While 3G networks had a latency of nearly 100 milliseconds, that of 4G is about 30 milliseconds, and the latency for 5G will be as low as 1 millisecond, which most people will perceive as nearly instant. With this type of performance, we believe that cloud-based services will become more reliable and efficient. Not only that, but 5G may also accelerate cloud adoption in countries that lack wired infrastructure today.

Without a crystal ball, there is no way to know for sure what the market landscape will look like in the coming months. But by analyzing recent trends and considering their implications for the future, companies can take a forward-looking approach that will position them to stay ahead of the curve and be ready to seize opportunity as it arises. This year is looking bright for the cloud.

Bio

Mark Kirstein is the vice president of products at BitTitan, leading product development and product management teams for the company’s SaaS solutions. Prior to BitTitan, Mark served as the senior director of product management for the mobile enterprise software division of Motorola Solutions, continuing in that capacity following its acquisition by Zebra Technologies in 2014. Mark has over two decades of experience overseeing product strategy, development, and go-to-market initiatives.

When not on the road coaching his daughter’s softball team, Mark enjoys spending time outdoors and rooting for the Boston Red Sox. He holds a bachelor’s degree in computer science from California Polytechnic State University.

[1] “How 5G Will Accelerate Cloud Business Investment,” CompareTheCloud.net. Retrieved December 17, 2019.

5 Must-Have Security Tips for Remote Desktop Users

By Jake Fellows, Associate Product Manager, Liquid Web

Windows servers are typically managed via the Remote Desktop Protocol (RDP). RDP is convenient and simple to set up, and Windows has native support on both the client and the server. RDP clients are also available for other operating systems, including Linux, macOS, and mobile operating systems, allowing administrators to connect from any machine.

But RDP has a significant drawback: It’s a prime target for hackers.

In late 2018, the FBI issued a warning that RDP was a vector for a large number of attacks, resulting in ransomware infections and data thefts from many businesses, including healthcare businesses.

In 2019, researchers discovered several critical vulnerabilities in RDP that impacted older versions of Windows. The BlueKeep vulnerability is a remote code execution vulnerability that allowed an unauthenticated user to connect to the RDP server and execute arbitrary code via malicious requests. Additional security vulnerabilities were discovered later in the year.

Businesses use RDP because it is the most convenient way to provide remote desktop services for Windows servers and desktops, but it’s a relatively old protocol that was not initially designed with modern security best practices in mind.

However, RDP can be made more secure with a few configuration changes and best practices.

Avoid Guessable Passwords

Windows servers are often compromised via dictionary attacks against RDP. Attackers know hundreds of thousands of the most commonly used passwords, and it’s trivial to script a bot that makes repeated login attempts until it discovers the correct credentials.

It isn’t just the usual suspects such as “123456” or “pa55word” that should be avoided. Any simple password you can think of is likely already in a dictionary culled from leaked password databases. It is also important to ensure that you don’t reuse passwords that you use elsewhere on the web.

If you are the only administrator who manages the server, be sure to generate a long and random password that attackers can’t guess. If other people access the server over RDP, consider using the built-in Password Policy system to implement policies that enforce minimum complexity and length requirements.
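For a lone administrator, a long random password can be generated with a few lines of Python’s standard `secrets` module, which draws from a cryptographically secure random source. The 24-character length and full-punctuation alphabet here are illustrative choices, not Windows requirements.

```python
import secrets
import string

def generate_password(length: int = 24) -> str:
    """Generate a long, random password that won't appear in any
    dictionary of leaked or commonly used passwords."""
    alphabet = string.ascii_letters + string.digits + string.punctuation
    return "".join(secrets.choice(alphabet) for _ in range(length))

password = generate_password()
```

A 24-character password drawn from roughly 94 symbols has far more entropy than any human-chosen phrase, putting it well beyond the reach of the dictionary attacks described above.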

Update RDP Clients and Servers

Attacks against RDP frequently exploit vulnerabilities in outdated server and client software. Older versions of RDP also lacked the cryptographic protections of more modern versions. As we have already mentioned, it is not uncommon for serious vulnerabilities to be discovered in older versions of RDP.

When the BlueKeep vulnerability was discovered, Microsoft quickly released a patch that, if installed, would close the security hole. But clients and servers only benefit from that protection if they are regularly updated.

Windows Server can update RDP automatically via Windows Update, but server administrators should verify that they are running the most recent version. Automatic updates can be turned off, and many administrators avoid them to prevent the disruption an unplanned update might cause, so it’s always worth checking that your servers are running patched, secure versions of RDP.

Don’t forget to update third-party RDP clients, too.

Connect Over RD Gateway or a VPN

Using RDP over the internet without an SSL tunnel is dangerous. RDP encrypts traffic flowing between the client and the server, but connections using the legacy RDP security layer rather than TLS are vulnerable to man-in-the-middle attacks; in addition, an exposed RDP port invites brute-force and denial-of-service attacks. Because of the security risk of exposing an RDP server to the open internet, it’s a good idea to put it behind a gateway that provides better protection.

An RD Gateway allows clients to create an SSL-protected tunnel before connecting to the RDP server. The RDP server only accepts connections from the gateway. It is not exposed to the open internet, limiting the attack surface and preventing attackers from directly targeting the server.

Connecting over a VPN is a reasonable alternative to an RD Gateway, but it is less secure and may introduce unacceptable latencies.

Restrict Connections Using the Windows Firewall

If you know which IP addresses will connect to your RDP server, you can use the firewall to restrict access so that connections from any other address are rejected. In Windows Defender Firewall, this is done by adding the permitted addresses to the Scope settings of the inbound Remote Desktop rule.
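The allowlisting logic itself is simple. As an illustration of the underlying check (this is not the Windows Firewall API, and the network ranges below are documentation examples, not real addresses), Python’s `ipaddress` module can test whether a client address falls inside a permitted range:

```python
import ipaddress

# Hypothetical allowlist: only these networks may reach the RDP port.
ALLOWED_NETWORKS = [
    ipaddress.ip_network("203.0.113.0/24"),   # office network (example range)
    ipaddress.ip_network("198.51.100.7/32"),  # administrator's home IP
]

def is_allowed(client_ip: str) -> bool:
    """Return True if client_ip falls inside any allowlisted network."""
    addr = ipaddress.ip_address(client_ip)
    return any(addr in net for net in ALLOWED_NETWORKS)

print(is_allowed("203.0.113.45"))  # inside the /24 office range
print(is_allowed("192.0.2.10"))    # outside every allowed range
```

A firewall applies the same membership test at the packet level, before the RDP service ever sees the connection.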

Change the Default RDP Port

The default RDP port is 3389, and that’s where most brute-force attacks are directed. Changing the port is a straightforward way to reduce the number of bot attacks against your server.

To change the RDP port, set the PortNumber value under the following registry key to the new port number:

HKEY_LOCAL_MACHINE\System\CurrentControlSet\Control\Terminal Server\WinStations\RDP-Tcp\PortNumber
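For reference, the same change can be made from an elevated PowerShell session. The port 3390 below is only an example; pick an unused port, allow it through the firewall, and restart the Remote Desktop Services service afterward:

```
Set-ItemProperty -Path 'HKLM:\System\CurrentControlSet\Control\Terminal Server\WinStations\RDP-Tcp' -Name PortNumber -Value 3390 -Type DWord
```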

Changing the port is not a substitute for implementing the other security tips we’ve mentioned in this article. While it may be enough to confuse unsophisticated bots and inexperienced hackers, a simple port scan will reveal the new port to any determined attacker, so changing it is not sufficient on its own to protect RDP from attack.

It is possible to implement even stricter security strategies to protect your RDP server from attacks, including the addition of two-factor authentication. However, following the tips we have outlined here will be enough to keep your server safe from the vast majority of attacks.

Bio

Jake Fellows is an Associate Product Manager for Liquid Web’s Managed Hosting products and services. He has over ten years of experience involving several fields of the technology industry, including hosting, healthcare, and IT system architecture.