More companies are migrating their infrastructure to the cloud to take advantage of benefits like reliability, scalability, and lower costs, but cloud migration remains a complex task requiring careful consideration and planning. (For tips on planning a secure and frictionless migration, download our ebook.) Choosing the right security technology is just one of many considerations, but it’s a critically important one, particularly for organizations in sectors such as healthtech and healthcare. Not only do organizations in these areas need to protect large volumes of sensitive patient and institutional data, but they can also face serious penalties for violating privacy regulations.
When it comes to security, it is often advantageous to choose an integrated platform such as Threat Stack’s Cloud Security Platform® in order to gain visibility across all cloud providers, including hybrid cloud environments, in a single dashboard. Threat Stack also enables full stack security observability, which provides organizations with contextualized information from every part of the cloud throughout the entire software development lifecycle.
To learn more about the most common (and most costly) misconceptions companies have about security technology when considering cloud migration, we reached out to a panel of cloud security experts and asked them to answer this question:
“What’s the most costly misconception that companies are clinging to when it comes to security technology and cloud migration?”
Meet Our Panel of Cloud Security Experts:
Read on to learn what our experts had to say about the most costly security technology misconceptions related to cloud migration.
Anthony Buonaspina, BSEE, BSCS, CPACC is the CEO and founder of LI Tech Advisors. He has over 30 years’ experience in hardware design and software programming. Anthony specializes in computer and cloud networking, has collaborated with clients and third parties on custom software designs, and is an expert in web and accessibility compliance.
“The main misconception that companies cling to when it comes to security technology when moving to the cloud is…”
That the cloud is NOT secure. In fact, cloud servers are often more secure than on-premises servers.
This should be a no-brainer since many companies cannot afford the level of security that is necessary to keep their data and clients’ data safe. When you move to the cloud, you are gaining the combined resources of a much larger organization that can afford to put in place the highest levels of security that would otherwise be unaffordable for an individual company. You are also offloading the responsibility of maintaining the latest hardware, firmware, and software updates to a company that has the resources to perform these necessary and timely tasks.
Many small and medium-sized companies that don’t have the time or expertise to manage these tasks will benefit from having a cloud-based environment where they will be able to focus on increasing their business. Energy flows where attention goes, and you want your attention on the growth of your business and not having to worry about installing the latest security updates.
Don Baham is the President at Kraft Technology Group, LLC. Don has more than 17 years’ experience in information technology with a blended background in technology consulting and architecture, information security, and business development. He was most recently employed by D+H U.S. Operation.
“Buying the next generation firewall or advanced endpoint solution does not…”
Make your business any more secure than it was the day before you made the purchase. Companies with large security budgets are still breached. Businesses need to make sure they have the basic cybersecurity hygiene elements in place first. After those practices are established, make the investment into advanced security tools. Allow your security team or MSSP to properly implement the tools and tune them for your environment so it is easier to identify the exceptions to normal network or user behavior.
Gabe Turner is Director of Content at Security Baron, a website focused on cybersecurity, home security, and the smart home.
“Many people think that once their company data is saved to a cloud, their office is digitally secured…”
While the cloud does store your encrypted files in its vault, the company is still responsible for securing the applications, data, virtual network, and operating systems. The company itself is in charge of the security in the cloud, including customer data, firewall and network configuration, identity and access management, and more. When you consider that the global average cost of a data breach was $3.86 million in 2018, according to a study from IBM Security and the Ponemon Institute, it becomes even more urgent to have additional security practices over and above the cloud itself.
Scott Gallupe started 403Tech after working for a number of businesses, including one of the leading IT consulting companies in North America. Due to his experience and expertise, Gallupe was able to recognize the need for quality IT services and support in the small and medium business sector.
“The biggest misconception when it comes to migrating to the cloud is that…”
Everything is backed up. There seems to be a misunderstanding that once something is in the cloud, it is safe and backed up forever. Although some devices do provide this, it is not consistent across the board. SMBs need to be careful, as failure to have a plan in place may lead to data inconsistency or even loss. We always recommend chatting with an expert regarding your cloud migration.
Charles Lobert is the VP of Sales and Marketing at Vision Computer Solutions. He has been in the IT industry for nearly two decades and with VCS since 2004. Throughout the years, Lobert has held nearly every position at VCS and is responsible for several major organizational shifts within VCS.
“I believe that the most costly misconception that companies have when it comes to cloud migrations is that…”
It does not require in-depth planning and expertise and can be completed by anyone with a little technical knowledge. Many cloud providers sell their services in a way that encourages business owners to believe exactly that, because it makes the cloud more appealing. The reality is that migration is only easy if you have experience doing it. Working with someone who has a track record of successful cloud migrations is key to being happy with the results when everything is said and done.
To be successful, you need to properly assess the workloads that you are interested in moving, design the cloud infrastructure to be able to support the workload, and then have a plan in place for the migration. When done properly, the end users should think that it was a very easy move, because for them it will be. However, that can only happen if a lot of work has been done in the background, in preparation and planning. Internally, our engineers have to spend hours each month in training and planning to make sure that we stay up to date on the latest cloud technologies and migration methods. Without that, we wouldn’t be able to provide the migration experience that our clients expect.
So don’t think that it is just as easy as saying, “I want to move this server from my office to the cloud,” and expect that anyone with some technical ability can do it. Make sure you work with a cloud expert to ensure that you aren’t pulling your hair out in frustration as you work through the migration.
Jeremy Kaplan is the VP of Sales & Marketing at NOYNIM. NOYNIM is Colorado’s leader in managed IT solutions, and they are making huge strides in innovating the small- to medium-size business security and IT sector.
“In most cases, large companies make acquisitions without doing…”
Their due diligence, never exploring the IT environment of the company they are acquiring. There can be mysteries lurking below the surface, and those unknown factors could have severe consequences for your company in the future. Our team at NOYNIM recommends contacting a third-party managed IT services and security firm to perform an assessment of that company’s IT infrastructure prior to executing the purchase.
By pulling back the curtain and seeing how their IT department functions, you may find the good, the bad, and the ugly security risks that you will be responsible for fixing if you choose to proceed with the acquisition.
Key areas of concern:
- The company has outdated systems and doesn’t have proper maintenance agreements in place.
- The company lacks a qualified IT provider and/or internal staff to administer IT systems.
- They lack a Disaster Recovery and/or Business Continuity Plan.
- Documentation is out of date or non-existent.
- They lack processes and procedures surrounding IT management.
- They lack automation and centralization of IT processes.
Eric Leland is the Principal and Partner at FivePaths LLC.
“Small businesses make up the vast majority of businesses in the USA and face particularly daunting challenges when it comes to assessing security during cloud adoption and migration…”
Firms with fewer than 20 workers make up nearly 90 percent of all businesses in the USA. These firms often lack the in-house technical expertise to properly understand their own security responsibility when it comes to cloud systems, and they make costly errors as a result.
A very common error small businesses make is assuming that a cloud service provides essential data protection and recovery, as basic as backups, encryption, and software patch management. For example, many businesses migrate to cloud-based website solutions that deploy web content management tools that require minimal but critical ongoing maintenance to stay efficient, safe, and secure. The services are often marketed as cloud to imply full service, when in fact the small business needs to provide for a basic level of maintenance to avoid potential disaster down the road.
After migrating from a less capable to a more capable web system, the complexities and maintenance demands are often not apparent to the customer until later, after problems creep in due to inattention to maintenance. The good news is that some web services firms (such as fivepaths.com for Drupal CMS and valet.io for WordPress CMS) have built lower-cost models for filling the gap in services not provided by many cloud solutions adopted by small businesses, so they do not have to be left unprotected from common points of failure.
Chris is BlueVoyant’s Chief Security Officer. Prior to BlueVoyant, Chris was Booz Allen’s Chief Engineer for Commercial Cyber Engineering Services and Data Protection Solutions. He is an experienced advisor and has assessed, designed, and built information security programs for a variety of large organizations.
“There’s a pretty big list to choose from…”
But if I were to pick one as the most costly misconception with respect to security technology in the cloud, it’s the notion that “my Cloud Provider is handling security (including all the technology) for me.”
While AWS, GCP, and Azure have all recently released excellent security capabilities on top of their respective cloud offerings, too many companies are slow to fully operationalize them. It’s analogous to the approach companies took with the “traditional” security perimeter in the mid to late 2000s: DO buy the latest firewall appliances, in-line network DLP, robust DMZ architecture, network authentication, and other network security tools, but DON’T invest in defense in depth or in the people and processes needed to extract detection and response capabilities from the technology stack.
To a large extent, security in the cloud is following the same path. Companies are saying “The Cloud is great. I don’t need to worry about networking anymore because that’s handled by my provider, and now all the network security is handled too.”
The reality is that network security was always necessary but never sufficient, and while compromising a cloud security perimeter is more challenging, advanced adversaries have already created repeatable processes to do this at scale. Moreover, it’s deja vu all over again with companies seeming to think that the technology (Security Hub, or Security Command Center, etc.) is the full solution to cloud security: Without skilled people and well-defined procedures, even the most sophisticated (with “advanced analytics” and “AI with Machine Learning”) cloud security tools can’t detect and respond for you.
To be secure in the cloud, you need full awareness of your plant and the fleet:
- What are all of your virtual assets (Accounts, VPCs, Endpoints of all varieties, Users/Roles, CI/CD pipelines) and what do they do? What workloads do they support?
- How do your assets normally behave (what are normal data flows between assets, what are normal configurations, what is normal behavior for each user/service account, and what are the typical resource expenditures)?
- Who does what with what assets? Know your user and service accounts like the back of your hand, and have strong attribution of all activity back to a user account — service accounts are necessary, but make sure you can tie those back to a user also!
- When something goes wrong, do you have response procedures and the right skill sets on hand to investigate all facets of your cloud environment (applications, infrastructure, virtual devices) without shutting down the plant and breaking the business?
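The baselining questions above lend themselves to simple automation. As a rough illustration only (not any vendor’s actual detection logic), here is a minimal Python sketch that flags an account metric, such as hourly API-call volume, when it deviates sharply from that account’s own history; the z-score threshold is an arbitrary example value:

```python
from statistics import mean, stdev

def is_anomalous(history: list[float], latest: float, z_threshold: float = 3.0) -> bool:
    """Flag a metric (e.g., an account's hourly API-call count) that
    deviates sharply from its own historical baseline."""
    if len(history) < 2:
        return False  # not enough data to establish a baseline
    mu, sigma = mean(history), stdev(history)
    if sigma == 0:
        return latest != mu
    return abs(latest - mu) / sigma > z_threshold

# A sudden spike in call volume stands out against a steady baseline.
print(is_anomalous([100, 110, 95, 105, 102], 600))  # True
```

A real deployment would baseline per asset and per account, as the checklist above suggests, rather than use a single global threshold.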
Dmitry Sotnikov is Vice President of Cloud Platform at 42Crunch. 42Crunch is the only enterprise-grade API security platform that addresses the development, testing, and deployment security requirements of an API infrastructure. The 42Crunch Platform can protect SaaS, Web, or IoT APIs, as well as microservices.
“Cloud architectures have different attack surfaces, and companies hoping to simply keep protecting the edge are in for a surprise…”
This is especially true on the API security side:
- Consumed APIs: Your cloud applications will no doubt consume APIs. Your storage might no longer be a local disk drive but AWS S3. Your continuous integration / continuous deployment (CI/CD) system, your source code repository, your identity system — all of these may now be APIs accessed by your application. As the recent Samsung SmartThings snafu demonstrated, failing to lock down access to all of these systems and to keep keys out of reach (especially out of source code repositories and logs) can let attackers quickly escalate their privileges and take over.
- Exposed APIs: All the microservices in your new cloud architecture are APIs too. You can no longer rely on securing your application itself (for example, with two-factor authentication and a WAF) and putting an API gateway at the edge. Attackers will try to bypass these defenses and go directly to each microservice, trying to use it in unexpected ways and to exploit every vulnerability they can find. For LandMark White, the main property valuer in Australia, failure to realize this led to the company losing business, being removed from the stock market, and seeing the CEO and leadership team step down.
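One concrete takeaway from the “consumed APIs” point is keeping credentials out of source code and logs. Here is a minimal Python sketch, reading a key from the environment instead of hardcoding it; the variable name is purely illustrative:

```python
import os

def load_api_key(name: str) -> str:
    """Read a credential from the environment so it never lands in
    source control or application logs (the variable name is illustrative)."""
    value = os.environ.get(name)
    if not value:
        # Fail fast with a message that names the variable, not the secret.
        raise RuntimeError(f"missing credential: set {name} in the environment")
    return value

# Demo only: in production, the deployment platform injects this variable.
os.environ.setdefault("DEMO_API_KEY", "s3cr3t-demo-value")
key = load_api_key("DEMO_API_KEY")
```

A secrets manager offered by the cloud provider is the sturdier option, but even plain environment variables avoid the repository-and-logs leak described above.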
Christopher Gerg is the CISO & Vice President of Cyber Risk Management at Gillware. He is a technical lead with over 15 years of information security experience, dealing with the challenges of information security in cloud-based hosting, DevOps, managed security services, e-commerce, healthcare, financial, and payment card industries.
“The most costly misconception companies are clinging to when it comes to security technology is that the model is the same as having a server room…”
If you’re forklifting your servers (virtually) to the cloud, you must keep in mind that your virtual servers are running on someone else’s computers in someone else’s data center — all on a shared network. Despite software-defined network mechanisms that provide segregation, this is still an underlying truth. (I won’t go into why simply forklifting your servers to the cloud is a bad idea in itself: you are missing out on the scalability, cost savings, and control you get from containerized microservices running on an elegant orchestration fabric like Kubernetes…)
This shared aspect can cause changes in your information security approach. For example, in the healthcare world, HIPAA requires that all traffic between servers be encrypted in a way that the cloud provider has no access to the data being transmitted on the network. This requires non-trivial effort. There are also challenges to traditional controls like intrusion detection, key management, and firewalling. Many of these services require additional (often expensive) tools to run in a cloud environment.
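As one small illustration of encrypting traffic independently of the provider’s network, an application can insist on strict TLS for every service-to-service call. This is only a sketch of the client-side configuration using Python’s standard `ssl` module, not a complete HIPAA control:

```python
import ssl

def strict_client_context(ca_file=None) -> ssl.SSLContext:
    """Build a TLS client context for service-to-service traffic that
    enforces certificate validation and a modern protocol floor."""
    ctx = ssl.create_default_context(cafile=ca_file)
    ctx.minimum_version = ssl.TLSVersion.TLSv1_2  # reject legacy protocols
    # create_default_context already enables these; set them explicitly
    # so the policy is visible in the code.
    ctx.check_hostname = True                     # verify the peer's identity
    ctx.verify_mode = ssl.CERT_REQUIRED           # refuse unauthenticated peers
    return ctx

ctx = strict_client_context()
```

Pass `ca_file` pointing at your own private CA bundle if services authenticate each other with internally issued certificates; key management for that CA is one of the non-trivial efforts mentioned above.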
Sivan Tehila, Director of Solution Architecture at Perimeter 81, is a cyber and information security expert with 13 years of experience in cyber management, defense industries, and critical infrastructures. She is dedicated to promoting women in cybersecurity, having founded the Leading Cyber Ladies community in NYC and Cyber19w in Israel.
“Today’s cyber attacks are more sophisticated and harder to detect…”
Which means sensitive data is more vulnerable than ever. The push to move everything into the cloud over the past several years has generated a large number of misconfigured and exposed deployments of various software.
Many organizations implement a range of innovative security technologies in order to deal with this new generation of threats. To IT managers, deploying those solutions seems essential for improving visibility across a cloud deployment. But combining many tools often creates another challenge, commonly known as “tool sprawl”: investing in a range of security products makes it harder for IT teams to manage and orchestrate them across the network.
The main consequences of this problem are:
- High Cost: Implementing different tools is expensive, and employees and IT teams must be trained to work on each separate platform.
- Less Effective Threat Defense: IT teams face an orchestration challenge when trying to manage a “tool sprawl” environment, which can even put the organization at risk during attack identification or incident response. Instead of responding rapidly to an attack, the organization spends a long time collecting the logs and notifications, understanding what kind of attack it is dealing with, and deciding what action to take.
To gain a high return on investment (ROI), organizations need to move away from juggling many different security solutions and vendors toward a unified platform that makes it easy to use, manage, and consume security services from a range of vendors through the cloud as a service.
Chris Weber is the co-founder of Casaba Security.
“Something I’ve seen is when an IT group tries to transpose their legacy data center topology or configuration onto a cloud platform…”
Especially when moving to PaaS, trying to hold onto ideas about networks, systems, and VLANs will only lead to trouble. Those concepts don’t apply anymore for the most part. When I see people make this mistake, they seem to be doing it out of fear and lack of knowledge of the new cloud environment. But people who have spent their careers building on those legacy concepts need to understand and trust the differences.
Callum Tennent is the Site Editor at Top10VPN.com, a VPN review website and digital privacy research group. Specializing in VPN software, he also writes on digital privacy and online security, always advocating for a free and open internet.
“The most common misconception about contemporary security technology and cloud migration is that…”
Enterprise VPNs (the basis of the traditional network security model and antithesis to cloud-native architectures) are more secure than cloud-based solutions, or even still relevant at all. A VPN’s ability to provide employees, clients, and third parties “secure” remote access to internal applications is still considered a necessity today; yet what was once a simple remote utility has now become an increasingly inefficient and insecure solution. Cloud applications and cloud-native security models are superseding the traditional VPN-reliant system.
Ultimately, using an enterprise VPN perpetuates a traditional and outdated perimeter model in which a user’s position on the network defines their credibility and suitability for access to certain assets. In this situation, the network is not designed around users and the specific resources they need to access, but around branches, premises, and data centers. Operating in this way — based on physical location rather than identity — eventually falls short, especially as the perimeter becomes less clear.
Converting to an access model utilizing an identity-aware proxy (IAP) ensures that users are authenticated not just once or twice at login but continually, and their activities are checked for anomalies. Security teams can then use a cloud-based software-as-a-service solution that streamlines application access without granting entry to an entire privileged network. This allows end users to receive case-by-case access to individual applications specifically mapped to the operator’s identity.
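The IAP model described above can be caricatured in a few lines of Python: every request is evaluated against the user’s identity and the specific application requested, rather than against network position. The access map and request shape below are hypothetical stand-ins for a real identity provider:

```python
from dataclasses import dataclass

@dataclass
class Request:
    user: str
    app: str
    token_valid: bool  # set by the identity provider on each request

# Hypothetical per-application entitlements; a real IAP consults an
# identity provider and policy engine instead of a static dict.
ACCESS_MAP = {"alice": {"payroll", "wiki"}, "bob": {"wiki"}}

def authorize(req: Request) -> bool:
    """Re-check identity and app entitlement on every request,
    not once at network entry."""
    return req.token_valid and req.app in ACCESS_MAP.get(req.user, set())

# bob can reach the wiki but not payroll, regardless of network position.
print(authorize(Request("bob", "payroll", True)))  # False
```

Contrast this with the VPN model, where being “on the network” would have granted bob reachability to payroll by default.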
As more companies choose to host their resources on the cloud rather than internally, VPNs are also becoming increasingly inefficient, as well as less secure. Why route traffic to a VPN concentrator in an enterprise data center to determine trust, then immediately route the traffic back out to a cloud provider to access the requested application? Cloud-native architectures offer far more resource-efficient IT operations, smoother authentication, and reduced risk.
This is not to suggest that the future of remote access is simply unregulated connection to the corporate network. Rather, it is that the majority of services workers access today via a traditional VPN already have a more efficient and secure means of connectivity through their native infrastructure.
The misconception here is that the cloud is not as safe as a company’s on-site servers. This confuses control with security. In truth, 60% of all data breaches come from insiders, not outsiders. Safety depends on whom you trust with your cloud. If you entrust your sensitive data to a smaller, relatively unknown company, you’re probably not very secure. Large, powerful companies like Google, however, are much safer than most companies’ internal servers, with proprietary security protocols and proprietary network hardware. What’s more, since your data is automatically stored across multiple data centers, you’ll still be able to get to it regardless of what happens to any individual one.
Steven Solomon is a business leader with over a decade of sales experience in the information technology industry. He is the Senior Account Executive at Mode2.
“A costly misconception companies can make when they choose to migrate their data to a cloud service provider is that…”
They may not believe adjusting their risk framework is necessary. Access control and identity management are two important topics to prioritize for any cloud service, and you should adjust your risk framework accordingly. For instance, you should password-protect access to all of your cloud services and require multi-factor authentication to gain access to these systems. You also need identity management policies that prevent external users from accessing sensitive data, for example by making access domain-specific to users within your organization. These are common ways to restrict access to your data to authorized individuals, and it starts with adjusting and prioritizing risks specific to cloud services.
Completing a risk framework prior to performing a cloud migration will help you protect your data and prevent unauthorized access.
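The domain-specific access policy described above can be sketched very simply. This is an illustrative toy, assuming a hypothetical allowed-domain list and an MFA flag supplied by your identity provider:

```python
# Hypothetical approved corporate domain; a real policy would live in
# the identity provider's configuration, not in application code.
ALLOWED_DOMAINS = {"example-corp.com"}

def may_access_sensitive_data(email: str, mfa_verified: bool) -> bool:
    """Grant access only to MFA-verified users from an approved domain."""
    domain = email.rsplit("@", 1)[-1].lower()
    return mfa_verified and domain in ALLOWED_DOMAINS

print(may_access_sensitive_data("alice@example-corp.com", True))   # True
print(may_access_sensitive_data("mallory@evil.example", True))     # False
```

Both checks are deliberately conjunctive: an approved domain without MFA, or MFA from an external domain, is still denied.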
Nikolai Tenev is a software developer and founder of DigidWorks. He has over 10 years of experience creating ERPs, CRMs, and other business apps. He is mostly interested in the ways businesses benefit from tech.
The absolute winner, when it comes to security misconceptions, is the…
“I’m on Linux, so I’m safe.”
This is a specific case of the broader, “It won’t happen to me.” A badly (or not at all) configured Linux machine poses a big threat to security. This should be avoided at all costs. Regarding the broader case: I’ve actually had clients tell me, after I let them know of a security risk in their systems, “Nah, it’s OK, don’t worry about it.” And this was a B2B SaaS company, which means it’s not only their stuff that’s at risk.
Justina Rimkeviciute is the Marketing Director at ProForecast. Passionate about forecasting, KPI integration, and her dog, Justina brings her industry knowledge and experience to the frontlines of ProForecast. She is now responsible for aligning the company’s marketing initiatives and brand awareness to its ambitious growth plans.
“We work with a lot of companies that are in the financial industry, and…”
Many in somewhat more traditional markets, like manufacturing, hospitality, and so on. We very often encounter companies that are afraid of moving to the cloud.
The common misconception about public cloud computing that we hear is that the organization must give up control over the security of its own data because cloud service providers take over those responsibilities, but that’s not true.
Data stored in the cloud is nearly always stored in an encrypted form that would need to be cracked before an intruder could read the information. And commercial cloud storage systems encode each user’s data with a specific encryption key. Without it, the files look like meaningless noise rather than readable data.
On top of that, a large part of defining compliance policies and identifying classified data remains the company’s responsibility. So companies are still responsible for a large part of security and are, in fact, adding more security to their data rather than relying completely on the cloud platform.
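The per-user encryption key idea mentioned above can be illustrated with standard-library key derivation: each user’s salt yields a distinct key, so one user’s key cannot unlock another’s data. This sketch shows only the derivation step, not the full encryption pipeline a real provider would run:

```python
import hashlib
import os

def derive_user_key(passphrase: str, salt: bytes) -> bytes:
    """Derive a 32-byte per-user key with PBKDF2-HMAC-SHA256.
    Distinct salts give distinct keys even for identical passphrases."""
    return hashlib.pbkdf2_hmac("sha256", passphrase.encode(), salt, 200_000)

# Two users (or one user on two buckets) get different salts, hence keys.
salt_a, salt_b = os.urandom(16), os.urandom(16)
key_a = derive_user_key("same passphrase", salt_a)
key_b = derive_user_key("same passphrase", salt_b)
print(key_a != key_b)  # True
```

In practice, cloud providers use hardware-backed key management rather than passphrase derivation, but the isolation property, one key per user, is the same.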
Sign up for a demo if you would like to learn more about Threat Stack’s Cloud Security Platform®, which now incorporates its unified application security monitoring solution — Threat Stack Application Security Monitoring.