Litton Power, Author at IT Business Edge
https://www.itbusinessedge.com/author/lpower/

What is Operational Analytics?
https://www.itbusinessedge.com/business-intelligence/operational-analytics/ | Mon, 20 Jun 2022

Odds are your business employs some method of operational analytics or uses another closely related method of data processing with a different name.

Whether it’s called hybrid transaction and analytics processing (HTAP), hybrid operational/analytics processing (HOAP), translytics, or continuous intelligence, what’s being described is nearly synonymous with operational analytics.

Regardless of the name, operational analytics is a business strategy of leveraging real-time information to enhance or automate decision making. It’s an attempt to replace the traditional model of forming corporate decisions around quarterly or annual reports with responsive pivots based on data as it’s processed in the present. In essence, it turns business intelligence and analytics insights into action at the application and systems level so users can put those insights to work.

Also read: The State of ITOps: Digital Transformation, Technical Debt and Budgets

How Does Operational Analytics Work?

The key to the success of operational analytics is the timeliness and freshness of data.

Fresh data comes into an enterprise through a variety of means, whether it be analytics data gathered from mobile apps, self-submitted customer feedback forms, documentation built on a collaboration platform, or customer data entered into customer relationship management (CRM) software.

As it streams in, different enterprise departments will share data more fluidly, finding value in innovative ways. For instance, the customer support desk may cross-correlate its service tickets against customer sales records and prioritize service based upon how valuable the customer is. Or product and CRM data could be combined to better target sales and marketing efforts.
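
To make that concrete, here is a minimal sketch (my own illustration, not any particular vendor’s tooling) of joining support tickets to customer sales records with pandas and ranking the queue by customer value; the column names and figures are hypothetical.

    # Minimal sketch: prioritize support tickets by customer value.
    # Table and column names are hypothetical.
    import pandas as pd

    tickets = pd.DataFrame({
        "ticket_id": [101, 102, 103],
        "customer_id": ["C1", "C2", "C3"],
        "opened_at": pd.to_datetime(["2022-06-01", "2022-06-02", "2022-06-02"]),
    })

    sales = pd.DataFrame({
        "customer_id": ["C1", "C2", "C3"],
        "trailing_12m_revenue": [120_000, 8_500, 43_000],
    })

    # Cross-correlate service tickets with sales records, then rank the queue
    # so the highest-value customers are served first.
    queue = (
        tickets.merge(sales, on="customer_id", how="left")
               .sort_values(["trailing_12m_revenue", "opened_at"],
                            ascending=[False, True])
    )
    print(queue[["ticket_id", "customer_id", "trailing_12m_revenue"]])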

Operational Analytics in Practice

Power suppliers

In some ways, the practice of operational analytics can be said to have originated in the energy sector, which processes huge volumes of analytics data and responds almost instantaneously, often with the benefit of artificial intelligence (AI).

Electricity suppliers are in a constant struggle to provide a balanced load across the energy grid, adjusting output as needed for both industrial and residential consumers.

Power consumption is gauged by the second, and as the demand goes up power plants burn hotter, boil more water, produce more steam, spin the turbines faster, and output greater amounts of electricity.

It’s a monumentally complex process that takes in data gathered across thousands of miles of infrastructure and automatically makes adjustments down to the second because even a momentary lapse in energy production is consequential.

Video game developers

Video game developers are also using operational analytics to an increasing degree, particularly as they debut actively developed products through early access programs like those on Valve’s Steam platform.

Some developers gather extensive data on player tendencies and preferences, what encounters or levels give players the most difficulty or the greatest ease, average play times, how many players actually finish the game, bugs encountered, crashes, freezes, and much more. This data is harnessed throughout the development cycle to make fixes, tweaks, buffs to weak mechanics, nerfs to overpowered ones, and so on.
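
As a rough illustration of how that telemetry might be aggregated, the sketch below rolls raw play-session events up into per-level metrics; the event fields are hypothetical, not any studio’s actual schema.

    # Hedged illustration: turn raw play-session events into per-level metrics.
    import pandas as pd

    events = pd.DataFrame({
        "player_id": [1, 1, 2, 2, 3, 3],
        "level":     ["1-1", "1-2", "1-1", "1-2", "1-1", "1-2"],
        "completed": [True, True, True, False, True, False],
        "crashed":   [False, False, False, True, False, False],
        "minutes":   [12, 30, 9, 41, 15, 38],
    })

    per_level = events.groupby("level").agg(
        completion_rate=("completed", "mean"),
        crash_rate=("crashed", "mean"),
        avg_minutes=("minutes", "mean"),
    )
    # A level with a low completion rate and a high average play time is a
    # candidate for tuning in the next early-access patch.
    print(per_level)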

This application of operational analytics has proven most valuable in competitive games, where achieving the optimal balance between characters or teams is a never-ending battle.

Online retailers

Online retailers have become one of the biggest and most controversial adopters of operational analytics strategies. Many retailers monitor every aspect of their customers’ behavior, serving product recommendations and ads tailored to their customers’ preferences.

These dynamic recommendations are powered by machine learning (ML)-enabled AI recommendation engines. Furthermore, even prices can be dynamic, fluctuating based on the geolocation of the customer’s IP address and potentially reflecting the in-store prices near the customer.
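
As a toy sketch of the item-to-item similarity idea behind many of those recommendation engines (the purchase matrix here is invented for illustration), a retailer can recommend products whose purchase patterns resemble what a customer already bought.

    # Toy item-to-item recommendation: cosine similarity over a purchase matrix.
    import numpy as np

    # Rows = customers, columns = products; 1 means the customer bought the item.
    purchases = np.array([
        [1, 1, 0, 0],
        [1, 1, 1, 0],
        [0, 0, 1, 1],
        [0, 1, 1, 1],
    ], dtype=float)

    # Cosine similarity between product columns.
    norms = np.linalg.norm(purchases, axis=0)
    similarity = (purchases.T @ purchases) / np.outer(norms, norms)
    np.fill_diagonal(similarity, 0)

    # For a customer who bought product 0, surface the most similar other product.
    print("Recommend product:", int(similarity[0].argmax()))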

At the scale of a company like Amazon or Walmart, operational analytics is a necessity when it comes to inventory management as well. These companies have warehouses, distribution centers, and even trucking companies dispersed throughout all of North America.

Each day they process millions of orders, and the geographic concentration of those orders signals where more product will need to be warehoused, more trucks supplied, and more staff assigned to pick and pack orders at the points of greatest demand.

The integration of real-time data across these organizations, even as their facilities span the continent, enables such companies to meet their rigorous supply chain demands and automatically trigger resupply orders from their partners if a shortage is anticipated.
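
A simplified sketch of the kind of rule that real-time inventory data can drive is shown below; the fields and thresholds are assumptions of mine, not any retailer’s actual system.

    # Simplified reorder rule: trigger resupply when projected demand over the
    # supplier's lead time would exhaust stock on hand.
    from dataclasses import dataclass

    @dataclass
    class WarehouseItem:
        sku: str
        on_hand: int
        daily_demand_forecast: float   # units/day from the analytics pipeline
        lead_time_days: int            # supplier delivery time

    def needs_resupply(item, safety_factor=1.25):
        projected_use = item.daily_demand_forecast * item.lead_time_days
        return item.on_hand < projected_use * safety_factor

    item = WarehouseItem(sku="SKU-42", on_hand=900,
                         daily_demand_forecast=130, lead_time_days=7)
    if needs_resupply(item):
        print(f"Trigger resupply order for {item.sku}")  # e.g., call a partner's order API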

Pitfalls of Automation

A few years ago, Amazon fully embraced its mastery of customer data, harnessing those daily analytics to produce a same-day-delivery service that would serve communities with the highest density of Amazon customers.

The expectation was that customer satisfaction and revenues would both rise while customers saved themselves a trip to the store, since the items they needed would arrive at their doorstep later that same afternoon anyway. Only through the power of data harvesting and artificial intelligence could such a strategy succeed.

However, Amazon’s AI produced an efficient same-day-delivery service map that prioritized wealthy neighborhoods and glaringly excluded poor ones. As a result, the company received backlash for what was seen as a discriminatory service map, and Amazon quietly reconsidered its approach.

Thoughtful Implementations

Operational analytics can produce highly valuable returns for a company, but it’s important to exercise human judgment and foresight before executing what might seem like a profitable idea. Actions that customers might find irritating or even offensive have a downside that might not show up in a strict data analysis.

Read next: Leveraging Conversational AI to Improve ITOps

Unifying Data Management with Data Fabrics
https://www.itbusinessedge.com/storage/data-fabrics/ | Fri, 17 Jun 2022

The concept of the data fabric emerged in 2015 from NetApp. It was redefined three years later as implementations matured. And as the rest of the data storage industry pushed its own data fabric solutions, the initial concept started to lose its original meaning.

While it is not an uncommon occurrence for emerging concepts to change during their formative development over time, the lack of clarity can create confusion for those in need of the technology. Here we’ll discuss how data fabrics are evolving – and how they can help distributed enterprises better manage their far-flung data operations.

See the Top 7 Data Management Trends to Watch in 2022

What is a Data Fabric?

In a 2018 talk, NetApp’s data fabric chief architect, Eiki Hrafnsson, outlined the Data Fabric 1.0 vision as “essentially being able to move your data anywhere; whether it’s on-prem, the enterprise data center, or to the public cloud.”

In a theatrical and entertaining tech demo, NetApp engineers debuted this technology in 2015 by rapidly transferring 10GB of encrypted data between AWS and Azure cloud, all from a simple drag-and-drop interface.

This addressed a real need for fluid data transfer between mediums, something like a storage network for the Big Data and cloud era. However, years later, this kind of performance is now generally expected, causing a shift in the development of the data fabric and what it could be used for.

According to Gartner, a data fabric is:

“ … a design concept that serves as an integrated layer (fabric) of data and connecting processes. A data fabric utilizes continuous analytics over existing, discoverable, and inferenced metadata assets to support the design, deployment, and utilization of integrated and reusable data across all environments, including hybrid and multicloud platforms.”

Comparatively, IBM defines a data fabric as:

“ … an architectural approach to simplify data access in an organization to facilitate self-service data consumption. This architecture is agnostic to data environments, processes, utility, and geography, all while integrating end-to-end data-management capabilities. A data fabric automates data discovery, governance, and consumption, enabling enterprises to use data to maximize their value chain.”

While both definitions borrow from the original concept, the idea of what a data fabric is has become more complex in order to keep up with current data trends.

Also read: Enterprise Storage Trends to Watch in 2022

Data Fabric 2.0

NetApp reassessed their idea of the data fabric in the years following its debut, redefining the concept thusly: “The NetApp Data Fabric simplifies the integration and orchestration of data for applications and analytics in clouds, across clouds, and on-premises to accelerate digital transformation.”

In other words, the scope and functionality expanded to better integrate existing enterprise applications with data sources, making the programs agnostic to the source media.

NetApp claims this fabric architecture carries numerous benefits:

  • It creates a better posture to resist vendor lock-in by liberating data and offering freedom of choice between cloud providers and on-premises infrastructure, with the ability to switch at any time.
  • It empowers data management, increases mobility by knocking down silos, facilitates cloud-based backup and recovery, and may also improve data governance, the company says.
  • Data fabrics enhance data discovery by granting full-stack visibility with their suite of visualization tools.

Other companies such as Talend have their own data fabric analytical tools, many of which extend the fabric to both internal and external consumers and contributors through the use of APIs.

Data Fabric Challenges

Most companies today house their data in multiple locations and in a variety of formats; therefore, data fabrics can’t always have access to all data. Moreover, the distributed nature of the data often leads to poor data quality, which can skew data analysis when aggregated.

According to a study in the Harvard Business Review, a mere 3% of companies’ data adhere to the study’s standard of data quality. The study also found that nearly half of all newly-created records contain a critical error.

According to Talend, creating a unified data environment can alleviate these quality control issues by giving IT greater control and flexibility over the end product. Their tools, the company says, build better data stewardship, more effective data cleansing, and better compliance and integrity through data lineage tracing.

Data Fabrics and Data Management

Tools like data fabrics can make the job of data quality control easier, but if they’re wielded incorrectly, a company may find itself spending more to compensate for flawed data or analyses.

How we interact with our data is only half the bigger picture. The other half is how we create it. Data tends to be created on the fly, and to serve a limited, time-sensitive purpose. A data fabric can help IT wrangle bad or outdated data more quickly, but ideally, we should also be mitigating these issues on the front end as data is created.
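
As a minimal sketch of that front-end mitigation, the snippet below validates records at the moment they are created, before they ever enter the fabric; the schema is an assumption for illustration only.

    # Validate records at creation time so bad data never enters the pipeline.
    import re
    from datetime import datetime

    SCHEMA = {
        "customer_id": lambda v: bool(re.fullmatch(r"C\d{6}", str(v))),
        "email":       lambda v: bool(re.fullmatch(r"[^@\s]+@[^@\s]+\.[^@\s]+", str(v))),
        "created_at":  lambda v: isinstance(v, datetime),
    }

    def validate(record):
        """Return a list of field-level problems; an empty list means the record is clean."""
        return [f"missing or invalid: {field}"
                for field, check in SCHEMA.items()
                if field not in record or not check(record[field])]

    record = {"customer_id": "C001942", "email": "jane@example.com",
              "created_at": datetime.now()}
    print(validate(record) or "record accepted")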

If you’re curious to see a demonstration of data fabric tools and how you might leverage them in your company, check out this hour-long talk from NetApp’s chief data fabric architect.

Read next: Top Big Data Storage Tools

Is 5G Enough to Boost the Metaverse?
https://www.itbusinessedge.com/development/metaverse-5g-boost/ | Mon, 18 Apr 2022
With 5G hitting the airwaves, the development of the metaverse is set for rapid growth. However, there are still hurdles to overcome.

Techno-visionaries and speculative fiction authors have long entertained the notion of a fully virtualized world—one where players can game in a realistic 3D space, hang out in virtual social spots, or even hold church services for massive congregations piping in from all across the world.

In 1992, the author of several mind-bending sci-fi novels, Neal Stephenson, gave this concept a name: Metaverse. Companies like Valve, Oculus, and now Facebook (rebranded as Meta) have chased this dream with mixed success, and in the latter’s case, some controversy.

One of the limiting factors of virtual reality’s (VR) success has been its technological maturity; however, with the recent development of 5G and the metaverse, VR seems to be following a similar path to the iPhone.

While it wasn’t the first of its kind, Apple’s flagship smartphone offered an attractive and overall useful package to consumers, making it a success, especially in its second generation with the support of the 3G cellular network. The drive toward mobile data usage and the technologies deployed by major U.S. telecommunications companies pushed smartphones into ubiquity.

Similarly, recent telecommunications technologies seem to be pushing virtual reality into rising popularity. With 5G hitting the airwaves, brand new bandwidth is opening up, leaving the telecommunications industry wondering which app will be the next to take advantage of this new capacity. Its answer is the metaverse.

Also read: What is the Metaverse and How Do Enterprises Stand to Benefit?

Virtually Everything

Verizon foresees a future where virtual reality and augmented reality (AR) are as commonplace as smartphones are now, enabled by a massive increase in data transfers from a nationwide 5G network.

As Verizon describes it, the metaverse will extend beyond gaming and open up new possibilities, such as allowing shoppers to use AR to try on a pair of virtual sneakers or test cosmetics before buying the real thing.

The trick is, they don’t just want to deliver the experience; they want to sell the experience, too. As such, Verizon is putting real money behind this, launching metaverse experiences such as a fully virtualized Super Bowl.

And they aren’t alone. China Mobile kicked off its Mobile Cloud VR last year, which is a virtual socialization and shopping app supported by 5G. In addition, SK Telecom recently launched its own metaverse platform.

These companies saw the profits Apple and Google reaped by leveraging 3G and 4G advancements, and they seek to get ahead of everyone else by planting their flags in the VR/AR space with their own apps.

How to Experience the Metaverse

High-quality virtual reality and augmented reality experiences can be had right now, but they come with significant limitations. An Oculus Quest 2 is a powerful device that costs less than what people pay for cell phones every year, but all that hardware is packed into an awkward, weighty package that can cause discomfort during prolonged play sessions.

The ill-fated Google Glass promised to bring maps, your calendar, the weather, and a host of other augmented reality services right before your eyes wherever you go. Despite an interesting premise, the product never found its footing, though Google hasn’t given up on it yet.

The right form factor for experiencing a metaverse has yet to emerge, it seems.

Also read: The Metaverse: Catching the Next Internet-Like Wave

What’s the Real Vision Here?

Nevertheless, 5G providers like SK Telecom remain optimistic. The company’s vice president Cho Ik-hwan has even commented that the metaverse will become its core business platform as the company develops first-party applications meant to occupy what it sees as a wide-open space.

“We want to create a new kind of economic system,” said Cho. “A very giant, very virtual economic system.”

It’s unclear how SK Telecom will achieve that goal. At present, the company and others like it are investing in the development of VR/AR smartphone apps, but a cell phone with a 6-inch display doesn’t seem like an attractive form factor for a transcendent metaverse adventure.

Further, the concept of a metaverse is still vague and formative, and even a $10 billion investment from Facebook has yet to give it focus or profit.

Similarly, Verizon’s approach seems unfocused, even self-contradictory. The company promises its metaverse experience will be without limits in a sentence immediately following a statement that you will “be required to abide by rules and regulations just like you would in the real world.”

That type of thinking exposes the real challenge the telecommunications industry faces on this frontier. In this endeavor, these companies are stepping well outside their existing business model of steadily building infrastructure and entering a field that demands artistic creativity and dynamism.

That field is more compatible with the “move fast and break things” mentality of Silicon Valley, and even Facebook is fighting an uphill battle.

Read next: Emerging Technologies are Exciting Digital Transformation Push

What is Open RAN?
https://www.itbusinessedge.com/networking/what-is-open-ran/ | Mon, 04 Apr 2022
The Open Radio Access Network is an effort by telecoms to break vendor lock-in as 5G's rollout continues. Here is what that means for enterprises.

There’s been a quiet revolution unfolding across the global cellular network as fourth-generation technologies give way to the fifth. 5G transmissions run two orders of magnitude faster than their 4G predecessors, but a lot of infrastructure must be put into place to support the increased transfer of data.

While telecoms are upgrading cell towers and other components of their networks, they have been presented with an opportunity to change some of the old ways of doing business in the hopes of reducing costs and driving innovation. From this wellspring emerged the Open Radio Access Network (Open RAN) movement, an effort by telecoms to break free of vendor lock-in, and allow more hardware providers to participate in an expensive market with very few players.

Market Drive for Open RAN

Under the traditional model, telecommunications equipment vendors made up a small crowd, composed almost exclusively of Nokia, Ericsson, and Huawei. Their hardware is proprietary, secretive, and not interoperable with competitors’ equipment. Telecoms get locked into a vendor ecosystem and are forced to pay high prices as a consequence of that exclusivity. Switching ecosystems is similarly expensive, with similarly high prices demanded by the competition. This locked and controlled marketplace has left the telecoms seeking an alternative, and the demands of 5G networks have only added pressure to the problem.

The answer to this is Open RAN. Members of the industry united in their efforts to develop new standards and specifications that would allow any hardware or software vendor to participate, without the barrier of the proprietary standards that exist today. Imagine building a computer where every component must come from the same manufacturer. When it comes time to upgrade the RAM or CPU, there may be some attractive options out there, but you’re forced to purchase from that single provider. Under the Open RAN model, you can purchase hardware from whomever you please, and the broader system still works.

Also read: Edge Computing Set to Explode Alongside Rise of 5G

The Origins of Open RAN

Many of the formative moments in the Open RAN movement can be traced back to early 2018, when Samsung announced its new partnership with Verizon. Samsung had just become a supplier for the telecom’s network hardware, entering into that once tightly controlled field. Verizon had been seeking new, more open protocols as opposed to the proprietary ones that vendors like Samsung could previously not interact with. Simultaneously, competitors such as AT&T, Deutsche Telekom, and China Mobile formed the oRAN Alliance, founded on three principles:

  • Lead the industry toward open, interoperable interfaces
  • Minimize the use of proprietary hardware while maximizing off-the-shelf alternatives
  • Develop and standardize APIs

“To take full advantage of the flexibility of 5G, we have to go beyond the new radios and change the overall architecture of the end-to-end system,” AT&T’s chief technology officer said at the time. “Open modularity, intelligent software-defined networks, and virtualization will be essential to deliver agile services to our customers. ORAN will accelerate industry progress in these areas.”

A Boost from DoD

In 2021, a US Department of Defense initiative encouraged industry to adopt greater openness in communications hardware and software. Fast and fluid communications are essential to lifting the fog of war in modern combat theaters, and setting up a communications network is a top priority whenever the military is deployed into a new area. In an effort to strengthen its own communications backbone, DoD, in conjunction with the National Telecommunications and Information Administration, issued a notice of inquiry to telecoms, requesting that the members of industry seek ways to open up the 5G stack ecosystem, speed up innovation, and increase interoperability. Most importantly, the military wanted ways to choose from a broader pool of hardware vendors to increase its data processing abilities at the edge and in the field.

Open RAN: Boon or Bust?

Optimists within the telecom industry hope that by allowing more hardware vendors to participate, the industry at large is better posturing itself for future upgrades, while also lowering the costs of installing or maintaining infrastructure. Radio networks are multi-billion dollar investments, and even minute cost savings on a single component can have ripple effects throughout the network. 

That notion, however, has been met with skepticism, as early adopters have yet to realize some of the profit potential they foresaw. Industry fragmentation is another challenge that telecoms are navigating. There are many proponents of Open RAN, and not all of them are developing the same standards.

While the Open RAN model has existed in theory for many years, its implementation is still in its infancy. There are still open questions as to how disruptive it will be to the telecom industry, but major carriers such as AT&T and Verizon remain optimistic. Moreover, even one of the proprietary telecom hardware vendors, Nokia, has joined the fray, working alongside the oRAN Alliance—a sign that even the old guard can sense the changing winds.

Read next: 5G Cybersecurity Risks and How to Address Them

8 API Security Measures to Implement Now
https://www.itbusinessedge.com/applications/api-security-measures/ | Fri, 25 Mar 2022
Application program interfaces form bridges between applications. Here is how to safeguard them from cyber attacks.

Application program interfaces (APIs) form bridges between applications, enabling programs to talk to each other across differing code bases and hardware. But in the wrong hands, APIs can inflict potentially massive damage.

Enterprise applications form bigger and bigger attack surfaces, but often it’s the APIs where the real vulnerabilities lie. While many attacks may be detected and thwarted through standard firewalls and SIEM tools, attacks through APIs move more stealthily, as they often leverage the access privileges the API already allows. These vulnerabilities extend well beyond the enterprise realm and can even affect your personal vehicle. So ubiquitous are insecure APIs that they’re even being used to hack Teslas.

Adopt a more rigorous security posture with your APIs by implementing the following strategies.

1. Build For Future Users, Not Present Ones

When APIs are in their infancy, they are often designed to satisfy the needs of a small team of developers working together. These developers know each other, maybe even share an office space, and may feel little need to implement authentication protocols to establish that everyone is who they claim to be. Why should they? Before long, a particularly useful API finds its way out of the team, and it crawls its way to a broader network of users than was originally intended. The appropriate security measures should be in place before the genie gets out of the bottle, rather than long after. 

2. Limit Users

Speaking of future users, plan for many but control for fewer, if possible. Authorize access on a strictly need-to-know basis. More users means a greater attack surface, particularly if privileges aren’t clearly and thoroughly defined. 

3. Limit Data

The Equifax data breach represents the sum of all fears, as the company housed private financial information for nearly 150 million Americans. Fortunately, not every company’s business model necessitates the collection of social security numbers, driver’s licenses, addresses, and so on. Narrowly tailor data collection so only the most necessary data is required. Uncollected data is safeguarded.

4. Encrypt Data

Ensure that communications pathways are using the appropriate encryption protocols such as SSL or TLS. Similarly, data at rest should be encrypted. This may seem like obvious advice, but all too often a data breach occurs because accounts and passwords were stored in plain text. Simply having encryption isn’t enough; it also has to be used correctly. Some protocols such as TLS allow encryption verification to be disabled on the server or the client side, leaving internet traffic exposed to interception. Ensure that APIs conform to the latest security best practices so communications remain safe and secure.
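
As a brief sketch of that last point, the snippet below calls an API over TLS with certificate verification left on, which is the default in Python’s requests library; the endpoint is a public test service used purely for illustration.

    # Keep certificate verification enabled when calling an API over TLS.
    import requests

    API_URL = "https://httpbin.org/get"   # placeholder endpoint for illustration

    # verify defaults to True; never pass verify=False in production code, since
    # that silently accepts any certificate and invites interception.
    response = requests.get(API_URL, timeout=10)
    response.raise_for_status()
    print(response.json())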

Read more about API security: 7 Trends in Network Management APIs

5. Enact Pagination Limits

Without proper API pagination, server queries can return one result or one hundred billion. The latter scenario would quickly devour system resources and bring applications to a halt. Even worse, it doesn’t require a malicious actor to cause harm—an innocent user might frame a query too loosely and receive a staggering response. Fortunately, pagination is easy to implement. The easiest form is offset pagination, which provides users with a predefined window of records they can retrieve. Other forms of pagination include keyset and seek, which have their own benefits and disadvantages.
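
Here is a minimal sketch of offset pagination using an in-memory SQLite table; the table, columns, and page size are illustrative assumptions.

    # Offset pagination: the server enforces a hard cap on rows per response.
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, item TEXT)")
    conn.executemany("INSERT INTO orders (item) VALUES (?)",
                     [(f"item-{i}",) for i in range(250)])

    PAGE_SIZE = 50   # a query can never return more than this many rows

    def fetch_page(page):
        offset = page * PAGE_SIZE
        return conn.execute(
            "SELECT id, item FROM orders ORDER BY id LIMIT ? OFFSET ?",
            (PAGE_SIZE, offset),
        ).fetchall()

    print(len(fetch_page(0)), "rows on page 0")   # 50, no matter how broad the query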

6. Use Prepared Statements in SQL Queries

SQL code injections are incredibly prevalent attacks, giving attackers the ability to pose as other users, damage databases, or steal data. As is implied by the name, the attacker sneaks SQL code into a database query, often through the abuse of escape characters that a properly configured server should have filtered out. Prepared statements inhibit an attacker’s ability to inject SQL code by blocking them with placeholders that are only able to store specific values, and not SQL fragments. Another method of preventing SQL injections is to ensure data inputs match what is expected. For instance, phone numbers should register as integers and not contain strings. Names should contain letters but not numbers.
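
The snippet below sketches the difference between string-built SQL and a parameterized (prepared) query, using SQLite’s placeholder syntax; the table and data are invented for illustration.

    # Parameterized queries keep user input as data, never as SQL.
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE users (name TEXT, phone INTEGER)")
    conn.execute("INSERT INTO users VALUES ('alice', 5551234)")

    user_input = "alice' OR '1'='1"   # a classic injection attempt

    # Unsafe: the attacker's quotes would become part of the SQL statement.
    # query = f"SELECT * FROM users WHERE name = '{user_input}'"

    # Safe: the placeholder stores user_input as a value, not as SQL.
    rows = conn.execute("SELECT * FROM users WHERE name = ?", (user_input,)).fetchall()
    print(rows)   # [] -- the injection string matches no user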

7. Strengthen End User and Application Authentication

For users accessing applications, implement routine password reset policies in accordance with the latest security best practices. For the applications themselves that interact with APIs, use unique credentialing for each version of the application, making it easier to root out out-of-date versions.

8. Impose Rate Limits

Brute force attacks happen when an attacker sends high volumes of login credentials to a server in an effort to make a successful match through sheer chance. A basic rate limit can thwart these attacks by capping how many requests a client can make within a reasonable time frame. Would a human being be capable of entering their password four hundred times in a minute? Likely not. So why would an API accept such an unreasonably high number of attempts?
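
As a toy illustration, the sliding-window limiter below caps login attempts per client; the window length and attempt budget are arbitrary assumptions.

    # Sliding-window rate limiter: reject login attempts beyond a per-minute budget.
    import time
    from collections import defaultdict, deque

    WINDOW_SECONDS = 60
    MAX_ATTEMPTS = 5
    attempts = defaultdict(deque)   # client id -> timestamps of recent attempts

    def allow_login_attempt(client_id):
        now = time.monotonic()
        window = attempts[client_id]
        while window and now - window[0] > WINDOW_SECONDS:
            window.popleft()              # drop attempts older than the window
        if len(window) >= MAX_ATTEMPTS:
            return False                  # over budget: reject or add a delay
        window.append(now)
        return True

    for i in range(7):
        print(i, allow_login_attempt("203.0.113.7"))   # the last two attempts are refused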

Managing Risk

Security is the art of managing risk, not eliminating it. No fortress is impregnable, but attackers tend to follow the path of least resistance and target victims with poor security standards. Tighten your API security, and be the target attackers know to avoid.

Read next: Application Security Code Reviews: Best Practices

What is Cybersecurity Mesh Architecture?
https://www.itbusinessedge.com/security/cybersecurity-mesh/ | Tue, 22 Mar 2022
The CSMA model reins in threats through a more holistic, collaborative focus on security. Here is how this security approach works.

When Gartner predicts that “Cybersecurity Mesh Architecture (CSMA)” will be one of the top security and risk management trends of both last year and this one, that news might come as a surprise to those of us who have never heard of it. Indeed, the term seems to have been coined by Gartner in an effort to describe a cybersecurity architecture that, in the firm’s own words, can reduce the cost of security incidents by roughly 90% over the next couple of years. That’s a bold claim, so how does the firm back it up?

What is Cybersecurity Mesh? 

CSMA is essentially a set of recommendations issued under the governing philosophy that security tools should play nice together. Gartner has identified a growing gap of interoperability between security tools, as well as significant, wasteful overlaps in what multiple tools—each being paid for through their own licensing—seek to achieve. Under the framework of a cybersecurity mesh, each tool will be introduced into the IT infrastructure as an integrated, carefully planned out part of a greater whole. 

Also read: Best Managed Security Service Providers (MSSPs) 2022

The Perfect Storm of Cyberattacks 

In a recent report, Gartner analysts predict a “perfect storm” for cyberattacks in the coming years, instigated largely by three primary challenges to the present enterprise security landscape:

  • Cyber attacks and cyber defenses are asymmetrical in nature. While attackers pursue vectors outside of a silo, organizational security is often siloed. Security tools often don’t run in concert with other tools, leaving weak spots open to exploitation.
  • The defensive perimeter has become substantially fragmented, with the increase in remote work and prevalence of stray devices. Data is less centrally located, leaving the traditional perimeter of network security somewhat akin to the French Maginot Line: a powerful fortification that was easily sidestepped by invaders.
  • Multicloud computing environments demand a more consolidated security approach. Often, different cloud providers will establish their own security policies, resulting in inconsistent enforcement of standards.

The report continues to assess the modern digital landscape, criticizing the overly fragmented nature of existing security architectures. The spread of digital devices across an increasingly thin hybrid cloud has done more than strain legacy security tools; it has also placed a growing burden on computing resources. Multiple poorly implemented tools may overlap in responsibilities across multiple and sometimes redundant dashboards, administration points, and ad hoc integrations.

There’s some truth to those claims, according to a 2020 industry survey sponsored by IBM, which found that organizations on average enlisted 45 security tools, and respondents sought to dramatically reduce that number. 

In view of these challenges, Gartner developed the CSMA model to rein in threats through a more holistic, collaborative focus on security.

The Cybersecurity Mesh Architecture Approach

Gartner describes CSMA as “a composable and scalable approach to extending security controls, even to widely distributed assets.” Their proposed model is geared toward hybrid and multicloud environments accessed by a wide range of devices and applications. In short, they envision the implementation of security tools with high degrees of interoperability, running through four supportive layers that facilitate collaboration between security controls. Their four proposed layers consist of:

  • Security Analytics and Intelligence: Processes data from past cybersecurity attacks to inform future action and trigger responses.
  • Distributed Identity Fabric: Decentralized identity management and directory services.
  • Consolidated Policy and Posture Management: Integrates individual security tool policies into a greater unified whole.
  • Consolidated Dashboards: Single pane management of the security ecosystem.

Gartner makes some additional recommendations to better integrate security frameworks:

  • Select security tools on the basis of interoperability, and invest in developing a common framework.
  • Select vendors with open policy frameworks so policy decisions can be delegated from outside the tool.
  • Select aggressive, forward-thinking vendors.
  • Adopt multi-factor authentication and zero-trust architecture.
  • Transition away from VPNs and adopt zero-trust, cloud-based access management.

Single or Primary Vendor Security

Many of the concepts advanced under the label “Cybersecurity Mesh Architecture” can largely be distilled into an otherwise simple solution: single or primary vendor security. If security tools are failing to work in concert, then it may be time to pursue consolidation to a security stack from a sizable vendor such as IBM or Symantec. In Gartner’s own report on CSMA, the company cites positive outcomes from this approach, such as an improved dashboard integration and reductions in licensing costs. 

There will still be a need to adopt specific out-of-vendor tools to fill niche roles, and under the guidance of Gartner’s CSMA report, those tools should be carefully integrated into the existing security stack using open standards or APIs. 

Read next: Top Cybersecurity Companies & Service Providers 2022

Enabling Data Security with Homomorphic Encryption
https://www.itbusinessedge.com/security/data-security-homomorphic-encryption/ | Fri, 25 Feb 2022
Homomorphic encryption enables users to edit data without decrypting it. Here is why it is showing promise for securing Big Data.

Regardless of the strength of data’s encryption, more and more potential vulnerabilities surface in data security as more people are granted access to sensitive information. However, a relatively new encryption protocol poses a unique solution to these types of mounting privacy exposures.

Homomorphic encryption enables users to edit data without decrypting it, meaning the broader dataset is kept private even as it is being written. The technology may not be an ideal solution for everyone, but it does have significant promise for companies looking to protect huge troves of private data.

How Homomorphic Encryption Works

The first fully homomorphic encryption scheme was proposed in 2009 by Craig Gentry, then a graduate student, who described his concept through an analogy of a jewelry store owner.

Alice, the owner, has a lockbox with expensive gems to which she alone has the key. When Alice wants new jewelry made from the gems, her employees wear special gloves that allow them to reach into the closed box and craft the jewelry using the gems without being able to pull them out of the box. When their work is done, Alice uses her key to open the box and withdraw the finished product.

In a conventional encryption model, data must be downloaded from its cloud location, decrypted, read or edited, re-encrypted, and then reuploaded. As files expand into the gigabyte or petabyte scale, these tasks can become increasingly burdensome, and they expose the greater dataset to wandering eyes.

By contrast, data that is encrypted homomorphically can have limited operations performed on it while it’s still on the server, no decryption necessary. Then, the final encrypted product is sent to the user, who uses their key to decrypt the message. This is similar to end-to-end encryption: only the receiver can access the decrypted message.
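
To make the idea concrete, here is a toy, deliberately insecure sketch of the Paillier scheme, a partially homomorphic (additive) cipher: two ciphertexts can be combined on the server without ever being decrypted, and only the key holder recovers the sum. The key sizes below are far too small for real use.

    # Toy Paillier demo: addition performed directly on ciphertexts.
    import math, random

    p, q = 293, 433                      # demo primes; real keys use 2048+ bit primes
    n, n2 = p * q, (p * q) ** 2
    g = n + 1
    lam = (p - 1) * (q - 1) // math.gcd(p - 1, q - 1)   # lcm(p-1, q-1)

    def L(x):                            # Paillier's L function
        return (x - 1) // n

    mu = pow(L(pow(g, lam, n2)), -1, n)

    def encrypt(m):
        r = random.randrange(1, n)
        while math.gcd(r, n) != 1:
            r = random.randrange(1, n)
        return (pow(g, m, n2) * pow(r, n, n2)) % n2

    def decrypt(c):
        return (L(pow(c, lam, n2)) * mu) % n

    c1, c2 = encrypt(41), encrypt(17)
    c_sum = (c1 * c2) % n2               # the server only ever touches ciphertexts
    print(decrypt(c_sum))                # 58 -- the key holder recovers the sum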

Also read: Data Security: Tokenization vs. Encryption

Use Cases for Homomorphic Encryption

AI-driven healthcare analytics have come a long way in recent years, with AI being able to predict disease and other health risks from large sets of medical data.

Today, services like 23andMe allow customers to hand over sensitive medical information for genetic testing and ancestry information. But these companies have been hit with accusations of selling this personal information or providing it to third parties such as the government, without customer knowledge or consent.

If that data was protected through homomorphic encryption, the company would still be able to process the data and return its results to the customer, but at all times that information would be completely useless until it is decrypted by the customer, keeping his or her information entirely confidential.

Within the last two years, Microsoft, Google, and many of the other largest names in tech have been investing in developing the technology, even freely offering their open-source implementations.

In the case of Google, the company may be pursuing the technology as a means of complying with privacy regulations such as the European GDPR. With homomorphic encryption, Google could continue to build an ad profile, based on large volumes of personal data that it collects through various means, and compile it into an encrypted database with limited usage or applications that only the end user might experience.

For instance, a user may search Google for restaurants near them. The query would hit the homomorphic black box, privately process the user’s preferences and location, and return tailored results.

Types of Homomorphic Encryption

There are three common iterations of this technology, and one size does not fit all.

  • Partially homomorphic encryption (PHE): Supports a single type of mathematical operation (for example, addition or multiplication) on encrypted data, an unlimited number of times
  • Somewhat homomorphic encryption (SHE): Supports more than one type of operation, but only a limited number of times
  • Fully homomorphic encryption (FHE): Supports arbitrary operations, performed an unlimited number of times. While most desirable, FHE incurs significant hits to system performance.

The Limitations of Homomorphic Encryption

Homomorphic encryption has yet to see widespread adoption. However, it’s not uncommon for encryption protocols to spend a decade in development.

There are community standards that need to be established. Public confidence that the technology is safe, secure, solid, and not exploitable needs to be reached. APIs need to be implemented. And lastly, perhaps the biggest hurdle for homomorphic encryption is that the technology needs to perform well.

No one wants to adopt a more secure protocol only to discover that system performance has taken a massive hit. From an end-user standpoint, that will feel more like a massive setback than a step forward. While the protocol has become massively more efficient since its inception in 2009, it still lags behind today’s conventional encryption methods, particularly as users move from PHE to SHE to FHE.

While the computational overhead is too large for many businesses that don’t need the added security, homomorphic encryption may yet become the go-to standard for sensitive industries like finance and healthcare.

Read next: Best Encryption Software & Tools

Has Remote Work Really Benefited Enterprises?
https://www.itbusinessedge.com/business-intelligence/has-remote-work-really-benefited-enterprises/ | Thu, 10 Feb 2022
With the rapid shift to remote and hybrid work, business leaders are wondering if these changes are for the long-term good.

As we drifted into the tumultuous events of 2020, I spoke with a former boss of mine, one who is a level-headed thinker with a gift for clarity under pressure. Like many business owners, she was scrambling to find footing in a rapidly shifting landscape. Ultimately, she took some precautionary measures that will now sound familiar: she closed the office and sent her employees home to work remotely, perhaps permanently. 

While the change was stark for many, there were others who were already working long distance. In some ways, the COVID-19 pandemic was the catalyst for a change in business that was already going to happen, just on an otherwise much longer time scale.

Many companies had already incorporated policies for work/life balance and flexible hours, shifting closer to remote or hybrid work. However, with how quickly change to remote and hybrid work has come, business leaders are left wondering if these changes were a good thing, or if there should be a shift back to more traditional work environments.

The Convenience of Flexibility

Making an hour drive for a monthly client meeting never looked so inefficient as it does now. Even if the whole trip takes three or four hours, that’s half the day gone, with likely a reduced willingness to engage with new tasks for the latter half of the workday.

Just the routine of commuting to and from the office could take over an hour a day for the average American in the pre-pandemic world. Remote work has given employees back their time, but how those gains translate into increased productivity is still a matter of debate.

Managers are still split on the question of whether remote work has improved or decreased employee performance, though they overwhelmingly see the remote work transition as a success that will likely stick around in some form for a while. Employees, on the other hand, find themselves more productive or about the same when working from home, according to a 2020–21 survey from FlexJobs.

More than half the employees in that survey expressed their preference for remote work, finding gratification in no longer having to commute and in increased cost savings and professional development opportunities.

Today, remote or hybrid work has become a new bargaining chip for talented prospective employees, with 58% of FlexJobs’ survey respondents claiming they would look for work elsewhere if they could not continue remote work. Even employers who desire a full return to the office recognize this reality in the competitive hiring landscape.

Also read: Eight Best Practices for Securing Long-Term Remote Work

The Price of Flexibility

While many employees function better independently, some fare better in the office than at home, where they are surrounded by distractions. For example, project managers or team leaders working under deadlines can become frustrated with team members who neglect answering emails about important deliverables, with no option to visit their office in person for a status update.

Under the remote work paradigm, the expectation of an expedient reply to a text or email has become a point of contention, one that organizations will have to delicately renegotiate in the years to come.

Further, remote work has created some ambiguities in terms of when an employee can feel comfortable “switching off” for the evening or weekends. Those lines get blurred when an employee is already expected to be attached to their phone or email, and suddenly, the value of “work/life balance” loses its meaning.

Social capital is also sacrificed when teams are highly distributed. A synergistic, well-organized team is a beautiful thing, but those take time to build, and most of that time requires face-to-face interaction.

Zoom may be an upgraded version of the conference call, but it often reveals what we’ve always known: People sitting on conference calls have divided attention, and even the engaged speakers struggle against millisecond time lags and imperfect audio quality. Meeting a client or a coworker in person offers a sense of connection and often invigorates creative energies that just don’t arise from the other end of a computer monitor.

Lastly, the at-home employee may be highly talented and productive, but their efforts may elude the recognition of management; they’re simply out of sight and out of mind. Companies that are driven to promote from within and foster employee skill sets will have to reinvent their approach to these goals, because the old approaches simply don’t translate to teleworking.

Striking the Right Balance

Very few employers foresee a future where the office is a relic, and some expect the five-day in-office workweek to return in full force. Humans are too hard-wired for face-to-face social interactions, but we do value our independence and our free time.

For most employers, the model moving forward is going to be a hybrid of in-office and at-home, in the hopes of taking the best from both worlds. Surveys show a plurality in favor of keeping employees in the office three days a week, but there’s no consensus yet on the best path forward.

What does seem clear is that managers need to spend time with new and inexperienced workers, and the same is true in reverse if those employees want to stand out as fresh company talent. Regardless of the benefits or drawbacks of remote work, it will take time as business leaders and employees alike continue to find their footing in a landscape that hasn’t stopped moving.

Read next: How to Protect Endpoints While Your Employees Work Remotely

Finding Value in Robotic Data Automation
https://www.itbusinessedge.com/data-center/finding-value-in-robotic-data-automation/ | Fri, 28 Jan 2022
Extracting value from raw data is growing more difficult. Companies are employing RDA to automate stopping points along the data pipeline.

Data is the new oil, some say, forming a coveted resource that powers enterprise decision-making. However, data in its raw form isn’t good for much. It needs to be extracted, refined, and processed—its constituents funneled into various byproducts through pipelines that range from source to refinery to end consumer.

Every bottleneck in that system carries a dollar cost. Improperly analyzed data ends up as little more than a waste product, and as datasets grow, extracting the most valuable information to funnel downstream becomes an increasingly burdensome task.

In recognition of this challenge, a handful of companies have sought to automate stopping points along the data pipeline, a process called Robotic Data Automation, or RDA.

Data Wrangling

Enterprise datasets aren’t just growing; in many cases they’re also becoming real-time. These sets are embodied in a variety of formats and spread across a company’s sprawling IT infrastructure—including on-premises servers, off-premises clouds, and along the edge.

They require collection, cleanup, validation, extraction, metadata enrichment—an extensive series of steps just to get the data prepped for its intended use. Every step can be time-intensive, and failure at any step can result in invalid outputs. 

RDA aims to automate many of these processes using low-code bots that perform simple, repetitive tasks, with linkages to more complex artificial intelligence (AI) tools, such as IBM Watson or OpenAI’s GPT-3, or to hundreds of other bots, to execute natural-language processing (NLP) tasks when necessary.

Effectively, a simple machine is designed to cobble together disparate elements, calling on more sophisticated machines when they’re needed, in order to compile raw data into something usable. If executed correctly, automation can help enterprises realize the value of information far more quickly.
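
As a hedged sketch of that pipeline idea (none of this reflects a specific vendor’s RDA product), the snippet below chains small single-purpose “bots,” with a hook where a heavier AI or NLP service could be called.

    # Small single-purpose "bots" chained into a data-prep pipeline.
    from functools import reduce

    def collect(_):
        # Stand-in for pulling raw records from a source system.
        return [{"host": "web-01", "msg": " Disk at 91% ", "ts": "2022-01-28T10:02:00Z"},
                {"host": "web-02", "msg": "", "ts": "2022-01-28T10:02:05Z"}]

    def clean(records):
        # Trim whitespace and drop empty messages.
        return [{**r, "msg": r["msg"].strip()} for r in records if r["msg"].strip()]

    def enrich(records):
        # A real pipeline might call out to an NLP model here to classify "msg".
        return [{**r, "severity": "warning" if "%" in r["msg"] else "info"} for r in records]

    def publish(records):
        for r in records:
            print(r)          # stand-in for writing to a downstream dataset
        return records

    pipeline = [collect, clean, enrich, publish]
    reduce(lambda data, step: step(data), pipeline, None)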

RDA tools can also help break up the existing paradigm of data handling, whereby AIOps vendors offer limited, pre-defined sets of tools for customers to interact with their data. These tool sets have limited linkages with other tools, narrower scopes of use cases, and more restrictive data formatting outputs.

Companies like CloudFabrix, Snowflake, and Dremio claim their RDA tools liberate customers from these constraints and include other benefits, such as synthetic data generation; on-the-fly data integrity checks; native AI and machine learning (ML) bots; inline data mapping; and data masking, redaction, and encryption.

Other use cases for RDA tools include:

  • Anomaly Detection: Pulling data from a monitoring tool, comparing historical CPU usage data for a node, then using regression to construct a model that can be sent as an attachment (a rough sketch follows this list)
  • Ticket Clustering: Compiling tickets from a company’s ticket management software, clustering them together, and then pushing the output into a new dataset for visualization on a dashboard of choice
  • Change Detection: Examining virtual machines (VMs) and comparing their current states against baselines to detect unplanned changes
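
Here is a rough sketch of the anomaly-detection use case above, with synthetic CPU data and assumptions of my own: fit a simple regression to a node’s historical usage and flag readings that deviate sharply from the trend.

    # Fit a trend to historical CPU usage and flag large deviations.
    import numpy as np

    hours = np.arange(24)
    cpu = 35 + 0.5 * hours + np.random.default_rng(0).normal(0, 2, 24)
    cpu[18] = 95                                   # an injected spike to detect

    slope, intercept = np.polyfit(hours, cpu, 1)   # the "model" a bot could attach
    expected = slope * hours + intercept
    residuals = cpu - expected
    threshold = 3 * residuals.std()

    anomalies = hours[np.abs(residuals) > threshold]
    print("anomalous hours:", anomalies)           # -> [18]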

RDA vs. RPA

Many will be familiar with robotic process automation, or RPA. The older concept carries similarities with RDA in that both aim to simplify common tasks through the use of low-code bots. Where they diverge is that RPA is intended for simplifying common user tasks and workflows, whereas RDA is aimed squarely at the data pipeline.

Ultimately, both RDA and RPA mean using simple bots to save time on menial, time-consuming tasks, though in different contexts.

A common example of RPA is a bot empowered with ML capabilities for form completion. The bot monitors how a human repeatedly fills a form until the RPA is trained on the appropriate manner in which the form is to be completed. This type of machine learning is similar to how cellphones can generate predictive text suggestions based on their users’ conversational habits and vocabulary.

Once trained, the bot can take command of form completion, along with other aspects such as submitting the form to its expected targets. While this can expedite the process in the long run, RPA systems can take months to train before their advantages come to fruition.

Also read: Top RPA Tools 2021: Robotic Process Automation Software

RDA’s Long Term Value

There’s always going to be value in automating time-intensive tasks and freeing up human labor for jobs that are more cognitively demanding. As one bottleneck is opened, another will come to take its place. However, the success of systems like RDA or RPA hinges on their implementation.

Naturally, the tools need to be designed properly to interact with their intended datasets, but enterprises also have a responsibility to properly integrate new tools with their existing data pipelines. AI-driven tools and automation software are still in their infancy, still finding new niches to serve, and still being refined in terms of how they deliver service. How RDA shakes up data pipelines is a story yet to be told.

Read next: 6 Ways Your Business Can Benefit from DataOps

Bringing Data Democratization to Your Business
https://www.itbusinessedge.com/database/bringing-data-democratization-to-your-business/ | Fri, 14 Jan 2022
Democratization solves scalability issues that come with IT’s management of an ever-growing information database.

While the age of cloud storage has removed the physical barriers between raw data and consumers of information, it hasn’t removed the organizational hurdles. Many believe it’s time for that to change. 

Businesses store increasingly unwieldy volumes of information, much of which can be out of date, inaccurate, or difficult for a layman to understand. The traditional response to managing this mess has been to gatekeep information behind the IT department, where it can be retrieved and processed by request before delivery to its end consumer. 

In recent years, a growing number of organizations have sought to break that model, putting data immediately in the hands of many instead of the control of few. This process, called “data democratization,” seeks to eliminate bottlenecks and facilitate faster decision-making, but its success hinges upon its implementation.

The Benefits of a Healthy Democracy

For decades, company information wasn’t accessible at your fingertips; instead, it was managed by one or more data teams. Under that model, the consumer identifies a need for data and sends their request to IT. The IT team then plans the scope of data retrieval, processes the raw information into a more readily-digestible report, then delivers the end product. Often, if the initial request for data is broad in scope, the response time can be burdensome, particularly if the data team has received a large volume of requests in a short time period.

The lag in request fulfillment can be a hindrance to the execution of data-driven corporate strategies. Data democratization can expedite this process, giving customers control over what data they retrieve, and also giving them full access to raw information rather than a processed product. By empowering consumers with this level of access, the organization at large becomes more nimble and more competitive, assuming all goes well. 

Democratization also solves scalability issues that come with IT’s management of an ever-growing information database. Advocates for democratization also believe that putting information freely in the hands of employees with diverse expertise will result in the organic creation of novel business strategies, creative modeling solutions, and greater data insights.

Also read: 6 Ways Your Business Can Benefit from DataOps

Drawbacks and Challenges

Data democratization requires careful implementation to avoid or mitigate its pitfalls. For instance, if not handled properly, free and open access to information can threaten data integrity and make maintenance roles ambiguous. Proactive steps will need to be taken to establish who is responsible for preserving or modifying data.

A more human concern—and one that pervades all aspects of our lives—is the reluctance of subject matter experts to relinquish complex information to a general population that lacks the training, experience, or context to properly interpret it. Conversely, many end users may actually prefer to receive heavily condensed and processed information rather than sifting through raw data.

The internet age has, in many ways, democratized data across the world, and the results of that experiment are still pending. However, in an enterprise context, measures can be taken to better posture information consumers for success.

Also read: Best Data Analytics Tools for Analyzing & Presenting Data

Implementation Strategies

There are many methods of democratizing data, most of which are not mutually exclusive. The right mix varies from company to company.

  • Enact strong information governance: Before data can be democratized, guidelines should be established dictating who is responsible for data curation and upkeep.
  • Employee education: At a societal level, education is one of the central pillars holding up healthy democracies. The same is true in the enterprise context, where employees and stakeholders need to understand data access procedures, how the information is to be utilized, and how to use the tools available to them. Just as important, employees should understand the value of these tools in the execution of their regular tasks. For example, Airbnb has created what it calls its “Data University,” which educates its employees on statistics and analysis, problem solving with data, writing SQL, and data visualization. The result empowered nearly half its employees to become weekly regular users of its self-service platform.
  • Embed data analysts: Each team can be assigned its own data analyst, dedicated to the cultivation of information relevant to the team, and elimination of irrelevant data. Furthermore, this analyst can train team members to self-serve.
  • Self-service dashboards: Data analysts can train team members to operate a company’s metadata search engine, using metadata management products such as Google’s Data Catalog or the more popular self-service analytics tool, Power BI from Microsoft. Many platforms are designed to aggregate data into intuitive reports on the fly, making information even more accessible to end users.
  • Clearly defined metadata: Healthy, well-structured metadata makes for better search results.

Culture is Key

You can pick all the right tools, implement the perfect system, and train employees to use those tools, but data democratization best shines when employees are primed for independence, self-motivated, and understand the value of the tools at their disposal.

Read next: Top 9 Data Modeling Tools & Software 2021
