Tag Archives: Meets

Mainframe Meets Machine Learning and Artificial Intelligence at SHARE Sacramento

Winter SHARE 2018 was held in Sacramento, California, the week of March 12th. Over the years the scope and tone of the SHARE conference have changed with the technologies everyone is discussing. In the past, it focused on the mainframe and how to better manage your environment. This year, SHARE focused on emerging technologies that can be used to manage and monitor your environment, the most talked-about next-generation tech being Machine Learning and Artificial Intelligence. As I listened to the sessions, it became clear that Machine Learning is where many of the attendees were particularly focused.


A Common Thread between SHARE and Computer Measurement Group

This is the fourth SHARE conference I have presented at with different organizations. For years, I have presented at a number of different conferences, both national and regional. Each conference has its own flavor and target audience. Many of the individuals that go to SHARE would not go to the Computer Measurement Group (CMG) event and vice versa. That said, this year, I came to present a topic that would be of interest at either event… the Changing Landscape of Capacity Management for the Mainframe. The basis of the topic is how Mainframe capacity managers need to ensure they are looking at the process from a business standpoint rather than just as a technical application.

Capacity Management’s Role in Machine Learning

So where does Capacity Management fit into the white-hot trend of Machine Learning? By definition, Machine Learning is “a field of computer science that gives computer systems the ability to ‘learn’ with data, without being explicitly programmed.” The key, therefore, is to provide enough meaningful data for machine learning capabilities to turn into analytics. Capacity Management delivers a vast amount of information about IT resources and their utilization, which enables machine learning programs to perform analytics in the background, such as reporting the “Time to Live” before a resource is exhausted. The key to this analysis is the setting of thresholds, whether those thresholds are static or self-learned.
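
To make the “Time to Live” idea concrete (this is an illustrative sketch, not how Athene or any particular product implements it), a capacity analyst can fit a linear trend to recent utilization samples and project when a threshold will be crossed:

```python
import numpy as np

def days_until_threshold(daily_util, threshold=90.0):
    """Fit a linear trend to recent daily utilization (%) and project
    how many days remain until the threshold is crossed.
    Returns None if utilization is flat or trending downward."""
    days = np.arange(len(daily_util))
    slope, intercept = np.polyfit(days, daily_util, 1)   # simple least-squares trend
    if slope <= 0:
        return None                                      # no exhaustion in sight
    crossing_day = (threshold - intercept) / slope       # day index where trend hits threshold
    return max(crossing_day - days[-1], 0.0)

# Example: CPU busy % creeping up roughly half a point per day
history = [62.0, 62.4, 63.1, 63.5, 64.2, 64.8, 65.1]
print(days_until_threshold(history, threshold=80.0))     # roughly a month at this rate
```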

Self-learning thresholds are where Capacity Managers want to get to within their environments. One discussion at SHARE focused on whether, with machine learning, we are eliminating the human factor. My answer is that we can never eliminate the human factor. There are dynamics within the environment that a machine cannot account for in its processing and learning; as we say many times, humans can do things we never planned for with the software. There will continue to be analysis that must be performed on the reports to ensure a variable is not missed. Syncsort, with software products like Athene and Ironstream with Splunk, is using this type of technology to enable machine learning. Over the next twelve months I think this area will see large-scale growth. The key to making this work is ensuring the quality of the data that is gathered and correlating it with both technical and business data. The business and application view is what organizations are looking to display on dashboards for all levels within the enterprise.
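
As a hedged illustration of what a self-learned threshold might look like (again, not the actual Athene or Ironstream implementation), the alert level can be derived from the metric’s own recent behavior rather than from a hard-coded number:

```python
import statistics

def self_learning_threshold(samples, window=96, k=3.0):
    """Derive an alert threshold from the metric's own recent history:
    mean + k standard deviations over a sliding window, so the limit
    adapts as the workload's "normal" drifts instead of being hard-coded."""
    recent = samples[-window:]
    return statistics.fmean(recent) + k * statistics.pstdev(recent)

history = [55, 57, 54, 60, 58, 56, 59, 61, 57, 58]   # e.g. 15-minute CPU busy %
limit = self_learning_threshold(history, window=8)

new_sample = 79                                       # latest interval spikes
if new_sample > limit:
    print(f"anomaly: {new_sample} exceeds learned threshold {limit:.1f}")
```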

One of the benefits of going to a conference is seeing what the vendors are presenting in the exhibition hall, along with meeting up with friends and former colleagues. At each conference you have the stalwarts, including Syncsort, BMC, ASG, IntelliMagic, Velocity and IBM, and every year some new companies show up. Throughout my long career, I have either worked with many of the people in the exhibition hall or know them from the industry. The nice thing about these relationships is that we talk about how our jobs are changing within the industry and the new technologies affecting us.

Conferences such as SHARE continue to provide a great venue for all attendees to share thoughts and insights about what’s changing and what technologies can help address the challenges and opportunities that come with that change.

Download our free eBook, “Mainframe Meets Machine Learning“, to learn about the most difficult challenges and issues facing mainframes today, and how the benefits of machine learning could help alleviate some of these issues.


Syncsort + Trillium Software Blog

How 4Finance Meets AML Rules in 10 Countries


Meeting the separate AML compliance requirements of multiple countries can be a nightmare, not only because of differences in rules but also because of differences in data sources. To meet this challenge, global fintech 4finance, based in the Baltics, turned to FICO, and their results earned them a FICO Decisions Award for regulatory compliance. Here’s what they did.

Meeting the EU ML Directives

Being regulated in the EU, 4finance had to comply with the 4th EU Money Laundering (ML) Directive as well as the upcoming 5th Directive. Speed was an issue because the risk of non-compliance was high and the company’s reputation had to be protected.

Complying with the EU’s ML directive means:

  • Identifying politically exposed persons (PEP)
  • Preventing sanctioned persons and entities from doing business with 4finance
  • Detecting loan applications and businesses related to money laundering
  • Detecting any kind of unusual behavior

From a technical perspective, 4finance faced a distributed environment, with the need to support 10 countries with 20 data source systems. As a fintech lender, it also had to account for short application cycles – 4finance has a goal to issue a loan within 15 minutes.

How FICO TONBELLER Helped

Due to the short timeline for compliance, 4finance decided to use an in-the-cloud implementation of the FICO TONBELLER Siron solution, leveraging Amazon Web Services (AWS).

We defined a project plan to support every phase of the project with senior staff members. These included:

  • A project manager to align with 4Finance on the milestones, achievements and also the escalation process.
  • A technical consultant to work on data interfaces during onboarding – how to integrate Siron®KYC into the existing application process. For transaction monitoring with Siron®AML, we also defined a data mapping to 4finance’s 20 data source systems.

Working closely with 4finance, we beat the deadlines and went live with a cloud-based implementation in less than four months.

Reacting to New Threats

Today 4finance is capable of using the solution without our support. If new money laundering risks show up, 4finance’s team can quickly refine detection scenarios and define new ones. From my point of view, that’s a great success story.

I am proud to be giving a presentation on that at our FICO World 2018 conference in April, together with 4finance. Join us!

Our Siron customers are also invited to listen to 4finance during our annual Siron® User Group Conference in Berlin on March 15-16.


FICO

Top 5 Fraud & Security Posts: AI Meets AML (and Hackers)

One of the newer applications of artificial intelligence rose to the top of the Fraud & Security blog last year: anti-money laundering. Readers were also keenly interested in learning more about cybersecurity and ATM compromise trends.

Here were the top 5 posts of 2017 in the Fraud & Security category:


As FICO began using AI to detect money laundering patterns, three of our business leaders blogged about why and how AI was being applied. Two of the top posts of the year related to this topic, with TJ Horan explaining why advanced analytics are needed. He noted three ways AI systems improve on traditional AML solutions:

  • More effective than rules-based systems: “As regulations become ever more demanding, the rules-based systems grow more and more complex with hundreds of rules driving know your customer (KYC) activity and Suspicious Activity Report (SAR) filing. As more rules get added, more and more cases get flagged for investigation while false positive rates keep increasing. Sophisticated criminals learn how to work around the transaction monitoring rules, avoiding known suspicious patterns of behavior.”
  • Powerful customer segmentation: “Traditional AML solutions resort to hard segmentation of customers based on the KYC data or sequence of behavior patterns. FICO’s approach recognizes that customers are too complex to be assigned to hard-and-fast segments, and need to be monitored continuously for anomalous behavior.”
  • Rank-ordering of AML alarms: “Using machine learning technology, FICO has also created an AML Threat Score that prioritizes investigation queues for SARs, leveraging behavioral analytics capability from Falcon Fraud Manager. This is a significant improvement, since finding true money-laundering behavior among tens of thousands of SARs is a true needle-in-the-haystack analogy.”

FICO Chief Analytics Officer Dr. Scott Zoldi followed up by explaining two techniques FICO has applied to AML, soft clustering and behavior-sorted lists. Of the first, he wrote:

“Using a generative model based on an unsupervised Bayesian learning technique, we take customers’ banking transactions in aggregate and generate “archetypes” of customer behavior. Each customer is a mixture of these archetypes and in real time these archetypes are adjusted with financial and non-financial activity of customers.  We find that using clustering techniques based on the customer’s archetypes allows customer clusters to be formed within their KYC hard segmentation.”


Read the series on AI and AML


The latest trend in cybersecurity is enterprise security ratings that benchmark a firm’s cybersecurity posture over time, and also against other organizations. Sarah Rutherford explained exactly what these scores measure.

“The cybersecurity posture of an organisation refers to its overall cybersecurity strength,” she wrote. “This expresses the relative security of your IT estate, particularly as it relates to the internet and its vulnerability to outside threats.

“Hardware and software, and how they are managed through policies, procedures or controls, are part of cybersecurity and can be referred to individually as such. Referring to any of these aspects individually is talking about cybersecurity, but to understand the likelihood of a breach a more holistic approach must be taken and an understanding of the cybersecurity posture developed. This includes not only the state of the IT infrastructure, but also the state of practices, processes, and human behaviours. These are harder to measure, but can be reliably inferred from observation.”

Read the full post


2017 saw a continued increase in compromised ATMs in the US. As TJ Horan reported:

  • The number of payment cards compromised at monitored U.S. ATMs and merchants rose 70 percent in 2016.
  • The number of hacked card readers at U.S. ATMs, restaurants and merchants rose 30 percent in 2016. This new data follows a 546 percent increase in compromised ATMs from 2014 to 2015.

TJ also provided tips for consumers using ATMs.

Read the full post


With all the focus on data breaches, Sarah Rutherford posed the question of why hackers do what they do, and provided the top three reasons:

  1. For financial gain
  2. To make a political or social point
  3. For the intellectual challenge

“The ‘why’ of cybercrime is complex,” she added. “In addition to the motivations already mentioned, hackers could also be motivated by revenge or wish to spy to gain commercial or political advantage. The different motivational factors for hacking can coincide; a hacker who is looking for an intellectual challenge may be happy to also use their interest to make money or advance their political agenda.”

Read the full post

Follow this blog for our 2018 insights into fraud, financial crime and cybersecurity.


FICO

Blockchain Meets Life Science: Where Trust Is A Matter Of Life Or Death


Walt Disney, Bill Gates, and Shakespeare have more in common than anyone could imagine, united by the business imperatives embodied in the promise of blockchain technology.

This was just one of the things I learned after tuning into a recent SAP Game-Changers Radio broadcast entitled “Changing the Game in Life Sciences.” Host Bonnie D. Graham adroitly guided three experts through a fascinating exploration of blockchain’s potential to transform the life sciences industry with undreamed-of trust and efficiency for everything from drug discovery and tracking, to patient control of their own data.

Dream it, do it

Peter Ebert, senior vice president of business development and sales at Cryptowerk Corp., had every right to quote Walt Disney’s maxim, “If you can dream it, you can do it.” I saw proof of his company’s co-innovation during a video interview at SAP TechEd demonstrating a blockchain POC to help the pharmaceutical industry better track drugs. On the radio, Ebert was unsurprisingly optimistic, comparing Disney’s vision for Mickey Mouse in 1928 with blockchain’s potential to change people’s lives.

“Blockchain will not only be a technical technology or technical thing in our lives. It will impact all our experiences,” said Ebert. “If you go to the doctor and you’re getting blood drawn or you’re taking a pill…you want to make sure that this pill is not a counterfeit, that the technology around you and the devices are not counterfeit. Think about the doctor or other people treating you—you want to make sure that they have the education [and] the skills to treat you well and correctly.”


Ebert thought blockchain’s ability to prove authenticity to any digital asset had special significance to life sciences. “You can infuse this irrefutable trust into your supply chain of digital data assets,” said Ebert. “In life sciences, digital assets actually mean life or death. They’re not just any old assets; they are very precious data that relates to your life, to my life.”

Find blockchain architects for life science

While Deloitte reported that 35 percent of surveyed health and life sciences organizations plan to deploy blockchain by 2018, Eric Piscini, a principal in Deloitte’s financial services practice, injected some caveats. His inspiration was a Bill Gates quote: “We always overestimate the change that will occur in the next two years, and underestimate the change that will occur in the next 10. Don’t let yourself be lulled into inaction.”

“In the next two years we’ll talk about the blockchain, and 10 years from today we will not talk about blockchain anymore because blockchain will be embedded into everything that we do,” said Piscini.

The number-one challenge is finding people who understand both blockchain and life sciences.

“You need someone who understands what blockchain is capable of, the limitations, the challenges, and the opportunities from a technology point of view,” said Piscini. “You also need someone who can understand clinical trials, content management, and adverse effect management from a business point of view, and bring all of that together.”

Love all, trust a few

Joe Miles, global vice president of life sciences at SAP, turned to Shakespeare’s quote “Love all, but trust a few,” to describe how blockchain can deliver trust that helps patients and the medical industry.

“Blockchain is one of the many things that has a capability to really help simplify and automate trust,” he said. “To ensure that the appropriate people are seeing your information or your business information across all the different constituents that you deal with daily in a way that is productive and efficient.”

Miles thinks blockchain can streamline clinical trials, getting lifesaving products to market faster and more safely. “How do we reduce the time from compound to approval? How do we get this in the hands of the patients who need it to save lives all over? It’s expensive, it takes a lot of years,” he said. “Blockchain presents an opportunity to streamline that process to make it more transparent.”

Follow me on Twitter, SAP Business Trends, or Facebook. Read all of my Forbes articles here.



Digitalist Magazine

New eBook! DB2 System Tuning and Optimization: Old Meets New

Even though there have been huge advancements in software and hardware technology, data infrastructure optimization is still critical to enterprises today.

Optimization saves time, resources and money – money that enables organizations to repurpose dollars in flat budgets for desired business initiatives that require additional spend.

Our latest eBook, “DB2 System Tuning and Optimization: Old Meets New,” reviews changes in system performance management, SQL query challenges and the future of managing mainframe data.


Download the eBook now to start saving time, resources and money!


Syncsort + Trillium Software Blog

When blockchain meets big data, the payoff will be huge


If there is a “sweet spot” for blockchain, it will likely be the ability to turn insights and questions into assets. Blockchains will give you greater confidence in the integrity of the data you see. Immutable entries, consensus-driven timestamping, audit trails, and certainty about the origin of data (e.g. a sensor or a kiosk) are all areas where you will see improvement as blockchain technology becomes more mainstream.
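
To make the integrity point concrete, the essential property is that each record commits to the one before it, so later tampering is detectable. Here is a toy Python sketch of a hash-chained audit trail (illustrative only; a real blockchain adds distributed consensus and replication on top of this idea):

```python
import hashlib, json, time

def append_entry(chain, payload):
    """Append a record whose hash commits to the previous entry,
    so any later modification breaks the chain."""
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    body = {"payload": payload, "timestamp": time.time(), "prev_hash": prev_hash}
    body["hash"] = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
    chain.append(body)
    return body

def verify(chain):
    """Recompute every hash and link; returns False if any entry was altered."""
    prev_hash = "0" * 64
    for entry in chain:
        body = {k: v for k, v in entry.items() if k != "hash"}
        expected = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
        if entry["hash"] != expected or body["prev_hash"] != prev_hash:
            return False
        prev_hash = entry["hash"]
    return True

chain = []
append_entry(chain, {"sensor": "kiosk-17", "reading": 42})
append_entry(chain, {"sensor": "kiosk-17", "reading": 43})
chain[0]["payload"]["reading"] = 999        # tamper with history...
print(verify(chain))                        # ...and verification fails: False
```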

Beyond data integrity (which is a huge component), the shared data layer that blockchains will introduce creates an entirely new set of possibilities for AI capabilities and insights. Trent McConaghy, CTO of BigchainDB, does a great job of explaining the benefits of decentralized/shared control, particularly as a foundation for AI. In this world, he says, you get:

  • More data, thus improved modelling capabilities
  • Qualitatively new data, leading to entirely new models

The inherent immutability leads to more confidence in training and testing data and the models they produce.

We are also likely to see blockchain-based technology make an impact in the cost of storing data and in the amount (and quality) of data available. Cost savings in data storage will come from the disintermediation of centralized storage providers, thus reducing the “trust tax” you pay them currently. This should also create downward pricing pressure on SaaS suppliers as they move to decentralized storage providers.

You can expect to see decentralized solutions like Storj, Sia, MaidSafe, and FileCoin start to gain some initial traction in the enterprise storage space. [Disclosure: Storj was previously a client of mine.] One enterprise pilot phase rollout indicates this decentralized approach could reduce the costs of storing data by 90 percent compared to AWS.

As for blockchain-driven AI, you can expect to see a three-phase rollout: first within the existing enterprise, then within the ecosystem, and finally in totally open systems. The entire industry might be termed “blockchains for big data” (McConaghy’s words).

Longer term, we will see an expansion of the concept of big data, as we move from proprietary data silos to blockchain-enabled shared data layers. In the first epoch of big data, power resided with those who owned the data. In the blockchain epoch of big data, power will reside with those who can access the most data (where public blockchains will ultimately defeat private blockchains) and who can gain the most insights most rapidly.

There are two significant implications here:
  • Customer data will not belong to organizations, locked away in corporate databases. It will belong to each individual, represented as tokens or coins on an identity blockchain. The customer of the future will grant access to others as necessary.
  • Transaction data will be viewable by anyone. Anyone can access the data about the transactions that occur on a given blockchain. (For example, here are the latest Bitcoin transactions.)

When data moves out of proprietary systems onto open blockchains, having the data itself is no longer a competitive advantage. Interpreting the data becomes the advantage. In a blockchain world, all competitors are looking at the same ledger (imagine you and your competitors all have access to the same Google Sheet or Excel file).

Anyone can provide an interface to that ledger. That’s relatively easy. That’s what you see here for Ethereum or zCash.

And many companies will provide applications that enable a customer to interact with a protocol. This is what Jaxx or BitPay do, for Bitcoin.

Yet, there are very few companies that provide a set of analytic capabilities that suck up all of this data and explain what it all means or what should be done about it. Fewer still have figured out the scalable process for doing this. This is the opportunity. Some have called it the “Data Industrialization Opportunity.”

Simply put, it is the question of who can put the best AI/machine learning solution on top of open, shared, blockchain-based data layers. Whoever does that gains some degree of competitive advantage. If a Bitcoin wallet, for example, instead of being “dumb” (as it is now) is actually “smart” — in the sense that it can advise or help customers make sense of the world (based on all the data available on the blockchain) — that one will be market leader.

The world’s top 50 physical mining companies are worth about $700 billion. You can expect to see blockchain-based data mining companies that will easily take us into trillions of dollars of market capitalization, although, granted, this may be many years off.

This article is adapted from an excerpt of the author’s new book, The CMO Primer for the Blockchain World.

Jeremy Epstein is CEO of Never Stop Marketing and author of The CMO Primer for the Blockchain World. He currently works with startups in the blockchain and decentralization space. He advises F2000 organizations on the implications of blockchain technology. Previously, he was VP of marketing at Sprinklr from Series A to “unicorn” status.


Big Data – VentureBeat

New eBook Available: Mainframe Meets Machine Learning

While we don’t expect robots to take over the world anytime soon, advances in machine learning have started and will continue to strengthen security and power the automation of mainframe operations.


Over 80% of data resides within or originates from corporate data centers. When we take into account the sensitivity of the data being stored, the demand for better security of the mainframe increases.

Additionally, the demand for automation of mainframe operations is rising. With an increase of data being created by and stored on the mainframe, along with the loss of skilled mainframe professionals, mainframe maintenance is moving beyond the capabilities of manual operations.

Download our latest eBook, “Mainframe Meets Machine Learning,” to learn about the most difficult challenges and issues facing mainframes today, and how the benefits of machine learning could help alleviate some of these issues.


Syncsort blog

AI Meets AML: How the Analytics Work

The focus on financial crime, and the money laundering that funds terrorist attacks and other criminal activities, has forced the industry to look for smarter approaches. In the previous posts in this mini-series, TJ Horan noted that AI is the newest hope for compliance, and Frank Holzenthal explored the benefits that AI can bring to compliance officers.

Now it’s my turn, and I’m going to explore the AI and machine learning technologies my team has integrated into the FICO TONBELLER Anti-Financial Crime Solutions. We have built on top of the FICO TONBELLER solutions using FICO’s battle-proven and patented artificial intelligence and machine-learning algorithms, which are used in FICO Falcon Fraud Manager to protect about two-thirds of the world’s payment card transactions.

Industry experts have begun to realize the significance of analytics in combating money laundering. For instance, Aite Group LLC in its 2015 report Global AML Vendor Evaluation noted that “increasingly, regulators recognize that rules alone are not an effective manner of detection and are pressuring banks to include more sophisticated analytics.” Our new machine-learning techniques are directed specifically towards real-time, transaction-based KYC anomaly detection and highly refined self-learning models focused on anti-money laundering SAR (suspicious activity report) detection.

As Frank noted in his post, we have integrated two main AI components into our AML products.

Soft-Clustering Behavioral Misalignment Score: Powerful customer segmentation using Bayesian learning

Traditional AML solutions resort to hard segmentation of customers based on the KYC data or sequence of behavior patterns. FICO’s approach recognizes that customers are too complex to be assigned to hard segments and need to be monitored continuously for anomalous behavior.

Using a generative model based on an unsupervised Bayesian learning technique, we take customers’ banking transactions in aggregate and generate “archetypes” of customer behavior. Each customer is a mixture of these archetypes and in real time these archetypes are adjusted with financial and non-financial activity of customers.  We find that using clustering techniques based on the customer’s archetypes allows customer clusters to be formed within their KYC hard segmentation.

Different clusters have different risk, and customers that are not within any cluster are suspicious. As transaction monitoring is applied differently based on KYC, understanding which customers are misaligned within this segment based on their transaction behavior is very important, and transaction rules may not be sufficient. Further, as customers deviate from known clusters of typical behaviors based on transaction archetypes, the Soft Clustering Misalignment score alone can be used to drive investigations and possible SAR filings.
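
FICO has not published the implementation details, but as a rough stand-in for the archetype idea, one can treat each customer’s aggregated transaction-category counts like a “document” and learn soft archetype mixtures with an unsupervised Bayesian technique such as latent Dirichlet allocation; customers whose mixtures sit far from their KYC segment’s norm become review candidates. A hedged sketch using scikit-learn (synthetic data and hypothetical categories, not FICO’s model):

```python
import numpy as np
from sklearn.decomposition import LatentDirichletAllocation

# Synthetic data: rows = customers, columns = transaction counts per category
# (wire-out, cash deposit, card spend, FX, ...), aggregated over a period.
rng = np.random.default_rng(0)
tx_counts = rng.poisson(lam=5, size=(200, 6))

lda = LatentDirichletAllocation(n_components=4, random_state=0)
archetype_mix = lda.fit_transform(tx_counts)    # each row: the customer's mixture over archetypes

# Crude "misalignment": distance from the average mixture of the customer's
# segment (here all 200 customers are treated as one KYC segment for simplicity).
segment_mean = archetype_mix.mean(axis=0)
misalignment = np.linalg.norm(archetype_mix - segment_mean, axis=1)
review_queue = np.argsort(misalignment)[-5:]    # five customers least like the segment norm
print(review_queue, misalignment[review_queue])
```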


AML Threat Score: Using machine learning to detect and rank-order suspected AML cases

Our AML Threat Score prioritizes investigation queues for SARs, leveraging behavioral analytics capabilities from Falcon Fraud Manager. It uses transaction profiling technology, customer behavior sorted lists (BList), and self-calibrating models that adapt to changing dynamics in the banking environment. Here’s how these work together:

  • Profiles efficiently summarize each customer’s banking transaction history into behavioral analytic features using recursive analytic algorithms, making ultra-low latency, real-time analytics possible.
  • BList maintains a weighted list of the most frequent accounts for a customer, based on regularity and frequency of money transfers to those accounts. Features constructed from BLists act as “fingerprints” of the customer’s account. Changes in BList can indicate suspicious behavior to be investigated. In a data consortium of multiple banks, BList can be used to create global “fingerprints” of good and bad accounts, making it possible to quickly identify fast-changing patterns of malicious activity that couldn’t otherwise be detected (a toy sketch of the BList idea follows this list).
  • The streaming self-calibrating models are capable of automatically tracking the outlier points for each feature in real time. Then a Multi-Layered Self-Calibrating (MLSC) Score is created by combining multiple outlier models from all the features. MLSC models are structured like a neural network model, where the features in the hidden nodes are selected to minimize correlation. The weights of the model are either expert-driven or based on limited SAR data.
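
The precise BList algorithm isn’t spelled out here; the following toy sketch captures the basic idea under a simple assumption of frequency-plus-recency weights with exponential decay, so a customer’s regular payees float to the top and a transfer to an account far outside that list stands out:

```python
from collections import defaultdict

class BehaviorSortedList:
    """Toy behavior-sorted list: per-customer weights over counterparty
    accounts, decayed on every transaction so regular, recent payees rank
    highest. A transfer to an account well outside the top-N is unusual."""

    def __init__(self, decay=0.95, top_n=10):
        self.decay = decay
        self.top_n = top_n
        self.weights = defaultdict(float)

    def update(self, counterparty):
        for acct in self.weights:
            self.weights[acct] *= self.decay       # older activity fades
        self.weights[counterparty] += 1.0          # reinforce the current payee
        return self.is_unusual(counterparty)

    def top(self):
        ranked = sorted(self.weights.items(), key=lambda kv: kv[1], reverse=True)
        return [acct for acct, _ in ranked[: self.top_n]]

    def is_unusual(self, counterparty):
        return counterparty not in self.top()

blist = BehaviorSortedList(top_n=3)
for acct in ["rent", "utility", "grocer", "rent", "utility", "rent"]:
    blist.update(acct)
print(blist.top())                   # ['rent', 'utility', 'grocer']
print(blist.is_unusual("offshore"))  # True: a never-seen counterparty
```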


These capabilities enable the AML Threat Score to analyze transactions in real time, pinpoint even previously unseen money-laundering patterns and stop them before the transactions are executed. The score can also be used to layer on top of current rules-based transaction monitoring to help with prioritization of investigation, to improve the detection and lower false positives.

Multiple ways for adopting analytics using Siron®

FICO’s real-time and dynamic KYC risk assessment and SAR-filing capabilities are unmatched in the market today. Our Siron® product also provides rules-based transaction monitoring capabilities that are relied on by both financial institutions and regulators.

Siron allows for a gradual integration of analytics into current transaction monitoring. In a “rules first” approach, these novel techniques can be used to reprioritize the rules-based alerts for more effective workload management. A “scores first, rules second” approach can detect new types of AML patterns in real time while potentially reducing the rules set. The ultimate in this value proposition is the adaptive model framework, where FICO’s self-calibrating models are fully leveraged to learn from recent SARs while the rules respond to the regulatory typologies.
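
As a small illustration of why the “rules first” approach is low-friction (hypothetical alert records and field names, not Siron’s actual interface): reprioritizing rule-generated alerts by a model score is essentially a sort, so it layers onto existing monitoring without changing the rules themselves:

```python
# Hypothetical alerts produced by existing transaction-monitoring rules,
# re-ranked by a model score so investigators work the riskiest cases first.
alerts = [
    {"case_id": "A-101", "rule": "high-value-wire", "threat_score": 412},
    {"case_id": "A-102", "rule": "structuring",     "threat_score": 903},
    {"case_id": "A-103", "rule": "dormant-account", "threat_score": 655},
]

queue = sorted(alerts, key=lambda a: a["threat_score"], reverse=True)
for rank, alert in enumerate(queue, start=1):
    print(rank, alert["case_id"], alert["rule"], alert["threat_score"])
```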

FICO has had a history of firsts, from being the first to use analytics for credit application processing systems in the 1950s to being the first to use machine learning for fraud detection in 1991. With our AML analytic capabilities, we continue to be the trailblazers!

Follow me on Twitter @ScottZoldi

To find out more about how FICO is putting AI to work in AML, register for our February 23 webinar, co-presented with CEB TowerGroup, “Hiding in Plain Sight: Is Your KYC Process a Spotlight or a Blindfold? Operational Benefits of New Analytic Technologies.”


FICO

Mainframe Meets Machine Learning

While some see the mainframe as a relic, others recognize its place as the workhorse of future computing. The mainframe stands to play a pivotal role in the age of Machine Learning.

The Evolution of the Mainframe

“The mainframe is dying.” I can’t remember when the first time was that I heard people saying that, or how many times I have heard something similar since. Suffice it to say, it was a long time ago and I’ve heard it many times, and as we all know, the mainframe is alive and thriving.

In fact, the mainframe has evolved to ensure its foundational role in new trend after new trend. After going through the era of distributed computing, we are now experiencing the era of mobile computing and cloud computing. And through all this, the mainframe is still there, playing its central role as the hub of customer information and high volume transaction processing at many of the world’s largest companies.

That said, today’s mainframes face new issues and challenges, top of which are security and automation of operations.

Does Mainframe Security Go Far Enough?

Why do I think security is an issue for mainframes? More than 80% of corporate data resides on or originates from mainframes now. And data on the mainframe is typically the most valuable data an enterprise has.

Mainframes used to be perceived as highly secure and immune to attacks and data breaches because they are centralized and somewhat isolated from the rest of the world. In fact, in its recent “State of the Mainframe” survey of 187 IT strategic planners, directors/managers, architects and administrators at global enterprises with $1B or more in annual revenues, Syncsort found that 83% of respondents cited security and availability as key strengths of the mainframe. But that perception of immunity is no longer accurate; times have changed.


In today’s world of connecting “Big Iron” to “Big Data” for advanced business analytics, mainframes are connected to mobile and IoT devices, clouds and other open systems, and therefore subject to external attacks just like distributed systems.

At the same time, mainframes are also subject to internal attacks such as employee’s malicious intents or negligence. Although z/OS mainframes remain the most secure platform compared to distributed systems, due to the high sensitivity of the data stored on them, their security is always a major concern.

The Challenges Driving Automated Mainframe Operations

Why is automation of operations important for mainframes? In order for mainframe machines to run smoothly and efficiently, a lot of operations are needed every day, including trouble-shooting issues and optimization of resources.

Historically, most of these jobs have been done manually by human operators. This worked fine in the past, but as the number of transactions grows exponentially, the demand on machine operations and maintenance grows along with it. At the same time, the population of staff with mainframe skills is shrinking as senior mainframe engineers retire at a rate greater than young mainframers are being hired. While efforts from groups like SHARE.org are helping, the skills gap keeps expanding. The only foolproof solution to this problem is the automation of mainframe operations. Ultimately, automation could lead to self-management of mainframe systems.

A lot of work has been done to address both of the issues described above. Some progress has been made, but not as fast as the increase in complexity we are seeing.

While new security features have been implemented on the mainframe, there have also been new and even more sophisticated threats. And while some operations have been automated, the activity patterns change so quickly that the program is soon outdated.

It is apparent that traditional methods are not fast enough to keep up with the pace, and in some cases they cannot cope at all. Is there a way out? What if the machine could learn and adapt by itself?

The high value and sensitivity of mainframe data is driving new automated operations and security challenges for IT organizations.

Enter Machine Learning

Machine learning is a familiar concept to many. With the victory of AlphaGo, a computer Go program developed by Google DeepMind, over a top professional human Go player in a match played in Seoul, South Korea between 9 and 15 March 2016, “machine learning” suddenly became a hot topic for the general public.

Although it gives the impression of being trendy and cutting-edge, machine learning is actually not a new concept. Early machine learning research dates back to the 1950s, and the field has come a long way since then. For many years, machine learning was more a matter of theoretical study than practical application. Only in recent years, backed by powerful new computing technologies, has machine learning started to move into areas that have an impact on our daily lives.

Today, machine learning helps human beings in many different areas. Self-driving cars, computer-aided diagnosis, and the above-mentioned Go program are just some examples. If you get an email recommending a product that happens to be exactly what you need, it’s very likely there is machine learning behind it. The question remains: beyond these cool applications, can machine learning help enterprise executives? Can it help us mainframe engineers?

To learn about what machine learning is and how it can help us, read my full article in Enterprise Executive Magazine.


Syncsort blog

Azure Active Directory meets Power BI

Howdy folks,

We’ve heard from many of our largest customers that it’s critically important to them to have easy access to information that helps them understand how their employees and partners are using Azure Active Directory. That understanding allows them to plan their IT infrastructure, to increase usage and maximize the business value they get from Azure AD.

The usage and activity reports in the Azure admin portal are a great starting point for accessing and digesting usage trends. But many of you have told us you want the ability to gather richer insights into what’s going on with the various capabilities you rely on in Azure Active Directory. So, today I am excited to announce the new Power BI Content Pack for Azure Active Directory!

With this integration of Azure Active Directory APIs with Power BI, you can easily download pre-built content packs and dig deeper into all the activities within your Azure Active Directory, and all this data is enhanced by the rich visualization experience Power BI offers. And you can create your own dashboard and share it easily with anyone in your organization.


How to connect to the Azure AD Content Pack

  1. Log into Power BI with your Power BI Account (same account as your O365 or Azure AD Account)
  2. Select Get Data at the bottom of the left navigation pane.


  3. In the Services box, select Get.
  4. Select Azure Active Directory Activity Logs > Get.



  5. When prompted, enter your Azure AD Tenant Name


  6. For Authentication Method, select Basic > Sign In. Enter the Username and Password to complete your sign-in process

    Important Note:
    We support both basic and Federated authentication. Ensure that the user trying to download the content pack has Security Reader or Admin permissions to access the content pack.

    If you want to choose Federated Authentication, select the Authentication Method as “OAuth2”.

    Federated Logins

    Basic Authentication


  7. Enter your organization username/password for both these types of logins
  8. Power BI will retrieve your Azure AD Activities data and create a ready-to-use dashboard and report.

System requirements

To view sign-in logs in the Azure Active Directory Activity content pack, you need Azure AD Premium to access the data. See more details.
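
The content pack handles data retrieval for you, but if you prefer to pull the same sign-in activity programmatically (for example, to build a custom dataset), here is a hedged sketch using the Microsoft Graph sign-in logs endpoint. It assumes an Azure AD app registration with the AuditLog.Read.All application permission, a tenant with Azure AD Premium, and the msal and requests Python packages; it is not part of the content pack itself:

```python
import msal, requests

TENANT = "contoso.onmicrosoft.com"          # your Azure AD tenant
CLIENT_ID = "<app-registration-client-id>"  # hypothetical app registration
CLIENT_SECRET = "<client-secret>"

app = msal.ConfidentialClientApplication(
    CLIENT_ID,
    authority=f"https://login.microsoftonline.com/{TENANT}",
    client_credential=CLIENT_SECRET,
)
token = app.acquire_token_for_client(scopes=["https://graph.microsoft.com/.default"])

resp = requests.get(
    "https://graph.microsoft.com/v1.0/auditLogs/signIns?$top=50",
    headers={"Authorization": f"Bearer {token['access_token']}"},
)
for event in resp.json().get("value", []):
    print(event["createdDateTime"], event["userPrincipalName"], event["appDisplayName"])
```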

Let us know what you think!

Give these new features a try and let the AAD Reporting team know what you think! We read every piece of feedback to make sure the Azure AD administration experience is top-notch, so let us know what works for you and what doesn’t. I look forward to hearing from you!


Microsoft Power BI Blog | Microsoft Power BI