Tag Archives: Meets

Blockchain Meets Life Science: Where Trust Is A Matter Of Life Or Death


Walt Disney, Bill Gates, and Shakespeare have more in common than you might imagine: all three were invoked to capture the business imperatives embodied in the promise of blockchain technology.

This was just one of the things I learned after tuning into a recent SAP Game-Changers Radio broadcast entitled “Changing the Game in Life Sciences.” Host Bonnie D. Graham adroitly guided three experts through a fascinating exploration of blockchain’s potential to transform the life sciences industry with undreamed-of trust and efficiency for everything from drug discovery and tracking to patients’ control of their own data.

Dream it, do it

Peter Ebert, senior vice president of business development and sales at Cryptowerk Corp., had every right to quote Walt Disney’s maxim, “If you can dream it, you can do it.” I saw proof of his company’s co-innovation during a video interview at SAP TechEd demonstrating a blockchain POC to help the pharmaceutical industry better track drugs. On the radio, Ebert was unsurprisingly optimistic, comparing Disney’s vision for Mickey Mouse in 1928 with blockchain’s potential to change people’s lives.

“Blockchain will not only be a technical technology or technical thing in our lives. It will impact all our experiences,” said Ebert. “If you go to the doctor and you’re getting blood drawn or you’re taking a pill…you want to make sure that this pill is not a counterfeit, that the technology around you and the devices are not counterfeit. Think about the doctor or other people treating you—you want to make sure that they have the education [and] the skills to treat you well and correctly.”

Blockchain’s trust has special significance to #lifescience where digital assets actually mean life or death @SAPRadio 

Ebert thought blockchain’s ability to prove authenticity to any digital asset had special significance to life sciences. “You can infuse this irrefutable trust into your supply chain of digital data assets,” said Ebert. “In life sciences, digital assets actually mean life or death. They’re not just any old assets; they are very precious data that relates to your life, to my life.”

Find blockchain architects for life science

While Deloitte reported 35 percent of surveyed health and life sciences organizations plan to deploy blockchain by 2018, Eric Piscini, principal, financial services practices, injected some caveats. His inspiration was a Bill Gates quote that stated, “We always overestimate the change that will occur in the next two years, and underestimate the change that will occur in the next 10. Don’t let yourself be lulled into inaction.”

“In the next two years we’ll talk about the blockchain, and 10 years from today we will not talk about blockchain anymore because blockchain will be embedded into everything that we do,” said Piscini.

The number-one challenge is finding people who understand both blockchain and life sciences.

“You need someone who understands what blockchain is capable of, the limitations, the challenges, and the opportunities from a technology point of view,” said Piscini. “You also need someone who can understand clinical trials, content management, and adverse effect management from a business point of view, and bring all of that together.”

Love all, trust a few

Joe Miles, global vice president of life sciences at SAP, turned to Shakespeare’s quote “Love all, trust a few” to describe how blockchain can deliver trust that helps patients and the medical industry.

“Blockchain is one of the many things that has a capability to really help simplify and automate trust,” he said. “To ensure that the appropriate people are seeing your information or your business information across all the different constituents that you deal with daily in a way that is productive and efficient.”

Miles thinks blockchain can streamline clinical trials, getting lifesaving products to market faster and more safely. “How do we reduce the time from compound to approval? How do we get this in the hands of the patients who need it to save lives all over? It’s expensive, it takes a lot of years,” he said. “Blockchain presents an opportunity to streamline that process to make it more transparent.”

Follow me on Twitter, SAP Business Trends, or Facebook. Read all of my Forbes articles here.

Comments


Digitalist Magazine

New eBook! DB2 System Tuning and Optimization: Old Meets New

Even though there have been huge advancements in software and hardware technology, data infrastructure optimization is still critical to enterprises today.

Optimization saves time, resources and money – money that enables organizations to repurpose dollars in flat budgets for desired business initiatives that require additional spend.

Our latest eBook DB2 System Tuning and Optimization: Old Meets New reviews changes in system performance management, SQL query challenges and the future of managing mainframe data.


Download the eBook now to start saving time, resources and money!


Syncsort + Trillium Software Blog

When blockchain meets big data, the payoff will be huge


If there is a “sweet spot” for blockchain, it will likely be the ability to turn insights and questions into assets. Blockchains will give you greater confidence in the integrity of the data you see. Immutable entries, consensus-driven timestamping, audit trails, and certainty about the origin of data (e.g. a sensor or a kiosk) are all areas where you will see improvement as blockchain technology becomes more mainstream.
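The integrity properties listed above (immutable entries and certainty about the origin of data) can be illustrated with a toy hash chain in Python. This is a simplified sketch for intuition only, not any production blockchain; the field names and fixed timestamp are illustrative assumptions:

```python
import hashlib
import json

def block_hash(contents):
    """Deterministically hash a block's contents with SHA-256."""
    payload = json.dumps(contents, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def append_block(chain, data):
    """Append an entry linked to the previous block's hash."""
    prev = chain[-1]["hash"] if chain else "0" * 64
    block = {"data": data, "prev": prev, "ts": 0}  # fixed ts for reproducibility
    block["hash"] = block_hash({"data": data, "prev": prev, "ts": block["ts"]})
    chain.append(block)
    return chain

def verify(chain):
    """Recompute every hash; any tampered entry breaks the chain."""
    for i, block in enumerate(chain):
        expected = block_hash({"data": block["data"], "prev": block["prev"], "ts": block["ts"]})
        if block["hash"] != expected:
            return False
        if i > 0 and block["prev"] != chain[i - 1]["hash"]:
            return False
    return True

chain = []
append_block(chain, {"sensor": "kiosk-7", "reading": 21.5})
append_block(chain, {"sensor": "kiosk-7", "reading": 22.1})
assert verify(chain)

chain[0]["data"]["reading"] = 99.9  # tamper with an old entry
assert not verify(chain)
```

Because each block commits to its predecessor's hash, altering any historical entry invalidates every later link, which is the mechanism behind the "immutable entries" and audit-trail guarantees described above.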

Beyond data integrity (which is a huge component), the shared data layer that blockchains will introduce creates an entirely new set of possibilities for AI capabilities and insights. Trent McConaghy, CTO of BigChainDB, does a great job of explaining the benefits of decentralized/shared control, particularly as a foundation for AI. In this world, he says, you get:
• More data, thus improved modelling capabilities
• Qualitatively new data, leading to entirely new models.

The inherent immutability leads to more confidence in training and testing data and the models they produce.

We are also likely to see blockchain-based technology make an impact in the cost of storing data and in the amount (and quality) of data available. Cost savings in data storage will come from the disintermediation of centralized storage providers, thus reducing the “trust tax” you pay them currently. This should also create downward pricing pressure on SaaS suppliers as they move to decentralized storage providers.

You can expect to see decentralized solutions like Storj, Sia, MaidSafe, and FileCoin start to gain some initial traction in the enterprise storage space. [Disclosure: Storj was previously a client of mine.] One enterprise pilot-phase rollout indicates this decentralized approach could reduce the cost of storing data by 90 percent compared to AWS.

As for blockchain-driven AI, you can expect a three-phase rollout: first within the existing enterprise, then within the ecosystem, and finally in totally open systems. The entire industry might be termed blockchains for big data (McConaghy’s words).

Longer term, we will see an expansion of the concept of big data, as we move from proprietary data silos to blockchain-enabled shared data layers. In the first epoch of big data, power resided with those who owned the data. In the blockchain epoch of big data, power will reside with those who can access the most data (where public blockchains will ultimately defeat private blockchains) and who can gain the most insights most rapidly.

There are two significant implications here:
• Customer data will not belong to organizations, locked away in corporate databases. It will belong to each individual, represented as tokens or coins on an identity blockchain. The customer of the future will grant access to others as necessary.
• Transaction data will be viewable by anyone. Anyone can access the data about the transactions that occur on a given blockchain. (For example, here are the latest Bitcoin transactions.)

When data moves out of proprietary systems onto open blockchains, having the data itself is no longer a competitive advantage. Interpreting the data becomes the advantage. In a blockchain world, all competitors are looking at the same ledger (imagine you and your competitors all working from the same Google Sheet or Excel file).

Anyone can provide an interface to that ledger. That’s relatively easy; public block explorers already do it for Ethereum and zCash.

And many companies will provide applications that enable a customer to interact with a protocol. This is what Jaxx or BitPay do, for Bitcoin.

Yet, there are very few companies that provide a set of analytic capabilities that suck up all of this data and explain what it all means or what should be done about it. Fewer still have figured out the scalable process for doing this. This is the opportunity. Some have called it the “Data Industrialization Opportunity.”

Simply put, it is the question of who can put the best AI/machine learning solution on top of open, shared, blockchain-based data layers. Whoever does that gains some degree of competitive advantage. If a Bitcoin wallet, for example, instead of being “dumb” (as it is now) is actually “smart” — in the sense that it can advise or help customers make sense of the world (based on all the data available on the blockchain) — that one will be market leader.

The world’s top 50 physical mining companies are worth about $700 billion. You can expect to see blockchain-based data mining companies that will easily take us into trillions of dollars of market capitalization, although, granted, this may be many years off.

This article is adapted from an excerpt of the author’s new book, The CMO Primer for the Blockchain World.

Jeremy Epstein is CEO of Never Stop Marketing and author of The CMO Primer for the Blockchain World. He currently works with startups in the blockchain and decentralization space. He advises F2000 organizations on the implications of blockchain technology. Previously, he was VP of marketing at Sprinklr from Series A to “unicorn” status.


Big Data – VentureBeat

New eBook Available: Mainframe Meets Machine Learning

While we don’t expect robots to take over the world anytime soon, advances in machine learning have started and will continue to strengthen security and power the automation of mainframe operations.


Over 80% of data resides within or originates from corporate data centers. When we take into account the sensitivity of the data being stored, the demand for better security of the mainframe increases.

Additionally, the demand for automation of mainframe operations is rising. With an increase of data being created by and stored on the mainframe, along with the loss of skilled mainframe professionals, mainframe maintenance is moving beyond the capabilities of manual operations.

Download our latest eBook, “Mainframe Meets Machine Learning,” to learn about the most difficult challenges and issues facing mainframes today, and how the benefits of machine learning could help alleviate some of these issues.


Syncsort blog

AI Meets AML: How the Analytics Work

The focus on financial crime, and the money laundering that funds terrorist attacks and other criminal activities, has forced the industry to look for smarter approaches. In the previous posts in this mini-series, TJ Horan noted that AI is the newest hope for compliance, and Frank Holzenthal explored the benefits that AI can bring to compliance officers.

Now it’s my turn, and I’m going to explore the AI and machine learning technologies my team has integrated into the FICO TONBELLER Anti-Financial Crime Solutions. We have built on top of the FICO TONBELLER solutions using FICO’s battle-proven and patented artificial intelligence and machine-learning algorithms, which are used in FICO Falcon Fraud Manager to protect about two-thirds of the world’s payment card transactions.

Industry experts have begun to realize the significance of analytics in combating money laundering. For instance, Aite Group LLC in its 2015 report Global AML Vendor Evaluation noted that “increasingly, regulators recognize that rules alone are not an effective manner of detection and are pressuring banks to include more sophisticated analytics.” Our new machine-learning techniques are directed specifically towards real-time, transaction-based KYC anomaly detection and highly refined self-learning models focused on anti-money laundering SAR (suspicious activity report) detection.

As Frank noted in his post, we have integrated two main AI components into our AML products.

Soft-Clustering Behavioral Misalignment Score: Powerful customer segmentation using Bayesian learning

Traditional AML solutions resort to hard segmentation of customers based on the KYC data or sequence of behavior patterns. FICO’s approach recognizes that customers are too complex to be assigned to hard segments and need to be monitored continuously for anomalous behavior.

Using a generative model based on an unsupervised Bayesian learning technique, we take customers’ banking transactions in aggregate and generate “archetypes” of customer behavior. Each customer is a mixture of these archetypes and in real time these archetypes are adjusted with financial and non-financial activity of customers.  We find that using clustering techniques based on the customer’s archetypes allows customer clusters to be formed within their KYC hard segmentation.

Different clusters have different risk, and customers that are not within any cluster are suspicious. As transaction monitoring is applied differently based on KYC, understanding which customers are misaligned within this segment based on their transaction behavior is very important, and transaction rules may not be sufficient. Further, as customers deviate from known clusters of typical behaviors based on transaction archetypes, the Soft Clustering Misalignment score alone can be used to drive investigations and possible SAR filings.
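To make the archetype idea concrete, here is a minimal, hypothetical NumPy sketch (not FICO's actual model): each customer is softly assigned to behavior archetypes, and a misalignment score flags customers far from every archetype. The feature names, archetype centers, and temperature parameter are all illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy per-customer transaction features (e.g., cash ratio, wire-transfer share)
normal = rng.normal(loc=[0.2, 0.8], scale=0.05, size=(50, 2))
outlier = np.array([[0.9, 0.1]])  # one customer unlike the rest
X = np.vstack([normal, outlier])

# Two illustrative behavior archetypes (in practice these are learned)
archetypes = np.array([[0.2, 0.8], [0.5, 0.5]])

def loadings(x, centers, temp=0.05):
    """Soft assignment: represent the customer as a mixture of archetypes."""
    d2 = ((x - centers) ** 2).sum(axis=1)
    w = np.exp(-d2 / temp)
    return w / w.sum()

def misalignment(x, centers):
    """High when the customer is far from every archetype."""
    return ((x - centers) ** 2).sum(axis=1).min()

mix = loadings(X[0], archetypes)          # mixture weights sum to 1
scores = np.array([misalignment(x, archetypes) for x in X])
```

In this sketch the injected outlier receives the highest misalignment score, which mirrors how a customer who fits no cluster of typical behavior would be surfaced for investigation.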


AML Threat Score: Using machine learning to detect and rank-order suspected AML cases

Our AML Threat Score prioritizes investigation queues for SARs, leveraging behavioral analytics capabilities from Falcon Fraud Manager. It uses transaction profiling technology, customer behavior sorted lists (BList), and self-calibrating models that adapt to changing dynamics in the banking environment. Here’s how these work together:

  • Profiles efficiently summarize each customer’s banking transaction history into behavioral analytic features using recursive analytic algorithms, making ultra-low latency, real-time analytics possible.
  • BList maintains a weighted list of the most frequent accounts for a customer, based on regularity and frequency of money transfers to those accounts. Features constructed from BLists act as “fingerprints” of the customer’s account. Changes in BList can indicate suspicious behavior to be investigated. In a data consortium of multiple banks, BList can be used to create global “fingerprints” of good and bad accounts, allowing for quickly identifying fast-changing patterns of malicious activities that couldn’t otherwise be detected.
  • The streaming self-calibrating models are capable of automatically tracking the outlier points for each feature in real time. Then a Multi-Layered Self-Calibrating (MLSC) Score is created by combining multiple outlier models from all the features. MLSC models are structured like a neural network model, where the features in the hidden nodes are selected to minimize correlation. The weights of the model are either expert-driven or based on limited SAR data.
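As a rough illustration of the first bullet, a recursive profile can summarize a customer's history in O(1) per transaction with an exponentially decayed average, so a behavioral feature is always available at low latency. This is a hypothetical sketch under assumed parameter names, not the Falcon Fraud Manager implementation:

```python
class TransactionProfile:
    """Recursive (exponentially decayed) summary of a customer's history.

    Each update is O(1), so the feature stays current without ever
    re-scanning the full transaction history.
    """

    def __init__(self, alpha=0.2):
        self.alpha = alpha       # decay rate: higher = faster adaptation
        self.avg_amount = 0.0    # decayed mean transaction amount
        self.initialized = False

    def update(self, amount):
        if not self.initialized:
            self.avg_amount = float(amount)
            self.initialized = True
        else:
            self.avg_amount = (1 - self.alpha) * self.avg_amount + self.alpha * amount

    def anomaly_ratio(self, amount):
        """How far a new transaction deviates from the running profile."""
        return amount / self.avg_amount if self.avg_amount else 1.0

profile = TransactionProfile(alpha=0.2)
for amt in [100, 120, 95, 110]:
    profile.update(amt)

# A transaction wildly out of line with the profile scores high;
# a typical one scores near 1.0.
big = profile.anomaly_ratio(5000)
typical = profile.anomaly_ratio(105)
```

A real system would maintain many such recursive features per customer and feed them into the downstream models, but the constant-time update is the core trick behind ultra-low-latency profiling.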


These capabilities enable the AML Threat Score to analyze transactions in real time, pinpoint even previously unseen money-laundering patterns and stop them before the transactions are executed. The score can also be layered on top of current rules-based transaction monitoring to help prioritize investigations, improve detection, and lower false positives.

Multiple ways for adopting analytics using Siron®

FICO’s real-time and dynamic KYC risk assessment and SARs-filing capabilities are unmatched in the market today. Our Siron® product also provides rules-based transaction monitoring capabilities that are relied on by both financial institutions and by regulators.

Siron allows for a gradual integration of analytics into current transaction monitoring. In a “rules first” approach, these novel techniques can be used to reprioritize the rules-based alerts for more effective workload management. A “scores first, rules second” approach can detect new types of AML patterns in real time while potentially reducing the rules set. The ultimate in this value proposition is the adaptive model framework, where FICO’s self-calibrating models are fully leveraged to learn from recent SARs while the rules respond to the regulatory typologies.

FICO has had a history of firsts, from being the first to use analytics for credit application processing systems in the 1950s to being the first to use machine learning for fraud detection in 1991. With our AML analytic capabilities, we continue to be the trailblazers!

Follow me on Twitter @ScottZoldi

To find out more about how FICO is putting AI to work in AML, register for our February 23 webinar, co-presented with CEB TowerGroup, “Hiding in Plain Sight: Is Your KYC Process a Spotlight or a Blindfold? Operational Benefits of New Analytic Technologies.”


FICO

Mainframe Meets Machine Learning

While some see the mainframe as a relic, others recognize its place as the workhorse of future computing. The mainframe stands to play a pivotal role in the age of Machine Learning.

The Evolution of the Mainframe

“The mainframe is dying.” I can’t remember when the first time was that I heard people saying that, or how many times I have heard something similar since. Suffice it to say, it was a long time ago and I’ve heard it many times, and as we all know, the mainframe is alive and thriving.

In fact, the mainframe has evolved to ensure its foundational role in new trend after new trend. After going through the era of distributed computing, we are now experiencing the era of mobile computing and cloud computing. And through all this, the mainframe is still there, playing its central role as the hub of customer information and high volume transaction processing at many of the world’s largest companies.

That said, today’s mainframes face new issues and challenges, top of which are security and automation of operations.

Does Mainframe Security Go Far Enough?

Why do I think security is an issue for mainframes? More than 80% of corporate data resides on or originates from mainframes now. And data on the mainframe is typically the most valuable data for the enterprise.

Mainframes used to be perceived as highly secure and immune to attacks and data breaches because they are centralized and somewhat isolated from the rest of the world. In fact, in its recent “State of the Mainframe” survey of 187 IT strategic planners, directors/managers, architects and administrators at global enterprises with $1B or more in annual revenues, Syncsort found that 83% of respondents cited security and availability as key strengths of the mainframe. But this perception of immunity is no longer accurate; times have changed.


In today’s world of connecting “Big Iron” to “Big Data” for advanced business analytics, mainframes are connected to mobile and IoT devices, clouds and other open systems, and therefore subject to external attacks just like distributed systems.

At the same time, mainframes are also subject to internal attacks, such as employees’ malicious intent or negligence. Although z/OS mainframes remain the most secure platform compared to distributed systems, the high sensitivity of the data stored on them makes their security a constant concern.

The Challenges Driving Automated Mainframe Operations

Why is automation of operations important for mainframes? In order for mainframe machines to run smoothly and efficiently, a lot of operations are needed every day, including trouble-shooting issues and optimization of resources.

Historically, most of these jobs have been done manually by human operators. This worked fine in the past, but as the number of transactions grows exponentially, the demand on machine operations and maintenance grows equally. At the same time, the population of staff with mainframe skills is shrinking as senior mainframe engineers retire at a rate greater than young mainframers are being hired. While efforts from groups like SHARE.org are helping, the skills gap keeps expanding. The only foolproof solution to this problem is the automation of mainframe operations. Ultimately, automation could lead to self-management of mainframe systems.

A lot of work has been done to address both of the issues described above. Some progress has been made, but not as fast as the increase in complexity we are seeing.

While new security features have been implemented on the mainframe, there have also been new and even more sophisticated threats. And while some operations have been automated, the activity patterns change so quickly that the program is soon outdated.

Clearly, the traditional methods are not fast enough to keep up with the pace, and in some cases they fail entirely. Is there a way out? What if the machine could learn and adapt by itself?

The high value and sensitivity of mainframe data is driving new automated operations and security challenges for IT organizations.

Enter Machine Learning

Machine learning is a familiar concept to many. With the victory of AlphaGo, a computer Go program developed by Google DeepMind, over a top professional human Go player in Seoul, South Korea, in March 2016, “machine learning” suddenly became a hot topic for the general public.

Although it has an air of being trendy and cutting-edge, machine learning is actually not a new concept. The earliest research in machine learning dates back to the 1950s, and the field has come a long way since then. For many years, machine learning was more a matter of theoretical study than practical application. Only in recent years, backed by powerful new computing technologies, has machine learning started to reach areas that affect our daily lives.

Today, machine learning helps human beings in many different areas. Self-driving cars, computer-aided diagnosis, and the above-mentioned gaming program are just some examples. If you get an email recommending a product that happens to be exactly what you need, it’s very likely there is machine learning behind it. The question remains: in addition to these cool applications, can machine learning help enterprise executives? Can it help us mainframe engineers?

To learn about what machine learning is and how it can help us, read my full article in Enterprise Executive Magazine.


Syncsort blog

Azure Active Directory meets Power BI

Howdy folks,

We’ve heard from many of our largest customers that it’s critically important to them to have easy access to information that helps them understand how their employees and partners are using Azure Active Directory. That understanding allows them to plan their IT infrastructure, to increase usage and maximize the business value they get from Azure AD.

The usage and activity reports in the Azure admin portal are a great starting point for accessing and digesting usage trends. But many of you have told us you want the ability to gather richer insights into what’s going on with the various capabilities you rely on in Azure Active Directory. So, today I am excited to announce the new Power BI Content Pack for Azure Active Directory!

With this integration of Azure Active Directory APIs with Power BI, you can easily download pre-built content packs and dig deeper into all the activities within your Azure Active Directory, and all this data is enhanced by the rich visualization experience Power BI offers. And you can create your own dashboard and share it easily with anyone in your organization.


How to connect to the Azure AD Content Pack

  1. Log into Power BI with your Power BI Account (same account as your O365 or Azure AD Account)
  2. Select Get Data at the bottom of the left navigation pane.


  3. In the Services box, select Get.
  4. Select Azure Active Directory Activity Logs > Get.



  5. When prompted, enter your Azure AD Tenant Name


  6. For Authentication Method, select Basic > Sign In. Enter the Username and Password to complete your sign-in process

    Important Note:
    We support both basic and Federated authentication. Ensure that the user trying to download the content pack has Security Reader or Admin permissions to access the content pack.

    If you want to choose Federated Authentication, select the Authentication Method as “OAuth2”.

    Federated Logins


    Basic Authentication


  7. Enter your organization username/password for both these types of logins
  8. Power BI will retrieve your Azure AD Activities data and create a ready-to-use dashboard and report.  The reports included in this content pack are

System requirements

To view sign-in logs in the Azure Active Directory Activity content pack, you need Azure AD Premium to access the data. See more details.

Let us know what you think!

Give these new features a try and let the AAD Reporting team know what you think! We read every piece of feedback to make sure the Azure AD administration experience is top-notch, so let us know what works for you and what doesn’t. I look forward to hearing from you!


Microsoft Power BI Blog | Microsoft Power BI

Corrugated Sheet Metal Meets Automobile

Quality material shears through windshields.


“The driver of this car is really lucky to be alive and uninjured!”
Image courtesy of http://imgur.com/xuUEKHh.

The Apps They Are A-Changin’—Embedded Analytics Meets Container Technology


The International Space Station (ISS) tips the scales at 925,000 pounds—the equivalent of about 260 Ford Mustangs and about the size of a football field—quietly orbiting above our heads in the darkness of space.

But it wasn’t always that big. The ISS first started as a single module, Zarya, weighing in at a modest 42,600 pounds (only 12 Ford Mustangs). The Zarya module gave the ISS its electrical power, storage, propulsion, and guidance and the rest of the 40 or so modules were added slowly with over 115 space flights. Each module has a specific function and can often be re-configured based on need. In fact, the original Zarya module has been reduced to a storage closet in recent years.

Figure 1 – (Picture Credit: Wikipedia)

What does any of this have to do with apps? Apps today are built in the same way as the ISS. Instead of a single monolithic block that performs everything, apps are broken down into collaborative groups of processes that work together to provide complex functionality. The advantage of this modularization (or containerization as it’s called within software) is greater agility to deploy and scale applications.

We know that 96% of applications today have reporting or analytics functionality. Embedded analytics (or embedded BI) is a ripe service for containerization—it needs to integrate with a constellation of applications, services and data, and varying workloads require the ability to flexibly scale a whole environment. In order to have applications that interact and interoperate in the most efficient way, they should be organized like the ISS.

This is why we are happy to announce the release of Jaspersoft for Docker. Jaspersoft for Docker lets you run the same code in any environment, alongside all of your services, as a set of containers that can be scaled from a single developer’s laptop to a swarm of cloud servers.


The containers become our ISS modules, sets of those containers provide services, and those services can be deployed anywhere. It’s a good time to be alive and building apps!

Register for one of our Introducing Jaspersoft for Docker webinars to learn how this technology will change the way you build, release and deploy your embedded BI apps.

Let’s block ads! (Why?)

The TIBCO Blog

Big Data Meets Big Ideas: Top 4 Sessions You Can’t Miss at Splunk .conf2016

This blog is also featured on Splunk’s website.

After hearing stories of Splunk .confs past from my new colleagues at Syncsort, I am excited to attend my first one and share my pre-show research with you today. Splunk is arguably one of the greatest IT success stories of the last decade, thanks to their unique approach and disruptive tech for Big Data analytics (including correlating data from ALL critical data sources). They have taken the IT industry and Wall Street by storm, continually building on their impressive accomplishments. And I’m told they put on a great conference for their users! This year, the 7th annual Splunk Worldwide Users Conference (.conf2016), taking place in Orlando, FL from September 26th-29th, is expecting its largest attendance yet – over 5,000 Splunkers! If you plan to attend, let me steer you to a few highlights out of the many great sessions and keynotes, with special emphasis on Wednesday’s great line-up.


Re-imagining IT

Wednesday, September 28, 2016 | 9:00 AM-9:45 AM 

Hit Wednesday morning rolling with this thought-provoking keynote session from Johnathon Cervelli, AVP IT Markets Products and Practice, and a former colleague of mine, Andi Mann, who is the Chief Technology Advocate at Splunk. Glimpse the future as well as some amazing new innovations unveiled at this session from two pros with great insights to share.

Transforming Security

Wednesday, September 28, 2016 9:45 AM-10:30 AM

Splunk was named a leader in Gartner’s Security and Event Management (SIEM) Magic Quadrant for the fourth straight year and is positioned furthest overall for completeness of vision. That’s just one reason why you should stay in keynote mode Wednesday morning for this must-see security session led by Hayian Song, Splunk’s SVP, Security Markets, along with Monzy Merza (Splunk’s CSE and Director of Cyber Research) and  Mike Stone, the CIO at the UK Ministry of Defence (MoD). Security and compliance across the entire enterprise (and all geographies) has never been more critical and you’ll want to know what these experts have learned (and how to act on that knowledge).

“Splunking” Your z/OS Mainframe

Wednesday, September 28, 2016 4:35 PM-5:20 PM

If your organization has a mainframe, this session is a “must!” Whether your job is to uncover new operational insights, oversee security and compliance, be the point person on IT service intelligence, or just involves critical data, you’ll want to learn about a new paradigm for accessing critical IBM z/OS mainframe system data, correlating it with like open systems’ data, and analyzing this new enterprise-level data set using Splunk. In this session, Syncsort’s Director of Product Management for Ironstream®, Ed Hallock, will cover how to access critical IBM z/OS mainframe system security and IT operational data in Splunk® Enterprise and Splunk® Enterprise Security™ for analysis and an integrated, 360-degree view of an entire enterprise, including the “glass house.”

Introduction to Splunk IT Service Intelligence

Tuesday, September 27, 2016 10:30 AM-11:15 AM

Wednesday, September 28, 2016 1:10 PM-1:55 PM

Splunk is huge in using analytics and machine learning to enhance IT Service Intelligence (ITSI). In this session (delivered twice!), Splunk’s Principal Product Manager Alok Bhide and Staff Architect for IT Operations Analytics, David Millis will address how the status quo of old solutions and approaches can’t handle today’s complex, highly distributed service-oriented architectures. They will show you how to gain service context by combining event and performance data, getting the big picture of your environment, streamlining operations, accelerating root-cause analysis and getting ahead of customer-impacting outages. As the owner of customer insight and Big Iron-to-Big Data product management at Syncsort, I can tell you that this is so important that our team has been working around the clock on a new module to load critical mainframe data into Splunk ITSI and that we’ll have the new “Syncsort Ironstream Module for Splunk ITSI” prototype at our booth. You can add the Syncsort booth and that demo as destination number five and see it and all the Ironstream + Splunk solutions jointly developed over the past 3 years!

OK, those are 4 keynotes and sessions not to miss, so I hope you check them out with me and find them useful.  They are just a few highlights out of an incredible selection of quality content that Splunk is known for at the .conf2016 event and I look forward to seeing you all there!


Syncsort blog