3 Things About Machine Learning Every Marketer Needs to Know

Got data?

I bet you do.

Mountains of data, in fact. Terabytes of data. Libraries’ worth of data. With more streaming in every hour of every day.

We marketers love our data, but, let’s face it … we probably only use a fraction of the data we collect.

It’s not that we don’t want to use more of it. We do.

It would be fantastic, for example, to follow each and every customer around, to see everything they read, how long they read it for, and where they clicked next. You might even want to drop a cookie on their computer and see all the other websites they went to. You could survey them, too, and send them personal messages on social media. You could test for the best time to send them messages, and for the channel they respond to best.

Then, with all that wonderful knowledge, you could hole up in your office and design a complete soup-to-nuts marketing strategy just for them.

I’m not talking about something like account-based marketing, where your work is for one big target company. I’m talking about a totally personalized, hand-crafted marketing strategy and execution for every single possible prospect your company could have.

Just think of it: thousands of completely personalized marketing plans. Tens of thousands of personalized messages. Hundreds of thousands of hours poring over the data, studying exactly how each and every single prospect behaves.

That’d be great, right?

Well, if you had unlimited time and unlimited resources, maybe. If you never had to sleep, and had no family and no life … and the assurance that you’d live to be at least 312.

Otherwise … forget it.

The idea that we could focus that closely and process every little bit of data we have about our prospects and customers is laughable. Delusional.

We are not machines.

At most, we only have enough resources to segment our audiences. We have to create personas and buyer journeys based on our best guesses (informed by the data, of course).

But what if machines could do all that?

What if a well-trained algorithm could follow each one of your prospects around, recommend the perfect piece of content, and send it to them at the perfect time, through the channel they’d be most likely to respond to? And what if the algorithm could even predict the perfect time for your ace salesperson to finally give them a call?

That’s what machine learning can do.

Here’s what you need to know about it (at least for starters).

Machine learning is a subset of artificial intelligence.

At its simplest, machine learning is nothing more than “using data to answer questions.” Hat tip to Google’s superb video series on machine learning for that definition.

It’s a specific type ‒ or discipline, if you will ‒ of artificial intelligence. One of its strengths is that a machine learning algorithm’s accuracy can improve over time. It can “learn.” So, while a program that can play chess might be considered artificial intelligence, a program that can learn to play chess, and ping pong, and any other game, would be an example of machine learning.

More complicated machine learning systems are often described as “deep learning.” In the game example, a deep learning system would process its inputs through multiple layers ‒ structures called “neural networks” ‒ to do its learning.

Picture a Venn diagram: deep learning sits inside machine learning, which in turn sits inside artificial intelligence.
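If you want to see the core idea in code, here’s a minimal sketch in Python using scikit-learn. The prospect data and conversion labels are invented for illustration; the point is that the algorithm learns the pattern from examples rather than being programmed with explicit rules.

```python
# A toy illustration of "using data to answer questions."
# Hypothetical inputs: [page_views, email_opens] per prospect;
# label 1 means the prospect converted.
from sklearn.linear_model import LogisticRegression

X = [[2, 0], [15, 4], [3, 1], [22, 7], [1, 0], [18, 5]]
y = [0, 1, 0, 1, 0, 1]

model = LogisticRegression().fit(X, y)   # the algorithm "learns" from examples
print(model.predict([[20, 6]]))          # question: will this prospect convert?
print(model.predict_proba([[20, 6]]))    # ...and with what estimated probability?
```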


Act-On Blog

3 Critical Things Sales Can Learn From IT

If you’re in sales, there are people in your organization you want to talk to, and others you may go out of your way to avoid. You might be excited to talk to the CMO — or not so excited — based on the leads you recently worked. You might avoid the people from finance who bother you with questions, but you might enjoy conversing with your comp plan administrator — which would make sense, since many of us have a fondness for the people who hand us our checks.

However, there’s one group of people you may never even consider when you’re thinking about discussions inside your company: the CIO and IT team. Yeah, they’re important when your laptop won’t work, but do they really have anything to say that can impact your performance as a salesperson?

Well, yeah. In this day and age, they do.

If your business is on top of technological developments, you may be the beneficiary of an assortment of game-changing technologies. It behooves you to know what’s coming, when, and how well your team is preparing for it.

Imagine that you’re a Formula 1 race driver, and the CIO is the crew chief who’s preparing your car to be faster than any other car on the track. Few drivers would fail to engage their crews to find out the latest changes, the timing of those changes, or other things they should know. Yet sales professionals often seem to strap on new technology and blindly hit the road — and sometimes, they crash and burn.

What do you have to talk about with your technology people? Plenty.

The Infrastructure for Your Artificial Intelligence

Artificial intelligence is coming — not to replace you, but to serve as an assistant to help you be the best salesperson you can be.

With AI, you don’t just flip the switch and start getting suggestions. AI must be trained — quite literally — using a very large data set. From there, AI will need a system of storage for the data it generates and the data generated by transactions — in other words, unstructured data about sales performance.

As a salesperson, you should be interested in how this data is stored and managed, since a failure in that regard makes your organization effectively blind — and it turns AI into a liability instead of an asset.

Ask your CIO whether your company is investing in object-based storage. Object storage has the advantage of being limitlessly scalable, and it can integrate new nodes automatically into a single storage namespace.

That means that IT can provision capacity in response to demand rather than trying to predict storage needs and provision capacity that sits unused. The worst-case scenario is that IT fails to provision enough capacity and fails to record a portion of your critical data, which would mean that AI would have to work with a partial picture of the data as it tries to refine its suggestions. That’s a guarantee that AI will never be as intelligent as you need it to be to help close deals.
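For the curious, here’s roughly what “a single storage namespace” looks like from the application side. This is a minimal sketch using boto3, which speaks the S3 API that many object storage products implement; the endpoint, bucket name, and event fields are hypothetical.

```python
# Minimal sketch: landing sales-event data in an S3-compatible object store.
# The endpoint, bucket, and event fields are hypothetical; boto3 speaks the
# S3 API, which many object storage products implement.
import json
import boto3

s3 = boto3.client("s3", endpoint_url="https://objects.example.internal")

event = {"rep": "jdoe", "account": "ACME", "stage": "proposal", "amount": 125000}

# Every object lives in one flat namespace; capacity grows by adding nodes,
# so IT can provision in response to demand instead of guessing up front.
s3.put_object(
    Bucket="sales-ai-training-data",
    Key="events/2018/01/event-000123.json",
    Body=json.dumps(event).encode("utf-8"),
)
```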

Suggestion: Get familiar with how well IT is planning to cope with the data tsunami. That is a good indicator of how effective emerging technology will be at elevating your sales game — and it might suggest that it’s time to look for opportunities at organizations with more technology foresight.

The Depth of Your Internet of Things Offering

While the IoT promises big benefits to buyers and sellers, it also requires an enormous amount of trust. Allowing a vendor to install systems that deliver a constant data stream about your business operations is not something to be agreed to lightly — it requires the vendor to have all of its policies nailed down, and to hold all of its employees to an extremely high degree of integrity. If you’re in sales, you have to sell this relationship as part of the deal.

IT needs to think through data policies thoroughly, and it needs to make sure that activities triggered by data in an IoT relationship completely map to the business process. It’s not enough to send replacement parts automatically or schedule maintenance proactively. The behind-the-scenes activities around contracts, invoicing, commissions and quote generation need to be hooked into this system, too.

Suggestion: Talk to IT to see whether the background activities needed to deliver IoT are part of IT’s plans, and get a timeline for the complete integration of business processes into the IoT infrastructure.

You may be pleasantly surprised by a comprehensive plan that lays the groundwork for great customer experiences and lucrative, long-term customer lifecycles, and that is something you can sell with complete confidence. On the other hand, you may be chagrined to find an IoT infrastructure that’s only partially baked, which could result in sales that turn into contractual nightmares and angry buyers in the near future.

Can You Actually Analyze Your Data?

For all the excitement over the last five years about sales analytics, it’s still often a battle to put the right data sets together to find actionable insights. Data from disparate cloud applications may not be easy to use for a number of reasons. There might be API mismatches between cloud providers. Also, different sales support systems may not work together. Finally, the unmanaged deployment of cloud applications for sales and marketing can create silos where data can be hidden.

That means that complete analysis can require a lot of work preparing the data before any numbers can be run. That work takes IT time — time the team often doesn’t have — and waiting for IT can result in stale analysis with less impact than it ought to have had.
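A tiny, hypothetical example of that prep work, in Python with pandas: two cloud apps describe the same accounts with mismatched keys and field names, and they have to be reconciled before any analysis can run.

```python
# Hypothetical sketch of the prep work: two cloud apps describe the same
# accounts with mismatched keys and field names, so they must be
# reconciled before any analysis can run.
import pandas as pd

crm = pd.DataFrame({"AccountId": ["0015", "0022"],
                    "Name": ["ACME Corp", "Globex"],
                    "Stage": ["Proposal", "Closed Won"]})
marketing = pd.DataFrame({"account_ref": ["15", "22"],
                          "company": ["ACME CORP", "GLOBEX"],
                          "email_opens_30d": [14, 3]})

# Normalize the join keys, then merge the two sources into one view.
crm["key"] = crm["AccountId"].str.lstrip("0")
marketing["key"] = marketing["account_ref"]
joined = crm.merge(marketing, on="key", how="left")
print(joined[["Name", "Stage", "email_opens_30d"]])
```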

Suggestion: Find out from IT how connected your various data sources actually are. That will help you understand what you know and what you could know, and it will put you in a much better place to make suggestions about next steps for data integrations.

Also, learn from IT how new cloud-based applications, deployed without any thought about integration, affect analysis. That will help remind you that the IT team should be part of any cloud application decision if you want to maximize data’s impact on sales results.


Chris Bucholtz has been an ECT News Network columnist since 2009. His focus is on CRM, sales and marketing software, and the interface between people and technology. A noted speaker and author, Chris has covered the CRM space for 10 years.
Email Chris.


CRM Buyer


5 Things Everyone Should Know about Mainframes

What comes to mind when you hear the word mainframe? If it’s a computer from the 1950s, you probably don’t know as much about mainframes as you could. Keep reading for five little-known mainframe facts.


Whether or not you work with the mainframe, we’ve got 5 mainframe facts that everyone should know!

If you work in the mainframe industry or have a strong background in technology, these mainframe facts may not be little-known to you. But to the public at large, they are.

1. Mainframes are the Size of a Fridge

The mainframes that filled entire rooms went out of style decades ago.

Today’s mainframes are about the size of a refrigerator. That makes them somewhat larger than your average server, sure, and certainly bigger than your personal computer.

But a fridge is certainly smaller than the gigantic mainframes of old.

2. Mainframes Can Run Windows

You may know that mainframes can run Linux in addition to native mainframe operating systems, such as z/OS. But did you know mainframes support Microsoft Windows, too?

Well, OK – not all mainframes can run Windows. And they can’t run just any version of Windows.

But Windows is supported on certain mainframes. That makes a lot of sense if you think about it. Mainframes are powerful machines that can be more cost-efficient to maintain than a data center’s worth of servers. If you can run Windows workloads on your mainframes in addition to Linux and z/OS, then you can use them to consolidate your entire data center.


3. Mainframes Host about 70 Percent of the World’s Data

Ever wonder where all the big data you hear so much about actually lives?

In many cases, it’s on a mainframe. Mainframes store about 70 percent of the total data in the world.

If you think about which industries still rely heavily on mainframes, this makes sense. Banking, insurance, retail and the like are industries that produce a lot of data. They’re also ones that tend to be powered by mainframe computers.

4. COBOL Remains in High Demand

Industries that rely heavily on mainframes also depend on apps written in COBOL, the venerable mainframe programming language.

Almost no one learns COBOL these days. You’d be hard-pressed to find a college computer science program that includes instruction in it.

Yet the fact is that COBOL skills are still in demand. And precisely because younger COBOL programmers are in short supply, businesses that depend on COBOL are willing to pay top dollar for programmers who can provide COBOL expertise.

5. A Single Mainframe Equals Hundreds of Regular Servers

How many regular, x86-based servers does it take to equal the computing power of a mainframe? About 1,500 – if you’re talking about IBM’s System z10 mainframe, which was released in 2008.

That data is a little dated, but it still provides a sense of the enormous computing power housed in a single mainframe.

Conclusion

Now you know what modern mainframes really look like and can do – and how much they have changed since the days of the first mainframes.

The malevolent mainframes depicted in films like Alphaville never came to pass, but mainframes certainly have grown much smarter, leaner, meaner and more powerful over the decades.

Learn more about the five key mainframe trends to watch in the coming year by reading about the results of our fourth annual survey report: State of Mainframe for 2018.


Syncsort + Trillium Software Blog

Trending Now: The Internet of Things – From Big Idea to Big Data and Big Risk

It’s that time of year that finds many of us perusing annual “what to buy” lists for holiday gift ideas. As we review these editorial picks across a spectrum of recipients, it’s a reminder of how the Internet of Things (IoT) has become so embedded in our everyday experiences.

A recent article on Engineering.com points to the astonishing growth of the Internet of Things, as IoT devices have now outpaced humans. “Research company Gartner forecasts that we’ll see 8.4 billion IoT devices in use in 2017, one for each of the 7.5 billion people currently inhabiting the planet (with almost a billion to spare).” And this is really just the beginning. That number is expected to more than double by 2020.

An Internet of Things Timeline

Although the topic goes back decades, our conversation around the Internet of Things (IoT) began in 2014 with a blog post that correctly predicted that Big Data was about to get even bigger. And although the Nest thermostat premiered in 2011, it was Google’s 2014 acquisition of Nest for over $3 billion that signaled much more was to come.

Last year at this time we reviewed the actual adoption rates of IoT. One interesting statistic from that post: “A summer 2016 survey of retailers and logistics providers reported that 64 percent were already using IoT technology.”


IoT’s Connection to Artificial Intelligence

Building and developing for the IoT is more than just programming and empowering devices with interactive features. Data from IoT sensors is also strongly tied to artificial intelligence. These devices gather information about their users, tracking their habits and preferences. The devices then leverage those preferences to smartly provide necessary services according to their users’ data-generated lifestyles.

The Internet of Things = More Big Data

The IoT and Big Data did not spring from the same shoot. They are distinct disciplines. But if you look at the history of each of these fields, you notice that they have some interesting features in common. Read more in our post The IoT-Big Data Convergence.


During our interview with Jim Cahill of Emerson Automation Solutions, he remarked on how Industrial Internet of Things technology is opening up new business models and new ways for manufacturers and suppliers to interact. For example, equipment such as sensing devices might be leased, with data from those devices (very much Big Data) sent to robust analytical software and reviewed by experts to spot abnormal or inefficient conditions early.

As Decideo’s Philippe Nieuwbourg describes in an interview earlier this year, “IoT is fantastic. Humans can generate data, but they have limits that sensors don’t have. Objects, sensors, can generate thousands of data every minute. It’s an inexhaustible data source.”

Collecting and analyzing data from sensors, smart devices and other data sources on the Internet of Things (IoT) will become increasingly important as the IoT expands. Kaa was featured in our Beyond Hadoop: 5 Open Source Big Data Projects You May Have Missed post earlier this year. Its open source platform aims to give developers one-stop shopping for writing software for IoT applications.


Increased Security Risks

As the legendary words of Uncle Ben forewarn, “With great power comes great responsibility.” With more networked devices, there are more opportunities for hackers to attack. The expansion of the IoT is breeding a whole new generation of security risks to critical infrastructure.

Security was one of our 6 Things IT Needs to Know About the IoT and also named as one of the 4 Skills You Need to Delve into the Internet of Things.

In our interview with Robert Corace of SoftServe, he discussed security being a chief concern and top challenge for his clients, but worth the risk. “I wouldn’t describe these as purely challenges, though, as these companies also stand to gain a lot. Digital asset management, Cloud computing, mobile technologies, and the Internet of Things (IoT) approached as a part of digital transformation efforts can bring a lot of benefits to consumer facing operations, retail, the finance and banking sector, and many others.”

Syncsort’s Big Data solutions offer ways to help you harness the power of the Internet of Things. Read the whitepaper Accessing and Integrating Mainframe Application Data with Hadoop and Spark to learn about the architecture and technical capabilities that make Syncsort DMX-h the best solution for accessing the most complex application data from mainframes and integrating that data using Hadoop.


Syncsort + Trillium Software Blog

Five Things I Learned From “The Last Jedi” Trailer

“Innovation distinguishes between a leader and a follower.” – Steve Jobs

As a part of the last wave of Millennials joining the workforce, I have been inspired by Jobs’ definition of innovation. For years, Millennials like me have been told that we need to be faster, better, and smarter than our peers. With this thought in mind and the endless possibilities of the Internet, it’s easy to see that the digital economy is here, and it is defining my generation.

Lately we’ve all read articles proclaiming that “the digital economy and the economy are becoming one and the same. The lines are being blurred.” While this may be true, Millennials do not see this distinction. To us, it’s just the economy. Everything we do happens in the digital economy – we shop digitally, get our news digitally, communicate digitally, and take pictures digitally. In fact, the things that we don’t do digitally are few and far between.

Millennial disruption: How to get our attention in the digital economy

In this fast-moving, highly technical era, innovation and technology are ubiquitous, forcing companies to deliver immediate value to consumers. This principle is ingrained in us – it’s stark reality. One day, a brand is a world leader, promising incredible change. Then just a few weeks later, it disappears. Millennials view leaders of the emerging (digital) economy as scrappy, agile, and comfortable making decisions that disrupt the norm, and that may or may not pan out.

What does it take to earn the attention of Millennials? Here are three things you should consider:

1. Millennials appreciate innovations that reinvent product delivery and service to make life better and simpler.

Uber, Vimeo, ASOS, and Apple are some of the most successful disruptors in the current digital economy. Why? They took an already mature market and used technology to make valuable connections with their Millennial customers. These companies did not invent a new product – they reinvented the way business is done within the economy. They knew what their consumers wanted before they realized it.

Millennials thrive on these companies. In fact, we seek them out and expect them to create rapid, digital changes to our daily lives. We want to use the products they developed. We adapt quickly to the changes powered by their new ideas or technologies. With that being said, it’s not astonishing that Millennials feel the need to connect regularly and digitally.

2. It’s not technology that captures us – it’s the simplicity that technology enables.

Recently, McKinsey & Company revealed that “CEOs expect 15%–50% of their companies’ future earnings to come from disruptive technology.” Considering this statistic, it may come as a surprise to these executives that buzzwords – including cloud, diversity, innovation, the Internet of Things, and the future of work – do not resonate with us. Sure, we were raised on these terms, but they are such a part of our culture that we do not think about them. We expect companies to have this technology deeply embedded by now.

What we really crave is technology-enabled simplicity in every aspect of our lives. If something is too complicated to navigate, most of us stop using the product. And why not? It does not add value if we cannot use it immediately.

Many experts claim that this is unique to Millennials, but it truly isn’t. It might just be more obvious and prevalent with us. Some might translate our never-ending desire for simplicity into laziness. Yet striving to make daily activities simpler with the use of technology has been seen throughout history. Millennials just happen to be the first generation to be completely reliant on technology, simplicity, and digitally powered “personal” connections.

3. Millennials keep an eye on where and how the next technology revolution will begin.

Within the next few years, Millennials will be the largest generation in the workforce. As a result, the onslaught of coverage on the evolution of technology will most likely be phased out. While the history of technology is significant for our predecessors, it is not an overly important story for Millennials, because we have not seen the technology evolution ourselves. For us, the digital revolution is a fact of life.

Companies like SAP, Amazon, and Apple did not invent the wheel. Rather, they were able to create a new digital future. For a company to be successful, senior leaders must demonstrate a talent for R&D genius as well as fortune-telling. They need to develop easy-to-use, brilliantly designed products, market them effectively to the masses, and keep those products elite. It’s not easy, but the companies that upend an entire industry are the ones successfully balancing these tasks.

Disruption can happen anywhere and at any time. Get ready!

Across every industry, big players are threatened — not only by well-known competitors, but by small teams sitting in a garage drafting new ideas that could turn the market upside down. In reality, anyone, anywhere, at any time can cause disruption and bring an idea to life.

Take my employer SAP, for example. With the creation of SAP S/4HANA, we are disrupting the tech market as we help our customers engage in digital transformation. By removing the need for separate data warehousing and enabling real-time operations, companies are reimagining their future. Organizations such as La Trobe University, the NFL, and Adidas have made it easy to understand and conceptualize the effects of using data in real time. But only time will tell whether Millennials will ever realize how much disruption was needed to get where we are today.

Find out how SAP Services & Support can help you minimize the impact of disruption and maximize the success of your business. Read SAP S/4HANA customer success stories, visit the SAP Services HUB, or visit the customer testimonial page on SAP.com.



Digitalist Magazine

4 Sexy Things You Need to Know About Machine Learning

I consider myself a sensible person with good time management skills, so it’s always beguiling to get to the bottom of a news story on a respectable site and be presented with an array of clickbait with alluring titles like “20 ways to stop hangovers your doctor won’t tell you” (made up by me) and “Architects took this water park too far” (real, and no they didn’t – it’s just a big slide).

It got me wondering what a clickbait article for machine learning might look like – and then I realised that many machine learning terms are pure clickbait without any augmentation. Test yourself. Would you click on these headlines?

Sensational Details of Model Over-Training in Action


No, not that sort of model!

Legendary running coach Arthur Lydiard famously taught “Train don’t strain” and it is important to avoid the temptation to over-train or “overfit” a machine learning model.

It can be intuitive to train a model to the point where it exactly fits with the training data, but that will actually make it less predictive. In real terms, if you are building a supervised machine learning model to try to detect fraud, it is important to avoid the temptation to tweak it until it finds every tagged fraud in your data set.

You want your model to be effective on transactions it hasn’t seen. Overtraining gives too much focus on the noise in a particular sampled data set and not enough on the pattern you are trying to detect.

Knowing how far to train a model requires skill and experience. Done right it produces fantastic models which deliver great results both on paper and in production when they’re truly tested. Be wary of systems that advise retraining models all the time. They’re almost guaranteed to learn unimportant and irrelevant patterns and block swathes of good activity.
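To make the over-training trap concrete, here’s a minimal sketch in Python using scikit-learn on synthetic data. As the model is allowed to fit the training set more exactly, its accuracy on data it hasn’t seen stops improving and starts to slip.

```python
# Synthetic demo of over-training: as the tree is allowed to fit the
# training set more exactly, training accuracy climbs toward 100% while
# accuracy on held-out data stalls and then slips.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=2000, n_features=20, flip_y=0.1,
                           random_state=0)  # flip_y adds label noise
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

for depth in (2, 5, 10, None):  # None = grow until every leaf is pure
    tree = DecisionTreeClassifier(max_depth=depth, random_state=0)
    tree.fit(X_train, y_train)
    print(depth,
          round(tree.score(X_train, y_train), 3),   # fit to what it has seen
          round(tree.score(X_test, y_test), 3))     # fit to what it hasn't
```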

You Won’t Believe Details of these Unsupervised Relationships


No, not that sort of relationship!

One area of machine learning we at FICO find particularly exciting, and have been researching for more than a decade, is unsupervised techniques. They help solve a range of problems where prior “tagged” data is not available, or where algorithms need to find new patterns, sometimes in data that an algorithm or model might never have seen before.

Unsupervised techniques such as multi-layer self-calibrating outlier models — used in FICO’s Cyber Security solution — give enterprises the tools to monitor their networks and detect hacking attempts before data or IP are compromised. And if a compromise has happened, unsupervised techniques help protect the organisations that are then assaulted with the stolen identities.

PSD2 — arriving over the next two years — is another great opportunity, particularly while there is a lack of training data.
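To illustrate the unsupervised idea — scoring events with no tagged examples at all — here’s a generic sketch using an isolation forest on made-up transaction data. To be clear, this is not FICO’s multi-layer self-calibrating approach; it’s just the simplest member of the same family.

```python
# Generic unsupervised outlier detection on made-up transaction data:
# no tagged fraud at all, just "which events look least like the rest?"
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)
normal = rng.normal(loc=[50.0, 2.0], scale=[10.0, 1.0], size=(500, 2))
odd = np.array([[900.0, 0.01], [750.0, 0.02]])   # a couple of unusual events
events = np.vstack([normal, odd])                # columns: amount, hours-gap

detector = IsolationForest(contamination=0.01, random_state=0).fit(events)
scores = detector.decision_function(events)      # lower = more anomalous
print(events[np.argsort(scores)[:3]])            # the most outlying events
```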

The Secret Reality of Living in a Random Forest


No, not that sort of forest!

There’s a lot of hype specifically about the efficacy of random forest models. These models work by growing a forest of slightly different decision trees and combining their votes, then checking how well the ensemble responds to your training data.

Whole academic theses (printed on actual forests) have and will be written about how good or bad the random forest approach is compared to other machine learning models. It has emerged as a popular technique — maybe partly because it is easier to comprehend than some of the other methods.

One problem is that random forests are inherently not explainable, and that presents a real challenge with the introduction of GDPR next year. The legislation dictates, among other things, that enterprises must be able to explain their decisions. FICO has a number of approaches for adding explainability to random forest and other machine learning models.

There is no single machine learning approach that will solve all problems, so it’s important to choose the right technique or set of techniques. Explainability is and will be important in many applications. Other considerations are speed, computational processing needs, storage needs and maintainability. FICO has experience of deploying machine learning at scale with applications such as FICO Falcon Fraud Manager capable of processing thousands of transactions a second with 10-20 ms latency.
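As a small illustration of both points — the forest-of-trees idea and the explainability gap — here’s a generic scikit-learn sketch (not one of FICO’s explainability methods) that trains a random forest and prints its global feature importances. Importances are a crude aid: they describe the model overall, not why one particular decision was made, which is what a GDPR-style explanation needs.

```python
# A random forest is a vote over many slightly different decision trees.
# Global feature importances are one crude explainability aid -- they
# describe the model overall, not why one particular case was scored the
# way it was, which is what a GDPR-style explanation needs.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier

data = load_breast_cancer()
forest = RandomForestClassifier(n_estimators=200, random_state=0)
forest.fit(data.data, data.target)

ranked = sorted(zip(forest.feature_importances_, data.feature_names),
                reverse=True)
for importance, name in ranked[:5]:
    print(f"{name}: {importance:.3f}")
```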

The Sexy Features They Didn’t Want You to See


No, not those sorts of features!

Features are the lifeblood of predictive machine learning. They’re the inputs the algorithm learns from. When designing a model, the data scientist identifies features which might help solve the problem at hand, and then they test that theory — again and again. Features can be simple, like the amount of a transaction, or they can be complex, such as composite calculations of network and device information.

Andrew Ng put it powerfully: “Coming up with features is difficult, time-consuming, requires expert knowledge. ‘Applied machine learning’ is basically feature engineering.”
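Here’s a toy example of that feature engineering in Python with pandas, on hypothetical card transactions: the raw amount is a simple feature, while the ratio to the card’s running average and the time since the previous transaction are composite ones.

```python
# Feature engineering sketch on hypothetical card transactions: the raw
# amount is a simple feature; the ratio to the card's running average and
# the time since the previous transaction are composite ones.
import pandas as pd

tx = pd.DataFrame({
    "card_id": ["A", "A", "A", "B", "B"],
    "amount": [25.0, 30.0, 480.0, 60.0, 62.0],
    "timestamp": pd.to_datetime([
        "2018-01-01 09:00", "2018-01-03 12:30", "2018-01-03 12:31",
        "2018-01-02 10:00", "2018-01-05 18:00"]),
}).sort_values(["card_id", "timestamp"])

grp = tx.groupby("card_id")
tx["amount_vs_mean"] = tx["amount"] / grp["amount"].transform(
    lambda s: s.expanding().mean())
tx["secs_since_prev"] = grp["timestamp"].diff().dt.total_seconds()
print(tx)
```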

So maybe features aren’t sexy, but machine learning and data science definitely are. The Harvard Business Review thinks it’s the sexiest job of the 21st century. The reason for that is that machine learning is transformational and the data scientists who are part of the transformation will be those that combine domain expertise, precision and determination with creativity and originality.


FICO

3 Things a Well-Architected Salesforce Integration Can Do for Your Organization


Did you know that with a great Salesforce integration you can automate business processes, cut down on human error, and get a full view of your customers? All you need is a well-integrated Salesforce system that enables data sharing across your entire network.

The reality is, not all integration approaches are created equal. Some only enable data to be moved in one direction between systems. Some appear easy at the outset, but require considerable coding. Others only offer point-to-point integration and can only tie one system to another. And still others don’t act as a platform where data can be easily shared across the organization.

Businesses that choose the right integration solution will be rewarded with speedier, cost-effective integrations that can create unique competitive advantages to put them ahead of the curve. In our new whitepaper “Nine Strategies for a Successful Salesforce Integration”, we’ll walk you through how to pick the best integration solution for your company.

The great thing is, you don’t have to be a technical expert to understand and execute Salesforce integration. As long as you can grasp the concept of information moving from one system to another, you’ve got it.

In today’s digital world, customers, employees, and partners expect anytime, anywhere communications, instant responses, and up-to-date/real-time information. The battle for business will be won by being able to support business requests faster and better. To surpass the competition, enable your systems to work together at top speed. Read our paper to get started today.


The TIBCO Blog

The U.S. government should be making better use of the internet of things


The U.S. government acknowledges that the Internet of Things (IoT) has advanced past the research and development stage. However, even with this acknowledgement, there is still a gap between how the public sector is connecting to the IoT and how it should be connecting.

Imagine if agencies could arm flood zones with sensors able to detect rising water levels and provide early warning to citizens. Or deploy sensors on highways that alert the necessary authorities when they are iced over. Or connect irrigation systems in national parks to smartphone applications that stream scheduling and maintenance data.

Policy makers, for their part, should be taking steps to update security, data governance, and regulations that enable the swift, full adoption of IoT and associated technology.

However, while some agencies, like NIST, already have established programs in place, IoT adoption among public sector agencies is in the very early stages due to extreme shortages in security, talent and logistics. Here are a few ways the public sector can swiftly overcome these hurdles:

Build secure endpoints and consider blockchain integration

A new bill introduced in early August by Senator Mark Warner (D-Virginia) and Senator Cory Gardner (R-Colorado) attempts to set standards for both public and private sector IoT-linked devices. Meanwhile, government technologists are taking extra time to explore secure hardware options and survey connectable platforms, from the Pentagon to the famously breached Office of Personnel Management. For good reason, the public sector must take a security-first mindset when designing solutions that employ the IoT. We all remember the botnet that broke the Internet, so we must design systems and solutions with military-grade cyber-defense capabilities.

To fully address IoT security concerns, as well as ensure completeness and integrity amongst entities, agencies need to be thinking about blockchain integration. Just as blockchain can be used in the healthcare sector to track medical records securely and in real-time so that everyone involved in a patient’s care has the same accurate view of the patient’s situation, government agencies that famously manage data records, like OPM, could use blockchain to avoid future breaches.
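For readers wondering what blockchain actually contributes here, the heart of it is tamper evidence: each record embeds a cryptographic hash of the one before it, so silently editing history breaks the chain. The Python sketch below shows only that integrity mechanism — a real deployment adds distribution and consensus on top, and the record fields are invented.

```python
# The integrity mechanism at the heart of blockchain record-keeping: each
# record embeds a hash of the previous one, so silently editing history
# breaks the chain. Record fields are invented; a real deployment adds
# distribution and consensus on top of this.
import hashlib
import json

def add_record(chain, payload):
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    body = {"payload": payload, "prev_hash": prev_hash}
    body["hash"] = hashlib.sha256(
        json.dumps(body, sort_keys=True).encode()).hexdigest()
    chain.append(body)

def verify(chain):
    for i, rec in enumerate(chain):
        body = {"payload": rec["payload"], "prev_hash": rec["prev_hash"]}
        digest = hashlib.sha256(
            json.dumps(body, sort_keys=True).encode()).hexdigest()
        if rec["hash"] != digest:
            return False
        if i and rec["prev_hash"] != chain[i - 1]["hash"]:
            return False
    return True

records = []
add_record(records, {"employee": "1001", "action": "clearance-updated"})
add_record(records, {"employee": "1002", "action": "record-accessed"})
print(verify(records))                       # True
records[0]["payload"]["action"] = "deleted"  # tamper with history...
print(verify(records))                       # ...and the chain fails: False
```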

Recruit talent that is familiar with the technology

The public sector needs to complement IoT research and development with targeted talent recruitment. Specifically, agencies need to develop IoT-centered programs and incentives that attract data scientists to analyze trends and guide implementation, UX teams that can match a complex network with a seamless experience for people working with data, and field service professionals who can keep tabs on the performance of sensors attached to millions of points in the public sector’s network.

In practice, a typical IoT deployment requires engineers who install and activate sensors in warehouses, farms, and vehicles. McKinsey points out that there are also many sensors in place that could be connected to the IoT but are inactive. As a first step, the public sector will need to onboard trained engineers who can take stock of the government’s connectivity potential. Once the sensors are deployed, data from the sensors needs to be collected and analyzed in a central data repository by software developers and data scientists. Finally, the results of the data analysis need to be presented to government stakeholders so they can take action on whatever the IoT devices are telling them. Hence, the need for user experience experts.
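In code, the middle of that pipeline is unglamorous but essential: readings from fielded sensors land in one central repository that analysts and alerting jobs can query. A hypothetical sketch, with SQLite standing in for the central store:

```python
# Sketch of the middle of that pipeline: fielded sensors report readings
# into one central repository that analysts and alerting jobs can query.
# Sensor names and schema are hypothetical; SQLite stands in for the store.
import random
import sqlite3
import time

db = sqlite3.connect("iot_repository.db")
db.execute("""CREATE TABLE IF NOT EXISTS readings
              (sensor_id TEXT, metric TEXT, value REAL, ts REAL)""")

def ingest(sensor_id, metric, value):
    db.execute("INSERT INTO readings VALUES (?, ?, ?, ?)",
               (sensor_id, metric, value, time.time()))
    db.commit()

# Simulated flood-zone sensors reporting water level.
for sensor in ("flood-ne-01", "flood-ne-02"):
    ingest(sensor, "water_level_cm", random.uniform(10, 30))

# A downstream job checks the latest picture per sensor.
print(db.execute("SELECT sensor_id, MAX(value) FROM readings "
                 "GROUP BY sensor_id").fetchall())
```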

Lay the foundation for a connected public sector

Large-scale government IoT projects will require significant Other Direct Costs (ODC) to procure sensors and activate (or reactivate) infrastructure. And current regulations pertaining to data capture and information assurance need to be refocused to provide firm controls and clear guidelines for IoT deployment. Agencies also need to recruit – and retain – the types of staff I’ve mentioned above.

Despite these roadblocks, the benefits of public sector IoT are substantial, particularly as it relates to emergency management and planning.

IoT, when implemented correctly, has the power, at all levels of government, to solve problems before they even happen.

Kris Tremaine is a Senior Vice President at ICF, leading the firm’s federal digital practice. She has 25 years of experience providing strategic and digital counsel to government, nonprofit, and private sector organizations.


Big Data – VentureBeat

Robots And AI In Retail: 8 Things You Must Know

Last August, a woman arrived at a Reno, Nevada, hospital and told the attending doctors that she had recently returned from an extended trip to India, where she had broken her right thighbone two years earlier. The woman, who was in her 70s, had subsequently developed an infection in her thigh and hip for which she was hospitalized in India several times. The Reno doctors recognized that the infection was serious—and the visit to India, where antibiotic-resistant bacteria run rampant, raised red flags.

When none of the 14 antibiotics the physicians used to treat the woman worked, they sent a sample of the bacterium to the U.S. Centers for Disease Control (CDC) for testing. The CDC confirmed the doctors’ worst fears: the woman had a class of microbe called carbapenem-resistant Enterobacteriaceae (CRE). Carbapenems are a powerful class of antibiotics used as last-resort treatment for multidrug-resistant infections. The CDC further found that, in this patient’s case, the pathogen was impervious to all 26 antibiotics approved by the U.S. Food and Drug Administration (FDA).

In other words, there was no cure.

This is just the latest alarming development signaling the end of the road for antibiotics as we know them. In September, the woman died from septic shock, in which an infection takes over and shuts down the body’s systems, according to the CDC’s Morbidity and Mortality Weekly Report.

Other antibiotic options, had they been available, might have saved the Nevada woman. But the solution to the larger problem won’t be a new drug. It will have to be an entirely new approach to the diagnosis of infectious disease, to the use of antibiotics, and to the monitoring of antimicrobial resistance (AMR)—all enabled by new technology.

But that new technology is not being implemented fast enough to prevent what former CDC director Tom Frieden has nicknamed “nightmare bacteria.” And the nightmare is becoming scarier by the year. A 2014 British study calculated that 700,000 people die globally each year because of AMR. By 2050, the global cost of antibiotic resistance could grow to 10 million deaths and US$100 trillion a year, according to a 2014 estimate. And the rate of AMR is growing exponentially, thanks to the speed with which humans serving as hosts for these nasty bugs can move among healthcare facilities—or countries. In the United States, for example, CRE had been seen only in North Carolina in 2000; today it’s nationwide.

Abuse and overuse of antibiotics in healthcare and livestock production have enabled bacteria to both mutate and acquire resistant genes from other organisms, resulting in truly pan-drug resistant organisms. As ever-more powerful superbugs continue to proliferate, we are potentially facing the deadliest and most costly human-made catastrophe in modern times.

“Without urgent, coordinated action by many stakeholders, the world is headed for a post-antibiotic era, in which common infections and minor injuries which have been treatable for decades can once again kill,” said Dr. Keiji Fukuda, assistant director-general for health security for the World Health Organization (WHO).

Even if new antibiotics could solve the problem, there are obstacles to their development. For one thing, antibiotics have complex molecular structures, which slows the discovery process. Further, they aren’t terribly lucrative for pharmaceutical manufacturers: public health concerns call for new antimicrobials to be financially accessible to patients and used conservatively precisely because of the AMR issue, which reduces the financial incentives to create new compounds. The last entirely new class of antibiotic was introduced 30 years ago. Finally, bacteria will develop resistance to new antibiotics as well if we don’t adopt new approaches to using them.

Technology can play the lead role in heading off this disaster. Vast amounts of data from multiple sources are required for better decision making at all points in the process, from tracking or predicting antibiotic-resistant disease outbreaks to speeding the potential discovery of new antibiotic compounds. However, microbes will quickly adapt and resist new medications, too, if we don’t also employ systems that help doctors diagnose and treat infection in a more targeted and judicious way.

Indeed, digital tools can help in all four actions that the CDC recommends for combating AMR: preventing infections and their spread, tracking resistance patterns, improving antibiotic use, and developing new diagnostics and treatment.

Meanwhile, individuals who understand both the complexities of AMR and the value of technologies like machine learning, human-computer interaction (HCI), and mobile applications are working to develop and advocate for solutions that could save millions of lives.


Keeping an Eye Out for Outbreaks

Like others who are leading the fight against AMR, Dr. Steven Solomon has no illusions about the difficulty of the challenge. “It is the single most complex problem in all of medicine and public health—far outpacing the complexity and the difficulty of any other problem that we face,” says Solomon, who is a global health consultant and former director of the CDC’s Office of Antimicrobial Resistance.

Solomon wants to take the battle against AMR beyond the laboratory. In his view, surveillance—tracking and analyzing various data on AMR—is critical, particularly given how quickly and widely it spreads. But surveillance efforts are currently fraught with shortcomings. The available data is fragmented and often not comparable. Hospitals fail to collect the representative samples necessary for surveillance analytics, collecting data only on those patients who experience resistance and not on those who get better. Laboratories use a wide variety of testing methods, and reporting is not always consistent or complete.

Surveillance can serve as an early warning system. But weaknesses in these systems have caused public health officials to consistently underestimate the impact of AMR in loss of lives and financial costs. That’s why improving surveillance must be a top priority, says Solomon, who previously served as chair of the U.S. Federal Interagency Task Force on AMR and has been tracking the advance of AMR since he joined the U.S. Public Health Service in 1981.

A Collaborative Diagnosis

Ineffective surveillance has also contributed to huge growth in the use of antibiotics when they aren’t warranted. Strong patient demand and financial incentives for prescribing physicians are blamed for antibiotics abuse in China. India has become the largest consumer of antibiotics on the planet, in part because they are prescribed or sold for diarrheal diseases and upper respiratory infections for which they have limited value. And many countries allow individuals to purchase antibiotics over the counter, exacerbating misuse and overuse.

In the United States, antibiotics are improperly prescribed 50% of the time, according to CDC estimates. One study of adult patients visiting U.S. doctors to treat respiratory problems found that more than two-thirds of antibiotics were prescribed for conditions that were not infections at all or for infections caused by viruses—for which an antibiotic would do nothing. That’s 27 million courses of antibiotics wasted a year—just for respiratory problems—in the United States alone.

And even in countries where there are national guidelines for prescribing antibiotics, those guidelines aren’t always followed. A study published in medical journal Family Practice showed that Swedish doctors, both those trained in Sweden and those trained abroad, inconsistently followed rules for prescribing antibiotics.

Solomon strongly believes that, worldwide, doctors need to expand their use of technology in their offices or at the bedside to guide them through a more rational approach to antibiotic use. Doctors have traditionally been reluctant to adopt digital technologies, but Solomon thinks that the AMR crisis could change that. New digital tools could help doctors and hospitals integrate guidelines for optimal antibiotic prescribing into their everyday treatment routines.

“Human-computer interactions are critical, as the amount of information available on antibiotic resistance far exceeds the ability of humans to process it,” says Solomon. “It offers the possibility of greatly enhancing the utility of computer-assisted physician order entry (CPOE), combined with clinical decision support.” Healthcare facilities could embed relevant information and protocols at the point of care, guiding the physician through diagnosis and prescription and, as a byproduct, facilitating the collection and reporting of antibiotic use.
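What might that look like in software? Here is a deliberately toy sketch of the idea: an order is checked against embedded guidelines before it is accepted, and every check is logged as a byproduct — exactly the collection-and-reporting side effect Solomon describes. The diagnoses and rules are placeholders, not clinical guidance.

```python
# Deliberately toy sketch of point-of-care decision support: an antibiotic
# order is checked against embedded guidelines, and every check is logged
# as a byproduct. Diagnoses and rules here are placeholders, not clinical
# guidance.
GUIDELINES = {
    "viral_uri": [],                                # no antibiotic indicated
    "strep_pharyngitis": ["penicillin", "amoxicillin"],
}

audit_log = []  # the reporting byproduct: what was ordered, and the verdict

def check_order(diagnosis, drug):
    recommended = GUIDELINES.get(diagnosis)
    if recommended is None:
        verdict = "no guideline on file; flag for stewardship review"
    elif not recommended:
        verdict = "antibiotic not indicated for this diagnosis"
    elif drug in recommended:
        verdict = "consistent with guideline"
    else:
        verdict = "guideline suggests: " + ", ".join(recommended)
    audit_log.append((diagnosis, drug, verdict))
    return verdict

print(check_order("viral_uri", "azithromycin"))
print(check_order("strep_pharyngitis", "penicillin"))
```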


Cincinnati Children’s Hospital’s antibiotic stewardship division has deployed a software program that gathers information from electronic medical records, order entries, computerized laboratory and pathology reports, and more. The system measures baseline antimicrobial use, dosing, duration, costs, and use patterns. It also analyzes bacteria and trends in their susceptibilities and helps with clinical decision making and prescription choices. The goal, says Dr. David Haslam, who heads the program, is to decrease the use of “big gun” super antibiotics in favor of more targeted treatment.

While this approach is not yet widespread, there is consensus that incorporating such clinical-decision support into electronic health records will help improve quality of care, contain costs, and reduce overtreatment in healthcare overall—not just in AMR. A 2013 randomized clinical trial found that doctors who used decision-support tools were significantly less likely to order antibiotics than those in the control group and prescribed 50% fewer broad-spectrum antibiotics.

Putting mobile devices into doctors’ hands could also help them accept decision support, believes Solomon. Last summer, Scotland’s National Health Service developed an antimicrobial companion app to give practitioners nationwide mobile access to clinical guidance, as well as an audit tool to support boards in gathering data for local and national use.

“The immediacy and the consistency of the input to physicians at the time of ordering antibiotics may significantly help address the problem of overprescribing in ways that less-immediate interventions have failed to do,” Solomon says. In addition, handheld devices with so-called lab-on-a-chip technology could be used to test clinical specimens at the bedside and transmit the data across cellular or satellite networks in areas where infrastructure is more limited.

Artificial intelligence (AI) and machine learning can also become invaluable technology collaborators to help doctors more precisely diagnose and treat infection. In such a system, “the physician and the AI program are really ‘co-prescribing,’” says Solomon. “The AI can handle so much more information than the physician and make recommendations that can incorporate more input on the type of infection, the patient’s physiologic status and history, and resistance patterns of recent isolates in that ward, in that hospital, and in the community.”

Speed Is Everything

Growing bacteria in a dish has never appealed to Dr. James Davis, a computational biologist with joint appointments at Argonne National Laboratory and the University of Chicago Computation Institute. The first of a growing breed of computational biologists, Davis chose a PhD advisor in 2004 who was steeped in bioinformatics technology “because you could see that things were starting to change,” he says. He was one of the first in his microbiology department to submit a completely “dry” dissertation—that is, one that was all digital with nothing grown in a lab.

Upon graduation, Davis wanted to see if it was possible to predict whether an organism would be susceptible or resistant to a given antibiotic, leading him to explore the potential of machine learning to predict AMR.


As the availability of cheap computing power has gone up and the cost of genome sequencing has gone down, it has become possible to sequence a pathogen sample in order to detect its AMR resistance mechanisms. This could allow doctors to identify the nature of an infection in minutes instead of hours or days, says Davis.

Davis is part of a team creating a giant database of bacterial genomes with AMR metadata for the Pathosystems Resource Integration Center (PATRIC), funded by the U.S. National Institute of Allergy and Infectious Diseases to collect data on priority pathogens, such as tuberculosis and gonorrhea.

Because the current inability to identify microbes quickly is one of the biggest roadblocks to making an accurate diagnosis, the team’s work is critically important. The standard method for identifying drug resistance is to take a sample from a wound, blood, or urine and expose the resident bacteria to various antibiotics. If the bacterial colony continues to divide and thrive despite the presence of a normally effective drug, it indicates resistance. The process typically takes between 16 and 20 hours, itself an inordinate amount of time in matters of life and death. For certain strains of antibiotic-resistant tuberculosis, though, such testing can take a week. While physicians are waiting for test results, they often prescribe broad-spectrum antibiotics or make a best guess about what drug will work based on their knowledge of what’s happening in their hospital, “and in the meantime, you either get better,” says Davis, “or you don’t.”

At PATRIC, researchers are using machine-learning classifiers to identify regions of the genome involved in antibiotic resistance that could form the foundation for a “laboratory free” process for predicting resistance. Being able to identify the genetic mechanisms of AMR and predict the behavior of bacterial pathogens without petri dishes could inform clinical decision making and improve reaction time. Thus far, the researchers have developed machine-learning classifiers for identifying antibiotic resistance in Acinetobacter baumannii (a big player in hospital-acquired infection), methicillin-resistant Staphylococcus aureus (a.k.a. MRSA, a worldwide problem), and Streptococcus pneumoniae (a leading cause of bacterial meningitis), with accuracies ranging from 88% to 99%.
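While PATRIC’s actual models are far larger, the general recipe the researchers describe can be sketched in a few lines: represent each genome by its k-mer counts and train a classifier on laboratory-confirmed labels. The genomes and labels below are synthetic stand-ins; real models are trained on thousands of isolates.

```python
# A hedged sketch of the approach described above: k-mer features plus a
# classifier predicting resistant vs. susceptible. Data here is synthetic.

from sklearn.feature_extraction.text import CountVectorizer
from sklearn.ensemble import RandomForestClassifier

genomes = [
    "ATGGCGTACGTTAGCATGGCGTACG",   # toy "resistant" genomes
    "ATGGCGTACGTTAGCTTGGCGTACG",
    "TTGACCGATCAGGACCTTGACCGAT",   # toy "susceptible" genomes
    "TTGACCGATCAGGACGTTGACCGAT",
]
labels = ["resistant", "resistant", "susceptible", "susceptible"]

# A character analyzer with ngram_range=(4, 4) yields 4-mer counts directly.
vectorizer = CountVectorizer(analyzer="char", ngram_range=(4, 4))
X = vectorizer.fit_transform(genomes)

model = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, labels)

unknown = ["ATGGCGTACGTTAGCATGGCGAACG"]
print(model.predict(vectorizer.transform(unknown)))   # e.g. ['resistant']
```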

Houston Methodist Hospital, which uses the PATRIC database, is researching multidrug-resistant bacteria, specifically MRSA. Not only does resistance increase the cost of care, but people with MRSA are 64% more likely to die than people with a nonresistant form of the infection, according to WHO. Houston Methodist is investigating the molecular genetic causes of drug resistance in MRSA in order to identify new treatment approaches and help develop novel antimicrobial agents.

The Hunt for a New Class of Antibiotics

There are antibiotic-resistant bacteria, and then there’s Clostridium difficile—a.k.a. C. difficile—a bacterium that attacks the intestines of hospital patients after antibiotic use, striking even the young and healthy.

It is because of C. difficile that Dr. L. Clifford McDonald jumped into the AMR fight. The epidemiologist was finishing his work analyzing the spread of SARS in Toronto hospitals in 2004 when he turned his attention to C. difficile, convinced that the bacteria would become more common and more deadly. He was right, and today he’s at the forefront of treating the infection and preventing the spread of AMR as senior advisor for science and integrity in the CDC’s Division of Healthcare Quality Promotion. “[AMR] is an area that we’re funding heavily…insofar as the CDC budget can fund anything heavily,” says McDonald, whose group has awarded $14 million in contracts for innovative anti-AMR approaches.

Developing new antibiotics is a major part of the AMR battle. The majority of new antibiotics developed in recent years have been variations of existing drug classes. It’s been three decades since the last new class of antibiotics was introduced. Less than 5% of venture capital in pharmaceutical R&D is focused on antimicrobial development. A 2008 study found that less than 10% of the 167 antibiotics in development at the time had a new “mechanism of action” to deal with multidrug resistance. “The low-hanging fruit [of antibiotic development] has been picked,” noted a WHO report.

Researchers will have to dig much deeper to develop novel medicines. Machine learning could help drug developers sort through much larger data sets and go about the capital-intensive drug development process in a more prescriptive fashion, synthesizing those molecules most likely to have an impact.
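As a speculative sketch of what “more prescriptive” could mean in practice, the example below trains a model on known actives and inactives and scores untested candidates so that only the top-ranked ones are synthesized first. The compound names, descriptors, and data are all invented; real work uses molecular fingerprints and far larger training sets.

```python
# A speculative sketch: score candidate compounds with a model trained on
# known actives/inactives, and prioritize synthesis accordingly.

from sklearn.linear_model import LogisticRegression

# Each row: toy descriptors (molecular weight / 500, logP / 5, ring count / 5)
known = [[0.6, 0.4, 0.4], [0.7, 0.5, 0.6], [0.3, 0.9, 0.2], [0.9, 0.1, 0.8]]
active = [1, 1, 0, 0]                      # 1 = showed antibacterial activity

model = LogisticRegression().fit(known, active)

candidates = {"cmpd-17": [0.65, 0.45, 0.5], "cmpd-42": [0.35, 0.85, 0.3]}
scores = {name: model.predict_proba([feats])[0][1]
          for name, feats in candidates.items()}
for name, p in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(f"{name}: predicted activity {p:.2f}")   # synthesize the top hits first
```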

McDonald believes that it will become easier to find new antibiotics if we gain a better understanding of the communities of bacteria living in each of us—as many as 1,000 different types of microbes live in our intestines, for example. Disruption to those microbial communities—our “microbiome”—can herald AMR. McDonald says that Big Data and machine learning will be needed to unlock our microbiomes, and that’s where much of the medical community’s investment is going.

He predicts that within five years, hospitals will take fecal samples or skin swabs and sequence the microorganisms in them as a kind of pulse check on antibiotic resistance. “Just doing the bioinformatics to sort out what’s there and the types of antibiotic resistance that might be in that microbiome is a Big Data challenge,” McDonald says. “The only way to make sense of it, going forward, will be advanced analytic techniques, which will no doubt include machine learning.”
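A bare-bones version of that pulse check might scan sequencing reads from a sample for exact matches to known resistance-gene markers, as below. The marker motifs here are placeholders, not real genes; real screens use curated databases (such as CARD) and approximate matching.

```python
# An illustrative microbiome "pulse check": count reads containing known
# resistance-gene motifs. Motifs below are placeholders, not real genes.

RESISTANCE_MARKERS = {
    "mecA-like": "ATGAAAAAGATAAAAATTGT",
    "vanA-like": "ATGAATAGAATAAAAGTTGC",
}

def screen_reads(reads):
    hits = {name: 0 for name in RESISTANCE_MARKERS}
    for read in reads:
        for name, motif in RESISTANCE_MARKERS.items():
            if motif in read:
                hits[name] += 1
    return hits

sample_reads = ["CCATGAAAAAGATAAAAATTGTGG", "TTACGGATCCATTAGGCA"]
print(screen_reads(sample_reads))   # {'mecA-like': 1, 'vanA-like': 0}
```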

Reducing Resistance on the Farm

Bringing information closer to where it’s needed could also help reduce agriculture’s contribution to the antibiotic resistance problem. Antibiotics are widely given to livestock to promote growth or prevent disease. In the United States, more kilograms of antibiotics are administered to animals than to people, according to data from the FDA.

One company has developed a rapid, on-farm diagnostics tool to provide livestock producers with more accurate disease detection to make more informed management and treatment decisions, which it says has demonstrated a 47% to 59% reduction in antibiotic usage. Such systems, combined with pressure or regulations to reduce antibiotic use in meat production, could also help turn the AMR tide.

Breaking Down Data Silos Is the First Step

Adding to the complexity of the fight against AMR is the structure and culture of the global healthcare system itself. Historically, healthcare has been a siloed industry, notorious for its scattered approach focused on transactions rather than healthy outcomes or the true value of treatment. There’s no definitive data on the impact of AMR worldwide; the best we can do is infer estimates from the information that does exist.

The biggest issue is the availability of good data to share through mobile solutions, to drive HCI clinical-decision support tools, and to feed supercomputers and machine-learning platforms. “We have a fragmented healthcare delivery system and therefore we have fragmented information. Getting these sources of data all into one place and then enabling them all to talk to each other has been problematic,” McDonald says.

Collecting, integrating, and sharing AMR-related data on a national and ultimately global scale will be necessary to better understand the issue. HCI and mobile tools can help doctors, hospitals, and public health authorities collect more information while advanced analytics, machine learning, and in-memory computing can enable them to analyze that data in close to real time. As a result, we’ll better understand patterns of resistance from the bedside to the community and up to national and international levels, says Solomon. The good news is that new technology capabilities like AI and new potential streams of data are coming online as an era of data sharing in healthcare is beginning to dawn, adds McDonald.

The ideal goal is a digitally enabled virtuous cycle of information and treatment—one that, if we can get there, could save millions of dollars, millions of lives, and perhaps even civilization itself. D!

Read more thought-provoking articles in the latest issue of the Digitalist Magazine, Executive Quarterly.


About the Authors:

Dr. David Delaney is Chief Medical Officer for SAP.

Joseph Miles is Global Vice President, Life Sciences, for SAP.

Walt Ellenberger is Senior Director Business Development, Healthcare Transformation and Innovation, for SAP.

Saravana Chandran is Senior Director, Advanced Analytics, for SAP.

Stephanie Overby is an independent writer and editor focused on the intersection of business and technology.
