Tag Archives: Modern

Unilab turns to NetSuite OneWorld for a modern, cloud-based system to transform B2B operations


Philippines’ Largest Pharmaceutical Company Improves Efficiency, Transparency in B2B Healthcare Distribution Channel

SAN MATEO, Calif., and MAKATI CITY, Philippines—November 1, 2017—Oracle NetSuite, one of the world’s leading providers of cloud-based financials/ERP, HR, Professional Services Automation (PSA) and omnichannel commerce software suites, announced today that Unilab, the largest pharmaceutical company in the Philippines, has implemented NetSuite OneWorld to help power its 21 distributors by setting them up as individual business partners. Unilab upgraded from a 15-year-old locally developed application to a unified cloud ERP system, enabling its distributors to easily manage inventory and billing processes of around 10,000 trade accounts such as drug stores, clinics, and groceries. Unilab is also using OneWorld for sales and data consolidation and multi-subsidiary management. Since completing the NetSuite OneWorld implementation in January 2017, Unilab has streamlined operations for its distributors and has gained greater visibility into the channel, which accounts for a significant percentage of its US$1 billion annual revenue. Unilab is the first in the industry to leverage cloud ERP to standardize and stabilize its distributor management program through its project called iSERV 2.0.

Founded in 1945, Unilab manufactures over 350 brands of over-the-counter and prescription medications and personal health care products. The 4,000-person company, based in Mandaluyong in greater Manila, has maintained more than 20-percent market share in the Philippines for more than three decades. To help support continued growth and keep up with the changing times, Unilab needed to modernize from an on-premise system used by distributors to a flexible and scalable cloud-based system. Previously, Unilab’s business leaders had to manually consolidate and track data from distributors. Unilab realized it would need a new modern system that did not need to rely on servers scattered across the corporate landscape.

As part of its business continuity plan, Unilab also wanted to transition to the cloud as a disaster-protection measure that would stabilize the entire system during unexpected events like typhoons.

After evaluating several software options, Unilab selected NetSuite OneWorld as an agile, scalable cloud platform ideal to improve efficiency, visibility and standardization in the distribution channel. NetSuite Solution Provider CloudTech played a key role, successfully and seamlessly implementing NetSuite at Unilab’s distributors.

With NetSuite OneWorld, Unilab has been able to realize its goals of real-time data visibility, streamlined distribution process, simplified data consolidation, and strengthened compliance while providing disaster protection through its cloud-based architecture.

NetSuite OneWorld supports 190 currencies, 20 languages, automated tax calculation and reporting in more than 100 countries, and transactions in more than 200 countries.
With NetSuite OneWorld, Unilab has also realized the following benefits:

Channel efficiency and visibility. Today, distributors use NetSuite for transactions with their trade accounts, such as managing the inventory and billing process. Distributors are also able to monitor accounts receivable, inventory status, order status, and credit limits in NetSuite, while Unilab can better track vital data in real time.

Improved compliance. NetSuite gives Unilab better inventory management with lot-tracking capabilities to help its distributors comply with FEFO (first expiration, first out) distribution.

Multi-subsidiary management. With OneWorld, Unilab is able to centrally manage each of its 21 distributors.

About Oracle NetSuite
Oracle NetSuite pioneered the Cloud Computing revolution in 1998, establishing the world’s first company dedicated to delivering business applications over the internet. Today, it provides a suite of cloud-based financials / Enterprise Resource Planning (ERP), HR and omnichannel commerce software that runs the business of companies in more than 100 countries. For more information, please visit http://www.netsuite.com.

Follow Oracle NetSuite Global Business Unit’s Cloud blog, Facebook page and @NetSuite Twitter handle for real-time updates.

About Oracle
The Oracle Cloud offers complete SaaS application suites for ERP, HCM and CX, plus best-in-class database Platform as a Service (PaaS) and Infrastructure as a Service (IaaS) from data centers throughout the Americas, Europe and Asia. For more information about Oracle (NYSE:ORCL), please visit us at oracle.com.

Trademarks
Oracle and Java are registered trademarks of Oracle and/or its affiliates. Other names may be trademarks of their respective owners.


NetSuite's Latest Press Coverage

The city of Kamianets-Podilskyi, in modern Ukraine, has the…

The city of Kamianets-Podilskyi, in modern Ukraine, has the unique feature of being almost entirely surrounded by the Smotrych River. This natural defense was supplemented by successive fortresses, yet still the town changed hands numerous times in its thousand-year history.


A Historian Walks into a Bar . . .

Announcing the Modern Servicing Model for SQL Server

Background to SQL Server servicing

Historically, we have released Cumulative Updates (CUs) every 2 months after a major version is released, and roughly yearly Service Packs (SPs), containing fixes from all previous CUs, plus any feature completeness or supportability enhancements that may require localization. You can read more about the SQL Server Incremental Servicing Model (ISM) here.

Up to and including SQL Server 2016, RTM and any subsequent SPs establish a new product baseline. For each new baseline, CUs are provided for roughly 12 months after the next SP releases, or at the end of the mainstream phase of product lifecycle, whichever comes first.

For the entire product lifecycle, we release General Distribution Releases (GDRs) when needed, containing only security related fixes.

The Modern Servicing Model

Starting with SQL Server 2017, we are adopting a simplified, predictable mainstream servicing lifecycle:

  • SPs will no longer be made available. Only CUs, and GDRs when needed.
  • CUs will now accommodate localized content, allowing new feature completeness and supportability enhancements to be delivered faster.
  • CUs will be delivered more often at first and then less frequently: every month for the first 12 months, and every quarter for the remaining 4 years of the full 5-year mainstream lifecycle.
  • CUs will be delivered in the same week of the month: the week of the 3rd Tuesday.

Note: the Modern Servicing Model (MSM) only applies to SQL Server 2017 and future versions.
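
For context on where a given instance sits in this servicing train, the sketch below queries the built-in SERVERPROPERTY values that report the baseline and update level. It is a minimal example assuming Python with the pyodbc package, the ODBC Driver 17 for SQL Server, and a reachable instance; the server name is a placeholder.

```python
# Minimal sketch: report the servicing baseline and CU level of an instance.
# Assumes pyodbc, the ODBC Driver 17 for SQL Server, and a server you can
# reach with integrated authentication -- "myserver" is a placeholder.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=myserver;DATABASE=master;Trusted_Connection=yes;"
)
row = conn.cursor().execute(
    """
    SELECT CONVERT(varchar(32), SERVERPROPERTY('ProductVersion'))     AS product_version,
           CONVERT(varchar(32), SERVERPROPERTY('ProductLevel'))       AS product_level,       -- RTM is the only baseline from 2017 on
           CONVERT(varchar(32), SERVERPROPERTY('ProductUpdateLevel')) AS product_update_level -- e.g. CU12
    """
).fetchone()
print(row.product_version, row.product_level, row.product_update_level)
```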

Servicing lifecycle

The servicing lifecycle is unchanged from SQL Server 2016:

  • Years 0-5 (Mainstream Support): Security and Functional issue resolution through CUs. Security issues through GDRs.
  • Years 6-10 (Extended Support): Security or critical functional issues.
  • Years 11-16 (Premium Assurance): Optional paid extension to Extended Support (no scope change).

FAQ

Having questions is expected. Please read below in case we have already covered your question in this FAQ.

Q1: SPs were fully localized, and you released one update file for every supported language. How will this be handled with no SPs?
A1: CUs will be localized starting with SQL Server 2017. CUs will handle this requirement while maintaining a single update file.

Q2: When we upgraded from a previous version of SQL Server, we did so at SP1 using slipstream media provided by Microsoft. How will this work with no SPs?
A2: We will provide CU-based slipstream media for CU12, allowing for this.

Q3: My company always waited for SP1 to perform an upgrade from a previous version. What are my options now?
A3: Even before GA, the final SQL Server 2016 CTP versions were considered production-ready, having gone through exhaustive testing both internally and with many preview customers. So there is no need to wait for an SP to install the latest SQL Server – you can install confidently as soon as a given version goes GA.
With that said, you can still target any CU for upgrade. For example, you could target CU12 for upgrade and have slipstream media available.

Q4: I service an instance only with GDRs. I do not apply CUs, but apply SPs. Will I need to move to a CU servicing train if I need a non-critical/security fix?
A4: Yes. While this was previously true only leading up to SPs, now you must apply the latest CU, and there will not be an opportunity to reset back to receiving GDR updates only.

Q5: Assume that after Mainstream Support, you release a security fix. Are these going to be GDRs only? If so, how can I install it, if I’m already on a CU servicing train?
A5: During Extended Support, we will release GDRs and GDR-CUs separately. The same is valid for customers that purchase the additional Premium Assurance.

Q6: Previously, once SP2 was released (for example), if I was on the RTM baseline I would have to upgrade to SP1 or SP2 to get a hotfix. How will this work now?
A6: The only baseline will be RTM, and it will receive CUs for 5 years. There is no need to upgrade to an SP to receive CUs, or to worry about which baseline a CU applies to.

Q7: If I am on RTM baseline, and CU20 (for example) was just released, will I receive technical support?
A7: This may be handled on a case-by-case basis. If the issue/question is in an area that has received a significant number of updates throughout the years, you may be asked to update to a later CU, yes.

Q8: Will SQL Server on Linux receive CUs and GDRs as well?
A8: Yes, every CU and GDR will have corresponding updates to all current Linux platforms.

Q9: Will CU and GDR KB articles then cover both SQL Server on Windows and Linux?
A9: Yes. Issues addressed in each release will be categorized by impacted platform(s).

Q10: Will SQL Server for Linux CUs and GDRs be updates to an existing installation like SQL Server on Windows?
A10: No, SQL Server on Linux updates will completely replace all binaries in the existing installation.

Q11: On SQL Server on Linux, can I remove an update?
A11: Yes, however this operation is performed by re-installing any desired previous servicing level package.

Q12: Will the testing and resulting quality levels of CUs be the same as SPs?
A12: Yes. CUs for all versions of SQL Server are tested to the same levels as Service Packs. As announced in January 2016, you should plan to install a CU with the same level of confidence you plan to install SPs as they are released. You can read more about that here.

Q13: Monthly CU releases are fast; I do not believe my business can keep pace with this, yet you have been proactively encouraging customers to stay current.
A13: Yes, the cadence is fast for the first 12 months. However, each CU’s payload will in theory be roughly 50% of what it was under the bi-monthly cadence, so these should be easier to consume. Of course, you still have the option to install every other CU, for example, for the first 12 months. As the name suggests, all CUs are cumulative.

Q14: Why release CUs every month only for the first year, then move to quarterly updates for the remaining 4 years?
A14: Data shows that the vast majority of all hotfixes issued for a major release occurs in the first 12 months. The monthly cadence brings these fixes to customers much faster, when it has the most impact. Reducing to quarterly updates reduces customer and operational overhead over the course of the remaining 4 years.

Q15: Will the availability of CUs remain unchanged?
A15: For SQL Server on Windows CUs, no changes are planned. The most recent CU will be available on the Download Center, Windows Catalog, and WSUS. Previous CUs will be available in the Windows Catalog.

Q16: Where will I look for SQL Server on Linux CUs and GDRs?
A16: All updates, current and previous, will be maintained and available in repositories.

Q17: I see that Reporting Services (SSRS) is no longer installed by Setup. Where is it and how will it be serviced?
A17: RS is available for download via a link in Setup. Servicing will be independent moving forward.


SQL Server Release Services

An Inside-Out Approach to ERP can Deliver a Modern Customer Experience


Posted by Anand Misra, Principal Product Marketing Manager

For businesses today still running siloed, departmental solutions of yesterday, there are daily challenges meeting the needs of the modern consumer. The answer for many is to turn their ERP system inside out.

Consumers today have virtually unlimited options for researching and purchasing products, with online sales and new digital channels providing not only transparency into pricing but also the actual shopping experience for millions of shoppers around the world. That ubiquity of information has raised expectations that most businesses are having a hard time delivering on.

A chief culprit for the challenge in delivering an omnichannel customer experience is the ERP system itself. Traditionally, ERP software was built to serve the needs of employees, not today’s consumers, let alone the partners and vendors that are a critical part of any modern enterprise. Today’s customers expect accurate inventory information, cross-channel order history and flawless order execution. These are inherently difficult for the legacy ERP systems from the ‘90s that too many businesses are still running on today. Those systems were designed around departmental processes rather than around the customer, and many of the newer versions of those older systems struggle to cast off the legacy of their origins. When the internet emerged with new platforms to transform the way companies deliver product support and information to customers, most companies just began bolting on ecommerce and content management systems that were disconnected from the system of record. To this day, customer data is still spread across CRM, ecommerce, marketing and multiple systems of record, making it nearly impossible to reward the most profitable customers, predict demand or ensure repeat business.

The answer for many is to spend hundreds of thousands of dollars trying to integrate these separate systems to support their omnichannel ambitions. The results are mixed, with integrations breaking with software upgrades, a lack of real-time visibility as data transfers are done in batches and companies still left with software designed to support employees rather than customers.

These companies fail to realize the depth to which they need to redesign their core infrastructure. Every aspect of this infrastructure needs to be evaluated so it can be designed around a customer-centric model from the beginning. They need to turn the ERP system inside out with the explicit goal of improving the customer experience.

Companies that orient around their customers and directly connect demand to a digitally-enabled supply chain will be the long-term winners. Amazon, the prime example, has built its infrastructure to take advantage of global product and price transparency, even dynamically pricing versus competitors.

Today, a company no longer needs to physically own a product to sell it on its website. If a retailer knows a vendor has inventory, it can take the order without ever possessing the product. And beyond supply chain efficiencies, there is incredible efficiency in operating at scale. As the business grows, the incremental costs associated with demand can be handled with far fewer employees and far lower inventory costs.

Newer companies that built (or rebuilt) from the ground up with ecommerce and a digital supply chain reap significant advantages:

  • Visibility into supplier and manufacturer inventory.
  • Responsive, consistently excellent customer service.
  • The ability to track and evaluate customer buying histories, behaviors and preferences.
  • Customer profiling and product recommendations for better targeting.
  • Customer self-service through low-cost online portals.

What started as a challenge for B2C companies is now manifesting in the B2B world. B2B customers that have seen the ease of use, visibility and real-time information provided in the B2C world, coupled with a new generation of employees that knows no other way, are forcing B2B companies to reimagine their own processes.

Delivering the best customer experiences requires wholesale changes: in organizational structure, in culture and in IT systems. It requires a more modern infrastructure built around the customer. A modern infrastructure is an investment that will pay off in the years and decades to come. But finally, the ultimate goal is within reach: give customers a personalized, relevant and consistent experience across every channel.

For more on the power of building ERP around the customer, download the white paper Customer Commerce: Turning Your ERP Inside Out.

Learn how NetSuite helps create a ubiquitous customer experience, differentiate your brand and exceed customer expectations: www.suitecommerce.com

Posted on Wed, August 16, 2017 by NetSuite


The NetSuite Blog

Measurement, Accountability, And Alignment: The Keys To Modern Marketing

“That which is measured, improves.”

This is commonly attributed to either Karl Pearson, a famous statistician, or Peter Drucker, a well-known management consultant. I learned about this years ago studying how Roger Bannister broke the 4-minute mile barrier in running, which seemed insurmountable. Likewise, accountability and alignment are key to achieving success, and they became a guiding principle for me while coaching youth ice hockey years ago. I would show how one player skating the puck along the entire length of the ice took far longer than a succession of quick passes between players working together to move the puck up the ice. Understanding the importance of aligning efforts empowered the team to play as one unified unit with a common goal and reward. And every player knew their role and was accountable for it.

The same applies to marketing and specifically business-to-business (B2B) marketing, which is composed of a marketing plan, various tactics, business goals, and measurement of sales growth within a target market. Everyone in the marketing organization must be accountable for achieving a common set of goals, results, and expected outcomes in everything they do. Plus, this performance must be measured in terms of growth and gained efficiencies. With this combination of accountability and measurement, the path to success is clear.

Alignment and accountability ignite business growth and profitability

Recently, these themes were the focus of a briefing and networking event held by SiriusDecisions down the road from my office outside of Boston. Focused on how B2B organizations need a combination of alignment and accountability, supported by data, to reach goals, the day was kicked off by Megan Heurer, vice president of research at SiriusDecisions, who expertly articulated the value of an aligned team, person, role, or organization.

Fostering a vital connection to the goals of their team, role, and company, employees can work with greater efficiency as they understand which activities are redundant and do not add value to the business and customer experience. Transparency is critical to report progress towards goals based on specific data and measurements. Plus, accountability helps employees know what they specifically need to do and how their actions advance defined goals by measuring performance, progress, and outcomes.

In this case, top-line benefits of combining accountability and alignment are real and carry significant weight. Megan cited SiriusDecisions research that revealed alignment and accountability drive 19% faster revenue growth and 15% higher profitability.

All of this is made possible by advancing the organization with data to help all parties gain clarity and focus. We call this insight-driven marketing, and it’s a guiding principle of my organization’s brand of modern marketing. The availability of data and underlying insights support the ability to regularly compare key performance indicators (KPIs) and report those findings.

But for this information to be truly insightful, measurements must go beyond typical sales and marketing metrics such as counting content assets, products delivered, customer successes, channel growth, and partner engagement. Rather like my efforts to measure the playtime of the puck, the wisdom “that which is measured, improves” reminds us that investments in data and performance management will pay off with considerable benefits.

The Metrics Spectrum provides the structure necessary to secure success

Furthering its reputation for creating breakthrough marketing frameworks and models, SiriusDecisions featured its Metrics Spectrum, which helps organize and structure the journey to insight-driven decision making. When I first learned about this new framework, it was easy to see how relevant it was to optimizing team performance. Like a coach devising a game plan to help a hockey team succeed, team leaders are called to choose the right metrics to track actions, milestones, and goal attainment.

SiriusDecisions’ Metrics Spectrum consists of four primary components:

  1. Readiness – Determine how well the team is prepared to perform by analyzing data, skills, and business processes. This assessment provides a solid foundation for any strategic plan or business case.
  2. Activity – Adopt an activity-based metrics approach by counting the tactics and actions taken, from outbound calls from telemarketing and inside sales to e-mails sent.
  3. Output – Measure the direct result of business actions and activity—from responders, inquiries, and inbound requests for more information and next steps—and go beyond purely activity-based metrics.
  4. Impact – Assess how the team performed against defined goals—revenue, profitability, and ultimately, market share. Too many organizations fail to do this because they are mired in the details of activities. But when this is done (and done well), the team provides the transparency to demonstrate its value to the executive leadership who is signing paychecks.

The backdrop for a successful execution is defined by an organization’s objectives, quantifiable impact on goals, specific milestones to prove progress, and an inventory of actions and tactics. By supporting organizational alignment and accountability with increased governance and availability of external data, teams can gain a thorough understanding of their impact on the entire business, which is key to embracing modern marketing.

Bruce Brien, the chief technology officer of SiriusDecisions, showcased the SiriusDecisions Command Center, a new offering announced at the SiriusDecisions 2017 Summit. As I am a lifelong fan and provider of marketing dashboards, this was music to my ears. It’s based upon the SiriusDecisions Metrics Spectrum, and organizations can leverage predefined metrics with benchmarking against peers, presented in a graphical view for decision makers – a key for successful insight-driven marketing.

I hope you decide to apply the principles of “that which is measured, improves” across your accountable and aligned teams to fully realize significant benefits and payoff. Please review my trip report from this event to learn more.

Fred is the senior marketing director of SAP HANA Enterprise Cloud and SAP Digital Business Services Marketing at SAP. Join Fred online: Twitter, Facebook, LinkedIn, sap.com



Digitalist Magazine

Lead, Transform, Operate: A Three-Step Imperative for the Modern CFO


Posted by Tom Kelly, Senior Director of Product Marketing

CFOs today face several ever-evolving challenges. Digital disruption continues to change the way business gets done, everywhere from self-driving cars to closing the books. Additionally, businesses embracing hybrid business models that bridge the gap between the internet and brick-and-mortar need to experiment and innovate without spending significant amounts of capital and resources. Meanwhile, globalization demands that companies comply with local laws and accounting requirements while simultaneously managing operations in each country. And, as if all of this is not enough, regulatory burdens continue to grow and the CFO still needs to hit the quarterly numbers.

In order to be successful in this ever-changing, chaotic business environment, it is essential that a CFO:

  • Lead by Focusing Resources on What Matters Most
  • Transform Proactively and Reactively
  • Operate the Finance Function Effectively and Efficiently

Lead by Focusing Resources on What Matters Most

No one is better positioned to help the organization concentrate on priorities. It is the CFO that leads the planning process, establishing the goals and how funds will be invested each year. Business acumen and expertise are important skills for a CFO to be able to focus the organization on key deliverables, but just as importantly, the CFO needs a scalable infrastructure that can provide a single source of truth and actionable business intelligence. Combining business acumen and expertise with NetSuite’s unified data model, the CFO can deliver actionable business intelligence across the entire enterprise by putting real-time data in the hands of the decision makers, on any device, at any time.

If the new model is successful and the business grows continuously, the CFO needs to be wary of interrupting business momentum when the need to scale demands a new system and infrastructure. That’s where cloud platforms can future-proof the business to scale, adapt and evolve. More importantly, multi-tenant cloud platforms ensure that customizations are carried forward and innovation is always current.

Transform Proactively and Reactively

The best way a CFO can make sure that the organization thrives is to put in place a flexible infrastructure that can react to an ever-changing environment. Today’s CFOs need a platform that does not compromise between scalability and control. As competitors innovate, offering new products and/or services, and new business models arise, CFOs will need not only the functionality to deal with emerging challenges such as recurring billing, but also a flexible platform to test and trial newly emerging business models. In retail, for example, businesses need a platform to test and configure a seamless shopping experience whether the customer is shopping online from a desktop or mobile device, or in a brick-and-mortar store.

As business goes global, the CFO has to facilitate the transformation to multi-currency, multi-lingual, multi-book operations and different statutory requirements with efficiency. Failure to do so can degrade a company’s operational consistency, controls and visibility into foreign operations. Companies that have very large, rigid headquarters systems that cannot be quickly deployed often forgo the effort, and the result is less visibility, less operational consistency and less control. In these situations, the goal of maintaining one version of the truth is unobtainable.

Operate the Finance Function Effectively and Efficiently

While leading and transforming, the CFO has another thing to do – run the finance function in an effective and efficient manner. In some cases, this can be more demanding than anything else a CFO does, but it must be done, and done very well. The ever-increasing number of regulations; tax laws that are local, state, national and international; the need to maintain strong internal controls and corporate governance; and reporting information that is accurate, actionable and readily available can be enough of a challenge by themselves. Embracing the right software platform can make managing this menagerie simple.

Today’s CFOs should look at software companies that have considerable experience with organizations across industries and that can use that experience to ensure that best practices for things like corporate governance and controls are baked into the standard offerings. It is also important to get the information into the hands of decision makers in a timely and accurate manner. The standard approach of generating reports (often through Excel spreadsheets) is no longer sufficient. CFOs need a system that puts actionable business intelligence and alerts in the hands of decision makers in real time via mobile devices.

Are You an Old Version CFO?

Gone are the days of the Bean Counter! New regulations, new business models, the advent of big data, and exponential technology improvements require that the CFO handle broader responsibilities beyond effectively and efficiently operating the finance function. Financial acumen is a must for today’s CFOs, but they must also possess strategic skills to help transform the organization and have the operational insight to lead. Using the old, heavy, rigid ERP offerings will not provide the tools to be nimble and break out of the “Old CFO” mold.

Learn more about how NetSuite makes a CFO’s job easier, more strategic.

Posted on Thu, July 20, 2017 by NetSuite


The NetSuite Blog

The Modern Marketer’s New Executive Dashboard with Matt Heinz


This transcript has been edited for length. To get the full measure, listen to the podcast.

Serving the Business

Michelle Huff:

You can get really myopic sometimes on some common metrics that marketing tends to get measured on and just stop it there, when, at the end of the day, you want to be able to translate your impact into the numbers that the CFO and the board care about.

Matt Heinz:

A couple weeks ago I sat across from a marketer at a client, and we were talking about just overall improving the funnel and making the funnel more efficient from a business output standpoint. He literally sat across the table from me, and we were talking about different things to be creative and more focused. And he said, ‘I’m not willing to do anything that’s going to increase my cost per lead.’

Michelle:

Really?

Matt:

Yes. He said that his job is to get a lower cost per lead, not a higher cost per lead. And look, they’re a fairly new client ‒ you’ve got to be careful. But what I wanted to do is reach out and just say, ‘Dude, your job is to close deals, right?’ And I get operationally in the right context. If you identify that Facebook is a great place to generate leads, and you spend 100 bucks to get 12 leads, I’m going to ask you how you can spend 100 bucks to get 18 leads. I’m going to ask you that question. But your job is not to get more leads from your marketing budget. Your job is to get more deals for the company. And it may turn the economics that you’re used to on their head. But you can’t be that myopic as a marketer. That is the path to irrelevance.

Customer Lifetime Value

Michelle:

Yes. Exactly. I could not agree with you more. In addition to that, some of the things we’ve been talking a lot about as well … you brought up customer lifetime value. And we talk a lot about brand, demand, and expand. And it’s really trying to think about marketing and about the full impact that marketing has on an organization. It’s a lot about the brand and getting awareness, and really helping brand the whole company.

How do you make sure they realize the value of your service, of your products? Get them to continue to buy more, to renew, expand their relationship and their purchasing from you. And a lot of times the whole role of customer marketing is very different, especially in B2B. Sometimes you don’t have anyone in the room. Sometimes all they do is references. How do you see customer lifetime value, its evolution with marketing, and the customer marketing role?

Matt:

I think a lot of companies just sort of have gone a little too far with the idea that, well, you can’t keep a customer unless you acquire him in the first place … which is true. And in any business, the starting point is acquisition, get him in the boat. But we all know that not all customers are equal. Some customers are expensive customers. Some customers aren’t a good fit with our product or service. And if they churn and then are unsatisfied, but they’re unsatisfied because they shouldn’t have been in the product in the first place, the market doesn’t care. It’s just going to be muddy with bad stories and bad karma about your business.

I think everybody in the business has to be thinking about the long-term health of the business, which is not just acquisition, it’s retention. What are you doing to drive not only getting those initial sales, but making those customers so phenomenally happy and successful that they stick around forever, and they tell all their friends, and all their peers, and all their colleagues about you as well? And that is Marketing 101. This goes back to our MBA days. You look at the economics ‒ it’s just so much more efficient to keep your best customers and to make them happy. And now to do that, well, talk about sales and marketing working together, now we’ve got a much bigger dance. We need sales and marketing, but we also need customer service, product marketing, and product development. This is where the entire company has to orient around that customer satisfaction success.

I think in the same way we’re talking about account-based marketing or account-based revenue, thinking that’s an opportunity for marketing to really drive a stronger, more strategic, more integrated focus on the right acquisition … What you’re talking about in terms of overall lifetime value and lifecycle marketing, is also marketing’s opportunity to own a much bigger place at the table. We are thankfully living in a world where there is a very clear path from heads of marketing, CMO, to CEO. That wasn’t always the case. We’ve come from a place where marketing in B2B has been thought of as the arts and crafts department. And boards aren’t putting the arts and crafts people in charge of the company.

But when you’ve got marketers that are now growing up in a marketing department and thinking about the overall value of the business, thinking and talking about and measuring things that the business cares about, now you’ve got a business leader. If you’re a CMO or if you’re new in marketing and if you’re starting at the bottom ‒ and we all did ‒ your opportunity is to not just change what you’re doing, but to change the way you think, change the way you prioritize, change the way you talk. Go spend some time with your CFO and find out the way they talk, find out the words they use and the things they care about. I mean, even if you don’t have an MBA, or if you didn’t go to business school, I don’t care. I didn’t do either of those things either.

But it became really important for me and I became a much better marketer when I started to understand what lifetime value meant, when I understood why margin was important, when I understood the impact of things I could do in marketing and how that would decrease acquisition cost, and increase lifetime value, and the impact that has, especially in an SaaS business, when it starts to really impact the flywheel of business growth and health. You can use those words and you’re still going to have to go back and increase open rates and get more retweets. I’m not saying those things aren’t building blocks. … You’ve still got to do the work. You’ve still got to do the marketing. The marketing itself may not fundamentally change. But how you prioritize it, how you do it, how you report on why it’s important ‒ that is fundamentally different.

The Marketer’s Executive Dashboard

Michelle:

What is your top list of things you need to do to start thinking about it from a reporting and measuring standpoint?

Matt:

I don’t think about it as sales versus marketing. The way I start to think about that now, both for my business, as we grow, as well as with clients, is I kind of erased that line to say, OK, starting as an acquisition unit, what do I care about? Well, I care about closed deals. I care about: How many of my target accounts am I closing? Whether you’ve named those accounts or whether you say, ‘I’m looking for companies that have these six attributes, and I want to close as many of those as possible’ … because I know that if they have these six attributes, they’re far more likely to be those long lifetime-value prospects.

There are closed deals, there are target account closed deals, and then there are some metrics behind that around pipeline created. In my business, I have two metrics around pipeline: I look at qualified opportunities created and I also look at a quarterly price opportunity metric. Because once I get an opportunity priced, based on our process, I know what my conversion rate is on those priced opportunities. And it’s been pretty consistent for a couple years. Those numbers I can put in front of our sales effort and say, ‘All right, if we get to that number, then I know that X percent’s going to close, and I know that that’s going to help us hit our number for the next quarter,’ or whatever I’m looking at.

We’re doing content. We’ve got a blog, we’ve got a podcast, we’re doing our campaigns. We’ve got metrics and ways that we’re managing operational efforts. But my marketing metrics don’t have anything to do with those. On my scorecard, it is closed deals, target accounts closed, opportunities created, and priced opportunities created per quarter. And that’s it. That’s what it rolls up to. And I figure if those are metrics that if I was the CFO, if I was the CEO, that’s the scorecard I care about. I think from a marketer’s standpoint that’s a good starting point. Again, separate your operational dashboard from your executive dashboard. Make sure you’re managing them both actively, but make sure you know your audience and what they need to hear and what they care about to prove your worth.


Act-On Blog

Microsoft accelerates modern BI adoption with Power BI Premium

Power BI was first made generally available in July 2015. Since then, Microsoft’s driving vision for Power BI has been to enable users across roles, disciplines and industries to sign up for the service in seconds and get business value by drawing insights from their data within minutes. Our relentless focus to drive access to insights at scale has helped Power BI reach more than 200,000 organizations, and the breadth of this global community continues to directly contribute to Power BI’s evolution – to date more than 400 features have been added to the service as the result of input from more than 50,000 users.

Introducing Power BI Premium

Today Microsoft is taking the next step in its commitment to empower people and organizations with access to critical intelligence: introducing Power BI Premium. Power BI Premium builds on the existing Power BI portfolio with a capacity-based licensing model that increases flexibility for how users access, share and distribute content. The new offering also delivers additional scalability and performance to the Power BI service.

  • Flexibility to license by capacity. Power BI Premium introduces expanded licensing flexibility to help organizations equip users with the appropriate level of access to the Power BI service based on their unique needs. For example, many organizations contain users who aren’t actively creating BI content, but require the ability to consume content distributed to them. Power BI Premium enables Power BI Pro users to publish reports broadly across the enterprise and beyond, without requiring recipients to be licensed per user.

  • Greater scale and performance. Organizations using Power BI Premium will be able to customize performance based on the needs of their team, department or the organization itself. The offering consists of capacity in the Power BI service exclusively allocated to each organization and supported by dedicated hardware fully managed by Microsoft. Organizations can choose to apply their dedicated capacity broadly, or allocate it to assigned workspaces based on the number of users, workload needs or other factors—and scale up or down as requirements change. A calculator is available to help with capacity planning.

  • Power BI apps. Along with the freedom to license Power BI for enterprise deployments, we are evolving content packs into Power BI apps to improve how users discover and explore insights at enterprise scale. Available today, Power BI apps offer a simplified way of deploying dashboards and reports to specific people, groups or an entire organization. Business users can easily install these apps and navigate them with ease, centralizing content in one place and updating automatically.

  • Extending on-premises capabilities. Power BI Premium introduces the ability to maintain BI assets on-premises with Power BI Report Server. Power BI Report Server is an on-premises server that allows the deployment and distribution of interactive Power BI reports – and traditional paginated reports – completely within the boundaries of the organization’s firewall. With Power BI Premium the same number of virtual cores an organization provisions in the cloud can also be deployed on-premises through Power BI Report Server, without the need to split the capacity. Organizations can choose Power BI in the cloud, or elect to keep reports on-premises with Power BI Report Server and move to the cloud at their pace.

  • Embedded analytics. With Power BI Premium we’re also advancing how Power BI content is embedded in apps created by customers, partners and the broad developer community. As part of the new offering we are converging Power BI Embedded with the Power BI service to deliver one API surface, a consistent set of capabilities and access to the latest features. Moving forward we encourage those interested in embedding Power BI in their apps to start with Power BI Desktop and move to deployment with Power BI Premium. Existing apps built on Power BI Embedded will continue to be supported.
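
As a rough illustration of programmatic access to the service (independent of the Premium announcement itself), the sketch below lists the reports in a workspace through the Power BI REST API. It assumes Python with the requests package, that you have already acquired an Azure AD access token for the Power BI service (token acquisition is omitted), and placeholder workspace and token values.

```python
# Hedged sketch: enumerate reports in a Power BI workspace via the REST API.
# Assumes a valid Azure AD access token for the Power BI service (acquisition
# not shown) and a real workspace (group) id -- both values are placeholders.
import requests

ACCESS_TOKEN = "<azure-ad-access-token>"
WORKSPACE_ID = "<workspace-guid>"

resp = requests.get(
    f"https://api.powerbi.com/v1.0/myorg/groups/{WORKSPACE_ID}/reports",
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    timeout=30,
)
resp.raise_for_status()
for report in resp.json().get("value", []):
    # Each entry includes an id and a display name for the report.
    print(report["id"], report["name"])
```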

Simplifying the free Power BI service

Just as Power BI Premium simplifies large-scale BI deployments, today we’re also simplifying the distinction between Power BI Pro and the free service. While the free service is intended for personal use and Power BI Pro enables collaboration, we’ve received feedback that functional differences between them have created confusion for users. Going forward, we will improve the free service to have the same functionality as Power BI Pro, but will limit sharing and collaboration features to only Power BI Pro users. Users of the free Power BI service will benefit from access to all data sources, increased workspace storage limits, and higher refresh and streaming rates. These changes will be effective June 1, and you can read more on the Power BI Community. Power BI Desktop continues to be available for free.

Learn More

Power BI Premium will be generally available late in the second quarter of 2017. Initial SKU and pricing information is available here.

  • Register for the Microsoft Data Insights Summit in Seattle on June 12–13, where we will have dedicated sessions on today’s announcements.

  • Register for our upcoming webinar on Power BI Premium.

  • Sign up to receive updates when Power BI Premium is available to purchase.

  • Read about the Power BI apps preview.

  • Learn about migration for existing Power BI Embedded customers.


Microsoft Power BI Blog | Microsoft Power BI

Kilwa Kisiwani, in modern Tanzania, was the center of the…

Kilwa Kisiwani, in modern Tanzania, was the center of the medieval Kilwa Sultanate. It thrived due to trade, like many of its Swahili Coast counterparts, mainly with the Arabian Peninsula.




A Historian Walks into a Bar . . .

Defining the Components of a Modern Data Warehouse

As I put together a new presentation on my current favorite topic (modern data warehousing), it occurred to me that others might feel like there’s some confusion and/or overlap with terminology. Some terms are somewhat fuzzy and mean different things within different organizations, so here’s my best effort at a glossary of the components within a Modern Data Warehouse.

In alphabetical order:

Advanced Analytics

The term advanced analytics is a broad phrase for sophisticated statistical techniques to find patterns in the data for the purpose of predictions, recommendations, optimizations, and descriptions of information. It can include subcategories such as predictive analytics, prescriptive analytics, operational analytics, descriptive analytics, and so forth. Advanced analytics can be employed for use cases such as fraud detection, customer segmentation, assessing credit risk, or predictions such as student dropouts, customer churn, or hospital readmissions.
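
To make one of those use cases concrete, here is a minimal, hedged sketch of a churn prediction model, assuming Python with pandas and scikit-learn; the input file and column names are illustrative placeholders, not from any particular system.

```python
# Minimal sketch of a predictive-analytics use case: customer churn.
# Assumes pandas and scikit-learn; the file and column names are hypothetical.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

df = pd.read_csv("customer_features.csv")            # hypothetical feature extract
X = df[["tenure_months", "orders_last_90d", "support_tickets"]]
y = df["churned"]                                     # 1 = churned, 0 = retained

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=42
)
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("holdout accuracy:", model.score(X_test, y_test))
```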

Analytics

In general, use of the term analytics has evolved over time. It used to imply the usage of statistical methods, such as machine learning and data mining. More recently the term analytics has evolved to be commonly used to describe finding meaningful patterns in data.

Analytics Sandbox

An analytics sandbox is an exploratory environment which a knowledgeable analyst or data scientist controls. In this ungoverned (or less governed) personal environment, an analyst can move very quickly with usage of preferred tools and techniques. The advantage of an analytics sandbox is agility for purposes of prototyping and/or data exploration. Sandbox solutions may be productionized and moved into the governed DW/BI/analytics environment.

Big Data

The term ‘big data’ is terribly overused, and used in different ways. One meaning refers to sheer size of data volumes. Another meaning is associated with multi-structured data (the combination of unstructured, semi-structured, and structured) of varying data types. Further, others mean it to imply analysis of data in new and interesting ways. Low latency data (the velocity portion of the 3 V’s of volume, variety, and velocity) is also attributed to big data.

Bimodal BI

Originally coined by Gartner, Bimodal BI refers to two modes for development and delivery of information. One mode is focused on agility: business-driven, rapid delivery which values the freedom of exploration and speed. The second mode focuses on traditional IT-driven processes which value reliability, governance, standardization, and security. Having groups of knowledgeable staff each devoted to their own mode, while working collaboratively with each other, is optimal for achieving a good balance around delivery of BI. 

Data Catalog

An enterprise data catalog is a repository which describes corporate data assets. One aspect of the data catalog is documentation which is often referred to as a data dictionary, but can be much more than that depending on the software vendor’s implementation. The ability to search for data based on keywords is another major benefit. In a search scenario, the underlying data security is still present; what is exposed is metadata such as columns, owner, and meaning of the data. A data catalog should also assist with data provisioning (i.e., where to request access to a particular data source if a user does not currently have access).

Data Governance

Data governance processes oversee corporate data assets so they are properly utilized with integrity and the data is understood as intended. Implementation of data governance varies wildly from organization to organization.

Data Federation

Frequently data federation is used synonymously with data virtualization, and the concepts are certainly related. Technically speaking, however, a federated query returns data from multiple data stores – federating, or combining, the query results. A federated query works using data virtualization techniques.

Data Integration

The term data integration can be used in a couple of ways. Conceptually, it refers to the integration, or consolidation, of data from multiple sources together. However, usually when we refer to data integration for a data warehouse, we’re referring to data integration being performed physically, via an ETL (Extract>Transform>Load) process. Oftentimes data virtualization (in which the consolidated data is not materialized) is contrasted with data integration (in which the consolidated data is materialized). Data integration processes typically involve significant development effort, which is one reason why data virtualization has become more prevalent.
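
As a toy illustration of physically materializing integrated data, the sketch below performs a tiny Extract>Transform>Load pass with pandas, using sqlite3 in place of a real source system and warehouse; the table and column names are made up for the example.

```python
# Toy ETL sketch: extract from a source system, transform, load into a
# warehouse table. sqlite3 stands in for both platforms purely for
# illustration; table and column names are hypothetical.
import sqlite3
import pandas as pd

source = sqlite3.connect("source_system.db")
warehouse = sqlite3.connect("warehouse.db")

# Extract
orders = pd.read_sql_query(
    "SELECT order_id, order_date, amount FROM orders", source
)

# Transform: standardize the date type and add a derived column
orders["order_date"] = pd.to_datetime(orders["order_date"])
orders["amount_rounded"] = orders["amount"].round(2)

# Load into the warehouse (materialized, unlike data virtualization)
orders.to_sql("fact_orders", warehouse, if_exists="append", index=False)
```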

Data Lake

A data lake is one piece of an overall data management strategy. Conceptually, a data lake is nothing more than a data repository. The data lake can store any type of data, so it’s well-suited to ingestion of multi-structured data such as logs and machinery output. Cost and effort associated with data ingestion are reduced because the data is stored in its original native format with no structure (schema) required of it initially. Data lakes usually align with an “ELT” strategy, which means we can Extract and Load into the data lake in its original format, then Transform later *if* a need presents itself. A data lake is commonly implemented with HDFS (Hadoop Distributed File System), but could also easily involve other technologies such as NoSQL.
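
The “EL now, T later” pattern mentioned above can be sketched as follows, assuming a PySpark environment and storage paths you control; all paths and column names here are placeholders.

```python
# Sketch of the data lake ELT pattern: Extract and Load the raw data untouched,
# Transform later only if a use case appears. Assumes PySpark; the paths and
# column names are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("elt-sketch").getOrCreate()

# Extract & Load: land machinery output in its original JSON form, no schema imposed.
raw = spark.read.json("/landing/machinery/incoming/")
raw.write.mode("append").json("/datalake/raw/machinery/")

# Transform later: apply structure only once a reporting need is defined.
curated = (
    spark.read.json("/datalake/raw/machinery/")
         .select("device_id", "event_time", "temperature")
         .withColumn("event_date", F.to_date("event_time"))
)
curated.write.mode("overwrite").parquet("/datalake/curated/machinery_readings/")
```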

Data Mart

A data mart is usually focused on reporting and analysis for a particular department or a particular subject area. Data marts typically are considered to be a subset of what is contained in an enterprise data warehouse, though that is not always consistent for all implementations. Like a data warehouse, data marts are usually populated from source systems with an ETL process (data integration), so some latency is expected for the data to become available for reporting and analysis in the marts. It’s also possible for data marts to employ some level of data virtualization.

Data Virtualization

A data virtualization layer is one aspect of a logical data warehouse implementation. It can be thought of as a method to access one or more underlying data sources, regardless of where the data resides, without requiring the data to be physically materialized in another data structure. Depending on the data virtualization platform, data cleansing and transformations can be performed. On a small scale, data virtualization can be used very successfully during pilot/exploratory/prototyping phases to improve BI agility, shorten the learning curve, and influence what should be built in the data warehouse. It can also be used successfully to access data in small data volumes, and/or for infrequently needed data. Using a data virtualization approach has numerous challenges, most notably user query performance and reporting load on the source systems — therefore, a full-fledged logical data warehouse is likely necessary if data virtualization is utilized significantly across an enterprise.

Data Warehouse

Traditionally a data warehouse is a repository of enterprise-wide data which has been consolidated from multiple source systems, thus increasing the value of the data after it’s been correlated. The source data is cleansed, transformed, standardized, enriched with calculations, and stored historically to facilitate time-oriented analysis. The traditional data warehouse is a centralized database, separate and distinct from the source systems, which usually translates to some level of delay in the data being available for reporting and analysis. The level of effort in developing an end-to-end data warehouse can involve long development cycles, which has opened up opportunities for alternative methods for handling data integration and data access.

Data Warehouse Appliance

A DW appliance is a combination of both hardware and software which is optimized specifically for data warehousing and analytics at high scale. Most DW appliances are based on MPP (massively parallel processing) architecture for high performance operations.

Distributed Query

A query which returns results from multiple, disparate data sources. A federated query and a distributed query are synonymous.

Distributed Processing

Distributed processing is one aspect of a logical data warehouse implementation. It involves pushing down the processing effort to each distributed source system whenever possible in order to maximize performance via parallelism. There are numerous techniques utilized by vendors to improve the performance of distributed processing.

Hadoop

Hadoop is a broad-ranging set of open source Apache projects which form an ecosystem for distributed storage as well as distributed processing/computations for large amounts of data (i.e., Big Data). Hadoop has continued to expand its presence due to its ability to scale both storage and processing at lower cost. A Hadoop environment complements a data warehouse by offering an environment which can handle data exploration, processing for a variety of data types, advanced analytic computations, as well as a capable ETL alternative. Because the Hadoop ecosystem is extremely broad, there’s a myriad of options available for using Hadoop.

Lambda Architecture

A Lambda architecture is more about data processing than data storage. It’s an architectural pattern designed to process large data volumes using both batch and streaming methods. Batch processing is typically pull-oriented, whereas streaming data is push-oriented. A Lambda architecture implementation involves a batch layer, a speed layer, and a serving layer. The serving layer, which handles data access/consumption, may serve consolidated data which was ingested through the different batch and speed layers.

Logical Data Warehouse

A logical data warehouse (LDW) builds upon the traditional DW by providing unified data access to multiple platforms. Conceptually, the logical data warehouse is a view layer that abstractly accesses distributed systems such as relational DBs, NoSQL DBs, data lakes, in-memory data structures, and so forth, consolidating and relating the data in a virtual layer. This availability of data on various platforms adds flexibility to a traditional DW, and speeds up data availability. The tradeoff for this flexibility can be slower performance for user queries, though the full-fledged LDW vendors employ an array of optimization techniques to mitigate performance issues. A logical data warehouse is broader than just data virtualization and distributed processing, which can be thought of as enabling technologies. According to Gartner, a full-fledged LDW system also involves metadata management, repository management, taxonomy/ontology resolution, auditing & performance services, as well as service level agreement management.

Massively Parallel Processing (MPP)

An MPP system operates on high volumes of data in a parallel manner across distributed nodes. This “shared-nothing architecture” differs from an SMP system because each separate physical node has its own disk storage, memory, and CPU. Though an MPP platform has a lot of functionality in common with SMP, there are also many differences with respect to table design, handling distributed data, and alternative data loading patterns in order to take full advantage of parallelism. Most MPP systems are oriented towards DW/analytics/big data workloads, as opposed to transactional system workloads.

Master Data Management (MDM)

A master data management system is a set of tools and processes for the purpose of managing, standardizing, and augmenting an organization’s key reference and descriptive information for subject areas such as customers, products, accounts, or stores. Master data management is often thought of in two forms: analytical MDM and operational MDM. The value of a data warehouse can be exponentially increased with skillful master data management.

Modern Data Warehouse

Characteristics of a modern data warehouse frequently include (in no particular order):

  • Capability of handling a variety of subject areas and diverse data sources
  • Ability to handle large volumes of data
  • Expansion beyond a single DW/relational data mart structure (to include structures such as Hadoop, data lake, and/or NoSQL databases)
  • Multi-platform architecture which balances scalability and performance
  • Data virtualization in addition to data integration
  • Ability to facilitate near real-time analysis on high velocity data (potentially via Lambda architecture)
  • A flexible deployment model which is decoupled from the tool used for development
  • Built with agile, modular approach with fast delivery cycles
  • Hybrid integration with cloud services
  • Some DW automation to improve speed and consistency, and to flexibly adapt to change
  • Data cataloging to facilitate data search and to document business terminology
  • Governance model to support trust and security
  • Master data management for curation of reference data
  • Support for all types of users, and all levels of users
  • Availability of an analytics sandbox or workbench area to facilitate agility within a bimodal BI environment
  • Support for self-service BI to augment corporate BI
  • Delivery of data discovery and data exploration, in addition to reports and dashboards
  • Ability to certify and promote self-service solutions to the corporate BI/analytics environment

NoSQL Database

Like a data lake, a NoSQL database is a schema-agnostic data storage option. NoSQL databases are well-suited to hierarchical/nested data with simple data structures (ex: property and its value). There are various types of NoSQL databases which suit different use cases, including: key-value DBs, document DBs, column family stores, and graph DBs.
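
A tiny illustration of the difference, in plain Python with no particular database assumed: the same order expressed as a nested document (a natural fit for a document-style NoSQL store) versus the flat rows a relational schema would use.

```python
# Illustration only -- no specific NoSQL product is assumed.
# One order as a nested document: hierarchical, variable-length detail lives
# inside the record itself.
order_document = {
    "_id": "ORD-1001",
    "customer": {"id": "C-42", "name": "Example Pharmacy"},
    "lines": [
        {"sku": "SKU-500", "qty": 10, "lot": "L2017-08"},
        {"sku": "SKU-100", "qty": 4,  "lot": "L2017-11"},
    ],
}

# The relational equivalent spreads the same information across two tables.
order_row = ("ORD-1001", "C-42")
order_line_rows = [
    ("ORD-1001", "SKU-500", 10, "L2017-08"),
    ("ORD-1001", "SKU-100", 4,  "L2017-11"),
]
print(order_document, order_row, order_line_rows)
```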

Polyglot Persistence

These days, we anticipate our BI and analytics environment to be comprised of multiple platforms (which is ideally transparent to users). Polyglot persistence (aka “best fit engineering”) refers to the approach of using the most suitable data storage technology based on the data itself. For instance, multi-structured data is usually well-suited to a data lake or HDFS storage solution; log data in XML or JSON format may be suitable for a NoSQL solution; structured data aligns with a relational database. Using multiple technologies in an information management platform is a form of scaling out, and although it simplifies needs related to certain data types, it does add complexity to the overall data management solution. The more the polyglot persistence methodology is followed, the more likely data virtualization will be of benefit due to the varied infrastructure.

Schema on Read

Querying techniques such as “Schema on Read” have become prevalent because they apply structure at the time the data is queried, rather than at the time the data is written or initially stored (i.e., “Schema on Write”). The value of the Schema on Read approach is being able to store the data relatively easily with less up-front time investment, then query the data “where it lives” later once a use case is defined. Schema on Read offers significant flexibility, which can provide agility in a modern data warehousing environment and allows us to learn and deliver value more quickly. Schema on Read is often associated with an ELT (Extract>Load>Transform) approach, which is common for big data projects. The implementation of Schema on Read is usually only a portion of the overall information management platform.
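
A minimal sketch of Schema on Read, assuming PySpark against files already landed in a data lake; the schema, path, and column names are assumptions for the example, and the key point is that structure is declared by the reader at query time rather than enforced at write time.

```python
# Sketch of Schema on Read: the structure is supplied by the query, not by the
# storage layer. Assumes PySpark; path and columns are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql.types import (
    StructType, StructField, StringType, DoubleType, TimestampType
)

spark = SparkSession.builder.appName("schema-on-read").getOrCreate()

# The reader's assumption about structure, declared at query time.
clickstream_schema = StructType([
    StructField("user_id",    StringType()),
    StructField("page",       StringType()),
    StructField("duration_s", DoubleType()),
    StructField("event_time", TimestampType()),
])

clicks = spark.read.schema(clickstream_schema).json("/datalake/raw/clickstream/")
clicks.groupBy("page").count().show()
```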

Schema on Write

Schema on Write refers to how a traditional data warehouse is designed. The dimensional data model is defined up front, its schema is created, then the dimension and fact tables are loaded with data. Once the data has been loaded, users can begin using it for reporting. This approach requires up-front data analysis, data modeling, and creation of data load processes, all of which can involve long development cycles that challenge business agility. Schema on Write is associated with the traditional ETL approach: Extract>Transform>Load.
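
For contrast, here is an equally minimal Schema on Write sketch: the dimensional schema is created first and data is conformed to it during the load. sqlite3 stands in for the warehouse platform, and the table design is illustrative only.

```python
# Contrasting sketch of Schema on Write: define the dimensional model up front,
# then load conformed data into it. sqlite3 is used purely for illustration.
import sqlite3

dw = sqlite3.connect("warehouse.db")
dw.executescript("""
    CREATE TABLE IF NOT EXISTS dim_product (
        product_key INTEGER PRIMARY KEY,
        sku         TEXT NOT NULL,
        brand       TEXT
    );
    CREATE TABLE IF NOT EXISTS fact_sales (
        date_key    INTEGER NOT NULL,
        product_key INTEGER NOT NULL REFERENCES dim_product(product_key),
        quantity    INTEGER NOT NULL,
        amount      REAL    NOT NULL
    );
""")
dw.execute(
    "INSERT INTO dim_product (product_key, sku, brand) VALUES (?, ?, ?)",
    (1, "SKU-500", "BrandA"),
)
dw.execute("INSERT INTO fact_sales VALUES (?, ?, ?, ?)", (20171101, 1, 10, 49.90))
dw.commit()
```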

Semantic Model

A semantic model extends a data warehouse with calculations, relationships, formatting, and friendly names which make sense to functional users. The semantic model is the primary interface for users, and can significantly improve the user experience for reporting, particularly for less technical users who are responsible for some level of self-service reporting and analysis. A semantic model can be a pass-through only (with some similar characteristics to data virtualization), or could be combined with cached data storage mechanisms (such as OLAP or in-memory analytical models).

Operational Data Store (ODS)

An ODS is a data storage repository intended for near real-time operational reporting and analysis. Whereas a data warehouse and data marts are usually considered nonvolatile, an ODS is considered volatile because the data changes so frequently. To reduce data latency, data transformations and cleansing are usually minimized during the ETL process to populate the ODS. An ODS typically contains minimal history.

Symmetrical Multiprocessing (SMP)

An SMP system is a traditional relational database platform in which resources such as disk storage and memory are shared. For very large-scale databases, when an SMP database begins to experience limits related to performance and scalability, an MPP (massively parallel processing) database may be an alternative.


Blog – SQL Chick