Business Intelligence Info

Tag Archives: Generation

Is Your Business Ready for the New Generation of Analytics?

January 25, 2021   TIBCO Spotfire

Reading Time: < 1 minute

New year, new generation of analytics. The last year brought a lot of changes, including shifts within the analytics landscape to bring together capabilities that support agility and speed in decision making. This year, prepare your business for any unexpected changes and “bake in” agility and resiliency with real-time analytics.

Is your organization ready to meet the growing demand for robust, fast, and flexible analytics?

In an upcoming webinar with TIBCO, “Readying Enterprises for the New Generation of Analytics,” Dan Vesset of IDC will share recent research on what businesses need from their analytics teams and applications to support this new area of growth. 

Among the findings to be discussed are:

  • At least 70% of managers and directors surveyed said their executives need their enterprises to become more data-driven
  • 49% of organizations surveyed say that they are very or substantially challenged by their inability to synthesize data into actionable information
  • More than half of the organizations surveyed planned to spend more on Big Data and Analytics, regardless of their stage in IDC’s Recovery Phase mapping

During this 45-minute webinar, Vesset will engage in a discussion with TIBCO’s Director of Product Marketing, Lori Witzel, conduct a deep dive into next-generation analytics architectures, and provide directional guidance for organizations looking towards improved decision making through analytics. 

Move from Crisis to Recovery with Hyperconverged Analytics

Through an integrated approach to powerful visual analytics, embedded data science, and seamless streaming analytics, TIBCO Spotfire® enables modern analytics architectures across a variety of use cases and industries. This new modern approach—hyperconverged analytics—supports data exploration, data management, and AI-driven decision automation for improved business outcomes. 


To ensure that your business benefits from this new generation of hyperconverged analytics, register for the webinar now. And for more findings and expert advice, download the full IDC report.

The TIBCO Blog

Google proposes applying AI to patent application generation and categorization

November 22, 2020   Big Data


Google asserts that the patent industry stands to benefit from AI and machine learning models like BERT, a natural language processing algorithm that attained state-of-the-art results when it was released in 2018. In a whitepaper published today, the tech giant outlines a methodology to train a BERT model on over 100 million patent publications from the U.S. and other countries using open-source tooling, which can then be used to determine the novelty of patents and generate classifications to assist with categorization.
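The novelty-determination idea can be illustrated with a toy stand-in: score a draft claim against prior filings by how little vocabulary they share. This sketch uses simple token overlap in place of the BERT embeddings Google describes; the `novelty_score` function and the sample filings are illustrative inventions, not part of Google’s methodology.

```python
# Toy stand-in for embedding-based novelty scoring: a claim that shares
# little vocabulary with any prior filing scores as more "novel".
# A real system would compare learned BERT embeddings, not raw token overlap.

def tokens(text: str) -> set:
    return set(text.lower().split())

def novelty_score(claim: str, prior_filings: list) -> float:
    """1.0 = no overlap with any prior filing; 0.0 = identical to one."""
    claim_toks = tokens(claim)
    best_overlap = 0.0
    for filing in prior_filings:
        prior_toks = tokens(filing)
        union = claim_toks | prior_toks
        if union:
            overlap = len(claim_toks & prior_toks) / len(union)
            best_overlap = max(best_overlap, overlap)
    return 1.0 - best_overlap

prior = ["a method for charging a battery using solar panels",
         "a device for measuring temperature with infrared sensors"]
print(novelty_score("a method for charging a battery using solar panels", prior))  # 0.0
```

An identical claim scores 0.0, while a claim with no shared vocabulary scores 1.0; an embedding model generalizes this by recognizing paraphrases and context-dependent terminology that token overlap misses.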

The global patent corpus is large, with millions of new patents issued every year. It’s complex as well. Patent applications average around 10,000 words and are meticulously wordsmithed by inventors, lawyers, and patent examiners. Patent filings are also written with language that can be unintelligible to lay readers and highly context-dependent; many terms are used to mean completely different things in different patents.

For all these reasons, Google believes the patents domain is ripe for the application of algorithms like BERT. Patents, the company notes, represent tremendous business value to a number of organizations, with corporations spending tens of billions of dollars a year developing patentable technology and transacting the rights to use the resulting technology.

“We hope that our [proposal] will help the broader patent community in its application of machine learning, including corporate patent departments looking to improve their internal models and tooling with more advanced machine learning techniques; patent offices interested in leveraging state-of-the-art machine learning approaches to assist with patent examination and prior art searching; machine learning and natural language processing researchers and academics who might not have considered using the patents corpus to test and develop novel natural language processing algorithms; and patent researchers and academics who might not have considered applying the BERT algorithm or other transformer-based approaches to their study of patents and innovation,” Google data scientists Rob Srebrovic and Jay Yonamine wrote in a blog post.

As VentureBeat recently reported, businesses aren’t the only ones that stand to benefit from AI with regard to patent processing. The U.S. Patent and Trademark Office (USPTO) built AI models for different categories of patents and then trained the models on text from patent abstracts. Separately, the USPTO’s staff is using AI to more efficiently process patent applications. According to a spokesperson, the agency is now using a “leading RPA provider” to centralize its bot efforts and ensure a proper process and governance model that includes use cases, development, testing, and security before bots are deployed.

“We are working on adding AI tools to help route applications to examiners more quickly and to help examiners search for prior art,” Andrei Iancu, U.S. Under Secretary of Commerce for intellectual property, said in an emailed response to VentureBeat in October. “We’ve also been active on the Trademarks side, exploring the use of AI to help find prior similar images and to identify what we call fraudulent specimens. We are exploring using AI to improve the accuracy and integrity of the trademark register.”

Big Data – VentureBeat

The Journey of Generation A(X): Top 5 Highlights from Spotfire 10

June 27, 2020   TIBCO Spotfire

Reading Time: 4 minutes

First introduced in the fall of 2018, TIBCO Spotfire®’s A(X) experience, an artificial intelligence (AI) driven analytics experience, delivers agile, augmented analytics for better, faster decision making. Building upon this momentum and listening to user feedback, TIBCO has prioritized several usability improvements and feature enhancements since the last Long-Term Supported (LTS) release (version 10.3) in mid-2019. 

This Spotfire treemap illustrates all features released since 10.3 LTS by category, with Community users’ top requested features shaded in green.

Several of these new capabilities—from “Interactive AI” to streaming analytics in the web client to native Python—provide organizations with more accessible data exploration and quicker insights for everyone.  

Below are five highlights of the top technology advancements available by upgrading to the latest LTS release of Spotfire® version 10.10.   

1. Interactive AI: Insights on Marked Data

With a simple lasso of a data point, AI-assisted exploration answers the questions that you didn’t know how to put into words! Spotfire®’s Recommendations engine automatically identifies interesting patterns in your data, guiding even less-experienced analysts to the most significant relationships to explore. “Interactive AI” aids users by isolating root causes of potential patterns (e.g. an outlier, shifting negative trend).  


Takeaway: Interactive AI allows even non-technical users to borrow the brain of a data scientist and create analytics applications in seconds flat—without any expertise—thus shortening the road to insight by simply asking, “What’s different about these?”   
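The “What’s different about these?” idea can be sketched in a few lines: compare the marked rows against the rest of the data and rank columns by how far their averages diverge. This is a minimal illustration of the concept, not Spotfire’s actual Recommendations engine; `whats_different` and the sample data are hypothetical.

```python
# Minimal sketch of "what's different about these?": compare the marked
# rows against the rest and rank numeric columns by the gap in means.

def mean(xs):
    return sum(xs) / len(xs)

def whats_different(rows, marked_ids, id_key="id"):
    """Return numeric columns sorted by |mean(marked) - mean(rest)|, largest first."""
    marked = [r for r in rows if r[id_key] in marked_ids]
    rest = [r for r in rows if r[id_key] not in marked_ids]
    cols = [k for k in rows[0] if k != id_key]
    gaps = {c: abs(mean([r[c] for r in marked]) - mean([r[c] for r in rest]))
            for c in cols}
    return sorted(gaps, key=gaps.get, reverse=True)

rows = [
    {"id": 1, "pressure": 10, "temp": 20},
    {"id": 2, "pressure": 11, "temp": 21},
    {"id": 3, "pressure": 55, "temp": 22},  # the marked outlier
]
print(whats_different(rows, marked_ids={3}))  # pressure gap dominates
```

Lassoing row 3 surfaces `pressure` as the attribute that most distinguishes the marked point, which is the kind of root-cause hint the Recommendations engine generalizes across many visualizations and statistics.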

2. Streaming Visual Analytics on the Web

Any business dealing with event data that has velocity, direction, and momentum can derive value from streaming visual analytics. Spotfire® provides the industry’s first true streaming business intelligence solution, offering a single visual analytics environment with built-in capabilities for diagnostic, predictive, and streaming analytics.

Streaming visual analytics transforms what was once static, historical rear-view monitoring into an immersive environment for predictive and prescriptive analytics. Streaming visual analytics enables richer insights, served up through dashboards and analyses, that support taking better actions. This is essential because the speed of business today demands:

  • Adapting and responding to rapidly developing events and changing conditions
  • Making urgent, data-informed (and at times automated) decisions based on deeper advanced analytics
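These demands can be illustrated with a minimal sliding-window sketch: a rolling metric computed over live events is compared against a historical baseline to raise alerts. This is a toy illustration of the streaming-plus-historical pattern, not how Spotfire® Data Streams is implemented; `rolling_alerts` and its threshold are hypothetical.

```python
from collections import deque

# Rolling average over the last N events of a stream, compared against a
# historical baseline -- the basic pattern behind "always-on" dashboards.
def rolling_alerts(stream, baseline, window=3, threshold=1.5):
    buf = deque(maxlen=window)   # sliding window of recent events
    alerts = []
    for value in stream:
        buf.append(value)
        avg = sum(buf) / len(buf)
        if avg > baseline * threshold:   # live metric vs. historical norm
            alerts.append(round(avg, 2))
    return alerts

print(rolling_alerts([10, 11, 12, 30, 32, 31], baseline=10))
```

Once the rolling average climbs past 1.5x the historical baseline, every subsequent window is flagged, which is the moment a streaming dashboard would surface the shift to a decision maker.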



Takeaway: Streaming analytics is not just delivering real-time data but also real-time analysis, blending with historical data to inform timely decision-making. Whether for logistics companies, manufacturers, or others seeking operational excellence and cost optimization, streaming visual analytics is imperative. Decision makers are better enabled to act on the most current state of the business when consuming real-time insights via “always-ON” web dashboards and apps.  

3. Native Python Data Functions 

Our feature in focus is a case of “you asked, we answered”: it arrived by popular demand.

With support for writing and managing native Python data functions, Spotfire® tightly integrates visual analytics with advanced analytics. Python has the highest rate of growth among coding languages and is the language of choice for many data scientists, so it was natural for TIBCO to introduce support in Spotfire® for native Python data functions across the Analyst client, the web, and Automation Services. Many Spotfire® customers’ IT leaders channeled the voices of their data science teams with this request, as native support frees those teams from the manual, low-value task of duplicating data across different environments. The result is the same ease and convenience that users of TERR and bundled “R” language packages have enjoyed for many years in Spotfire®, now available for Python.

[*NOTE for users on earlier versions: It was already possible to use Python data functions prior to 10.7 through the data function extension available on the TIBCO Community. However, there’s no longer a need to use that extension as Python data functions are now supported natively right within the environment.]  

Takeaway: One TIBCO partner endorsed this as, “…VERY complex analytics on the fly, without leaving [the] environment!” Writing and managing native Python data functions in Spotfire® consolidates several advanced analytics workflows—typically involving several tools—to cut the high cost of task switching. 
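The general shape of a data function can be sketched simply: named inputs in, named outputs out, with the transformation written in plain Python. In Spotfire’s native Python data functions, table inputs typically arrive as pandas DataFrames; plain lists stand in here so the sketch is self-contained, and `score_data_function` is a hypothetical example, not a Spotfire API.

```python
# Schematic shape of a data function: named inputs in, named outputs out.
# In Spotfire's native Python data functions, table inputs arrive as pandas
# DataFrames; plain lists stand in here to keep the sketch self-contained.

def score_data_function(values, factor):
    """Input: a column of numbers and a scalar; output: a scored column."""
    mean = sum(values) / len(values)
    # Center each value and scale it -- the kind of transform that previously
    # required exporting data to a separate Python environment.
    return [(v - mean) * factor for v in values]

# Spotfire would bind these parameters from the analysis; hard-coded here.
scored = score_data_function([1.0, 2.0, 3.0], factor=10.0)
print(scored)  # [-10.0, 0.0, 10.0]
```

In the real product, the function’s inputs and outputs are declared in the data function editor and bound to columns, tables, or document properties, so the scored column flows straight back into the visualizations.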

4. Powerful Location Analytics Enhancements

Spotfire® location analytics extends far beyond merely plotting markers on a map. It’s truly immersive and automates the process of creating maps with instant geocoding, enriching the context for spatial patterns and relationships within multiple data layers, map services, and GIS data. It unlocks the use of advanced spatial processing techniques leveraging geographic services and data functions for seamless location-based exploration.  

Spotfire®’s latest version now includes all of the following capabilities in map charts:

  • Leverage geo-reference info from GeoTIFFs to position images on maps automatically, at the right location and with the right projection
  • Position images via drag and drop
  • Calculate geo-spatial distance for streaming data
  • Pan and zoom while Auto-Zoom is enabled
  • Benefit from general ease-of-use and productivity improvements
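The geo-spatial distance capability typically reduces to a great-circle calculation; below is a self-contained haversine sketch. The formula is standard, though this is not necessarily Spotfire’s internal implementation.

```python
from math import radians, sin, cos, asin, sqrt

# Great-circle (haversine) distance between two lat/lon points in km --
# the standard formula behind geo-spatial distance on streaming positions.
def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlmb = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlmb / 2) ** 2
    return 2 * radius_km * asin(sqrt(a))

# About 281 km between Berlin and Prague.
print(round(haversine_km(52.52, 13.405, 50.075, 14.437), 1))
```

Applied to a stream of vehicle or asset positions, the same calculation yields distance-travelled or distance-to-target metrics that update as each new coordinate arrives.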

Takeaway: In a highly interactive visual application built upon embedded data science, models are instantly recalculated for newly drawn areas on the fly, accelerating the speed to insight. 

5. More and More Data Connectivity Added

Native self-service connectors support 59 data sources (and counting!) right in the Spotfire® platform. This list is continually growing, with native connectors to popular hybrid cloud databases like Teradata, to applications like SharePoint Online and Salesforce, and to cloud analytics data warehouses such as Google BigQuery and Snowflake that combine data warehousing with your data lake, to name just a few.

Spotfire® users connecting to streaming data can also subscribe to the 100+ streaming sources available through Spotfire® Data Streams, blending real-time analysis with historical data. 

Takeaway: Rich, disparate data is the core of any thorough analysis. TIBCO is always committed to adding new connectors—both native and custom—to Spotfire®, so customers gain an improved, holistic understanding of their business landscapes and markets.  

With these advanced capabilities, innovative organizations can redefine what’s possible with analytics and business intelligence. For a closer look at all of “What’s new in Spotfire®,” explore the recent Dr. Spotfire sessions. 

The TIBCO Blog

Google open-sources LaserTagger, an AI model that speeds up text generation

February 1, 2020   Big Data

Sequence-to-sequence AI models, which were introduced by Google in 2014, map an input sequence (usually text) to an output sequence whose length may differ from the input’s. They’re used in text-generating tasks including summarization, grammatical error correction, and sentence fusion, and recent architectural breakthroughs have made them more capable than before. But they’re imperfect in that they (1) require large amounts of training data to reach acceptable levels of performance and (2) typically generate the output word by word, which makes them inherently slow.

That’s why researchers at Google developed LaserTagger, an open source text-editing model that predicts a sequence of edit operations to transform a source text into a target text. They assert that LaserTagger tackles text generation in a fashion that’s less error-prone — and that’s easier to train and faster to execute.

The release of LaserTagger follows on the heels of notable contributions from Google to the field of natural language processing and understanding. This week, the tech giant took the wraps off of Meena, a neural network with 2.6 billion parameters that can handle multiturn dialogue. And earlier this month, Google published a paper describing Reformer, a model that can process the entirety of novels.

LaserTagger takes advantage of the fact that for many text-generation tasks, there’s often an overlap between the input and the output. For instance, when detecting and fixing grammatical mistakes or when fusing several sentences, most of the input text can remain unchanged — only a small fraction of words need to be modified. LaserTagger, then, produces a sequence of edit operations instead of actual words: keep (which copies a word to the output), delete (which removes a word), and keep-addX or delete-addX (which add the phrase X before the tagged word and, in the delete case, also remove the tagged word).

Added phrases come from a restricted vocabulary that’s been optimized to minimize vocabulary size and maximize the number of training examples. The only words necessary to add to the target text come from the vocabulary alone, preventing the model from adding arbitrary words and mitigating the problem of hallucination (i.e., producing outputs that aren’t supported by the input text).  And LaserTagger can predict edit operations in parallel with high accuracy, enabling an end-to-end speedup compared with models that perform predictions sequentially.
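The edit operations can be sketched as a tiny tag applier: given a source sentence and a per-token tag sequence, reconstruct the target text. This is a schematic of the mechanism described above, not Google’s released LaserTagger code; `apply_tags` and the tag encoding are simplified inventions, and the sentence-fusion example is illustrative.

```python
# Apply a LaserTagger-style tag sequence to a source sentence.
# Tags: "KEEP", "DELETE", or ("KEEP", phrase) / ("DELETE", phrase),
# where the phrase (drawn from a restricted vocabulary) is inserted
# before the tagged token.
def apply_tags(source_tokens, tags):
    out = []
    for token, tag in zip(source_tokens, tags):
        if isinstance(tag, tuple):          # keep-addX / delete-addX
            op, phrase = tag
            out.append(phrase)              # add phrase X before the token
            if op == "KEEP":
                out.append(token)
        elif tag == "KEEP":
            out.append(token)
        # plain "DELETE": emit nothing
    return " ".join(out)

# Sentence fusion: merge two sentences into one.
src = "Turing was born in 1912 . He died in 1954 .".split()
tags = ["KEEP"] * 5 + ["DELETE", ("DELETE", "and"), "KEEP", "KEEP", "KEEP", "KEEP"]
print(apply_tags(src, tags))  # Turing was born in 1912 and died in 1954 .
```

Because each token gets exactly one tag, all tags can be predicted in parallel, which is the source of the speedup over word-by-word generation.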

Evaluated on several text generation tasks, LaserTagger performed “comparably strong” with, and up to 100 times faster than, a baseline model that used a large number of training examples. Even when trained using only a few hundred or a few thousand training examples, it produced “reasonable” results that could be manually edited or curated.

“The advantages of LaserTagger become even more pronounced when applied at large scale, such as improving the formulation of voice answers in some services by reducing the length of the responses and making them less repetitive,” wrote the team. “The high inference speed allows the model to be plugged into an existing technology stack, without adding any noticeable latency on the user side, while the improved data efficiency enables the collection of training data for many languages, thus benefiting users from different language backgrounds.”

Big Data – VentureBeat

Paving the Way for the Next Generation of CX Services

December 20, 2019   CRM News and Info

Nowadays, the quality of customer experience determines the success of a business. In order to deliver superior customer experience, companies should combine human talent with technology, including much-needed digital expertise, to turn their current revenue and growth strategies into customer satisfaction. The companies that truly succeed engage their customers and strengthen their brands.

Technology has changed consumer and business needs and expectations. Companies must be ready to offer customer services anytime, anywhere, across multiple channels and at high speed, using various current and emerging technologies while, importantly, preserving the human touch.

A New Era for Customer Experience

In today’s business environment, a top priority for companies is their digital transformation. Most leaders understand the value of investing in new technologies and systems, particularly in terms of cost and efficiency. However, it is equally important that they focus on improving their business processes, while leveraging the talent of their people and staying consistent with their brand promise, which is reflected in the overall customer experience.

Digital transformation requires the careful implementation of multiple projects and a combination of technologies that will allow companies to compete successfully in the digital era. At a time when customers’ needs, expectations and demands constantly are changing, technological trends such as automation of routine processes, data analytics and cognitive technologies rapidly are shaping the new customer experience.

Two-thirds of the CEOs of the largest 2,000 global companies will shift their focus before the end of 2019 from traditional offline strategies to more modern digital strategies aimed at improving customer experience, according to IDC.

In order to increase client satisfaction and deliver new efficiencies in customer-related business processes, companies must invest in next-generation customer services.

For instance, the combination of online and mobile channels with personalized services is strategic to increasing or retaining customers, while making business processes more efficient and reducing costs.

Therefore, the development of multichannel platforms that effectively combine traditional and digital channels allows enterprises to offer a better customer experience that generates higher value for consumers as well as greater efficiency.

Beyond putting in place multichannel platforms, businesses need to ensure that these platforms not only offer a seamless and consistent experience across channels, but also provide a holistic view of the experience being delivered at any time during the customer lifecycle and across all customer interactions. This “integrated multichannel” characteristic of a multichannel platform is a competitive advantage. The ability to define and implement an integrated multichannel will be an important source of efficiency for a company, and a key differentiator when engaging with its customers.

There are other components within next-generation customer services that are equally important. The automation of routine customer-related processes creates greater effectiveness and allows human agents to devote more of their time to high value-added tasks, while reducing response and resolution times.

Using analytics to capture client information enables companies to manage and develop insights so they can deliver a hyper-personalized customer experience, one that delights the customer and fulfills the brand’s promise.

These are just a few examples of how next-generation customer experience services and capabilities define the way brands communicate with consumers and the way businesses deliver efficiencies.

In order to maximize the opportunities offered by digital transformation, companies need to adopt new approaches and strategies to help them better perform digital tasks that involve data digitalization, simplification, agility and analysis. Equally important is that end-to-end processes addressing the entire customer lifecycle will generate higher value for companies and better experiences for their customers.

Leading the Next Generation of CX

The shift from traditional voice to digital channels has accelerated, boosted by next-generation technologies and value-added capabilities, all of which are shaping the industry’s future. So how can businesses lead in the creation of the next generation of customer experiences?

There are three key areas with significant applications now and in the future, as well as four next-generation capabilities that underpin these applications. The current three are high-value voice (or complex voice services), integrated multichannel, and automated back office. The future four are artificial intelligence and cognitive technologies, analytics, automation, and customer experience consulting.

“By 2022, 72 percent of customer interactions will involve an emerging technology, such as machine-learning applications, chatbots or mobile messaging, up from 11 percent in 2017,” Gartner reported this spring.

Artificial intelligence can be used to deliver sentiment analysis and, most importantly, more humanized and better customer interactions.

  • Enterprises increasingly are spending on areas such as customer analytics to enhance the quality of services delivered, and to ensure client retention. It’s well known that data science can improve business efficiency, but generating value through data also can take the form of a better customer experience.
  • “By year-end 2022, 85 percent of large and very large organizations will have deployed some form of RPA (robotic process automation),” Gartner predicted last year.

    Accelerating the automation of redundant back- and front-office work to improve efficiency and customer experience will serve as a key differentiator for businesses.

  • CX Process Consulting is focused on optimizing customer journeys and business processes to provide a differentiated CX and distinguish a brand. There is a growing number of businesses that are turning to providers to help build or extend a competitive advantage through CX.

    Digital Customer Experience within the Digital Business Consulting Services Market is expected to represent approximately 24 percent of the total market in 2023 globally, Gartner predicted this fall.

The future success of companies in customer relationship management will depend on the powerful combination of the best that technology has to offer with superior talent to help them deliver exceptional customer experiences to differentiate and strengthen their brands.


Carlos López-Abadía is chief executive officer and member of the board at Atento.

CRM Buyer

Solutions Journalism Network Transforms Newsrooms by Educating the Next Generation of Journalists

November 5, 2019   NetSuite

Posted by Barney Beal, Content Director

Headlines focusing on crime, inequality, war and pollution proliferate in news outlets today. Positive responses to those problems that are happening all over the world often get lost in the shuffle.

However, there’s a movement underway to change how news is presented. Solutions Journalism Network, an independent nonprofit organization, is leading that charge.

Founded in 2013 by world-renowned journalists from the New York Times, Solutions Journalism Network advocates for evidence-based reporting linking problems to solutions in newsrooms and journalism schools all over the U.S. The organization provides training to editors, journalists and university professors in an effort to change the way news is delivered to the public.

“We’re affecting the mass of new journalists who are turning out through colleges and universities,” said Maurisse Johnson, Chief Financial Officer of Solutions Journalism Network. “They graduate and go to newsrooms, magazines and media outlets, and produce solutions journalism, thus changing the old ways of newsrooms.”

One very visible example of solutions journalism at work is coverage around voter fraud. An article focused on solutions journalism would mention that voter fraud exists while also showcasing ways global communities work to combat the practice.

Solutions Journalism Network is funded primarily by grants from foundations and donations from corporations and individuals. In the last two years they’ve experienced massive growth: a 40% increase in grant funding in just 24 months.

“Our mission has remained the same throughout, but our administrative burden exploded,” Johnson said. “We received an increase in the number of grants, the average grant size doubled, and we had to increase headcount accordingly.”

Prior to the surge in grants, Solutions Journalism Network used a combination of desktop accounting software and an outside accounting firm to manage their books. But as the organization grew at breakneck speed, so too did the complexity of its finances.

“We were at an inflection point where our organization had matured beyond external management of our books,” Johnson said. “As grants grew in both size and volume, we had a bigger responsibility to provide reporting and answer questions for funders on an ad hoc basis.”

At that time, Johnson undertook a business process review that lasted about six months. He made sure he thoroughly understood the business model, the people and the grant functions, so he could build a new process around all three.

Specific requirements for nonprofit accounting functions aided Johnson in narrowing down platforms for review. His choice of NetSuite was based on a combination of pre-built reports customized for nonprofits, the ease of closing out sub-ledgers, straightforward allocation tracking, and a pre-established trust in NetSuite based on years of experience.

“I have a history with Oracle that’s built on trust,” Johnson said. “I know that Oracle is going to put their efforts towards the development of NetSuite and offer me a robust support team. If something goes awry, Oracle will invest resources in it.”

Solutions Journalism Network went through a four-month implementation that included on-site builds – a huge benefit for Johnson and his team. NetSuite went live in January of 2019, including a direct connection to Solutions Journalism Network’s expense management platform.

Roughly one-third of their 40+ employees use it in some capacity, with plans to increase that percentage in the coming years.

“Dashboard reporting for nonprofits is already configured in NetSuite,” Johnson said. “We didn’t have to spend time building it. We don’t have to close out individual ledgers every month. And the allocation tracking in NetSuite is quite sophisticated. Grants dictate salary coverage and the allocations change every year. Being able to articulate this in a financial system (rather than manually) is huge.”

With the launch of NetSuite, Solutions Journalism Network has experienced time savings, increased confidence, and simplification across the organization. Today, Johnson can close the books in less than an hour. Auditors, board members and other financial professionals immediately have a level of trust when they learn that Solutions Journalism Network is using NetSuite. And given the ease-of-use, Johnson is able to leverage the data in NetSuite for interactive, organization-wide conversations.

“With NetSuite, we can share data across our organization with an ease we’ve never experienced before,” Johnson said. “Previously, we had static copies of accounting data that we’d send to stakeholders. Now, we can have a web meeting with a program manager and go through their P&L, asking meaningful questions we didn’t have access to before.”

The flexibility of financial reporting provides a big bonus as well. Before NetSuite, Johnson had to manually run the same report query across 50 different grants, a significant burden for an organization with limited or no IT staff. Now he can simply apply a grant-specific filter in NetSuite to create each report. And individual users can access reporting on their phones via an app, making it more easily digestible and accessible for all.

For the future, Johnson plans to work towards integrating Solutions Journalism Network’s customer relationship management (CRM) platform and payroll providers. His goals are to connect initial grant conversations in the sales cycle all the way to incoming grant funds, and tie payroll directly to financial data as well.

“We want to be able to answer questions like, how many foundations are we engaged with for over six months, what percentage produce dollars after six months, and what’s the churn rate,” Johnson said. “NetSuite is so robust in terms of creating KPIs, and we expect these to be viable metrics once we make the connection in the next year.”

Learn how other nonprofits like Solutions Journalism Network streamlined business processes using NetSuite.

Posted on Mon, November 4, 2019 by NetSuite

The NetSuite Blog

From Maps To Apps: Why We Need A New Generation Of ERP

September 18, 2019   SAP

Those of us of a certain age remember AAA paper road maps and the hassle of unfolding and refolding them after every use, or following directions that referred to landmarks like local pubs and corner stores. Perhaps, if you are younger, the first GPS systems were your “map” of choice.

At the time, those electronic mapping devices were considered amazing innovations that showed your location so you could reach your final destination without getting lost. They kept track of where you were and where you had been. Most impressively, they kept up with you, turn for turn, reprocessing your position to find a new way home.

No sooner did people think maps couldn’t get better than a range of apps proved them wrong. Waze connects to vehicles continuously to display, in real time, where traffic is building, whether there is a traffic jam or an accident, and which route is best to take. It is undeniably fast, constantly analyzing live traffic and historical time-of-day patterns.
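The rerouting behavior described above is, at its core, a shortest-path computation rerun whenever edge weights (travel times) change. A minimal sketch using Dijkstra's algorithm, with made-up road segments:

```python
import heapq

def shortest_time(graph, start, goal):
    """Dijkstra's algorithm over travel-time edge weights (in minutes)."""
    dist = {start: 0}
    heap = [(0, start)]
    while heap:
        d, node = heapq.heappop(heap)
        if node == goal:
            return d
        if d > dist.get(node, float("inf")):
            continue  # stale heap entry
        for nbr, w in graph.get(node, {}).items():
            nd = d + w
            if nd < dist.get(nbr, float("inf")):
                dist[nbr] = nd
                heapq.heappush(heap, (nd, nbr))
    return float("inf")

# Normal conditions: A -> B -> D is fastest (5 + 5 = 10 minutes).
roads = {"A": {"B": 5, "C": 8}, "B": {"D": 5}, "C": {"D": 4}}
before = shortest_time(roads, "A", "D")

# An accident triples the B -> D travel time; rerunning reroutes via C.
roads["B"]["D"] = 15
after = shortest_time(roads, "A", "D")
```

Real navigation apps layer live telemetry and historical patterns onto this basic idea, but the reroute itself is just a recomputation over updated weights.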

The power of new architectures

The beauty behind the evolution of maps is the use of a different architecture. No longer static, poster-sized papers or books, maps now actively listen, understand, and intelligently recommend alternatives. Behind this transformation are fast connectivity, deep geospatial and predictive analytics that constantly refine patterns, and artificial intelligence and machine learning that detect patterns and predict behavior.

The same is true for ERP systems. The transactions and movements they enable are just one piece of a much larger puzzle; their real speed and intelligence come from the new architecture. It is invisible, but its power is enormous and game-changing. And before we know it, ERP systems will change again to help us live better, become better stewards of our planet, and navigate life in new ways.

Like next-generation navigation systems, ERP systems are becoming highly collaborative in ways that are beyond “data traffic.” What if you knew how the fire in the Amazon could impact products made from wood and paper? Shouldn’t you predict the resulting changes on everything from supply chain costs and raw material availability to pricing strategies and on-time fulfillment of consumer demand? Better yet, how will finding alternative materials help you avoid losing sales and increasing prices?

Large companies have long seen value from this type of intelligence. They devote entire departments and significant resources to gain an edge by monitoring patterns. Now small and midsize businesses can tap into this same collaborative, connected, and cognitive experience with their ERP systems.

A better world and a competitive edge

For over a decade, traffic management using intelligent technology has improved how we plan our journeys to conserve fuel, minimize travel time, and reduce repair costs. And this is on top of the potential freedom people will experience as autonomous, navigation-enabled vehicles become commonplace.

A similar argument can be made for the evolution of ERP systems. Built on an intelligent, deeply predictive, and accurate in-memory architecture, the technology will enable people, businesses, and industries to adapt quickly, focus on the next innovation, and reach their ultimate destination on the smoothest path available.

SAP S/4HANA is built on an intelligent in-memory platform that takes it beyond the recordkeeping systems of yesterday’s ERP to a new and exciting world. Learn more by viewing the infographic, “Navigating the 7 C’s of Supply Chain Automation.” 


Digitalist Magazine


Millennials may be the last generation to know so little about their health

February 4, 2019   Big Data

One of the problems of the Second Artificial Intelligence (AI) Winter (1987-1993) was that there was not enough data to go around. We did not yet understand the value of “Big Data,” and academia was working on models fueled by “Small Data.” However, now we have entered the era of “Bigger Data Than We Ever Imagined.” We are producing data on the order of exabytes. It is predicted that by 2020 we will use zettabytes. By 2024, yottabytes are in sight.

Hyperscalers are frantically expanding their clouds to meet the demands of the 21st century, and the whole planet will become one giant hard disk attached to, hopefully, a capable calculator. Exascalers, on the other hand, are concerned with “Fast Data”, the application of big data analytics to smaller data sets in near-real-time. They are racing beyond the petaflop to reach exaflop speed by 2021.

Both Big Data and Fast Data have already drastically reconfigured the landscape of the Financial Times Global companies by market cap.

In 2009, there were no information technology companies in the FT Global 500 top five. A mere nine years later, all five of the top companies were information technology companies, which had taken over from big energy. Most importantly, all of these companies are in the process of pivoting towards AI. One of them is Tencent, the first Chinese information technology company to enter the top five (admittedly, only for Q1; the company lost its position after its Q2 earnings call).

However, a new data inflection point is coming, driven by pressing needs in healthcare. The 2018 FT Global top five have undoubtedly asked themselves two questions: which industries produce the most data, and which need the most speed? They all arrived at the same conclusion: medical and life sciences.

Why?

Because in medicine and life sciences, there are power laws at work that dwarf Moore’s Law. In the USA, health data doubles every 73 days, estimated to arrive at a hefty 2.3 zettabytes by 2020. This is a hockey stick scenario that the information technology companies cannot afford to miss. The 3 Super-A’s (Amazon, Alphabet, and Apple) all announced a ground-breaking restructuring of their divisions and far-reaching collaborations (ABC-Amazon, Berkshire, Chase) to jump on the “big and fast data” bandwagon. This could produce surprising ripples in which companies hit the top of the FT’s Global 500 list in the near future.
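The arithmetic behind that claim is worth spelling out: a 73-day doubling period compounds to a 32x annual growth factor, compared with roughly 1.4x per year for a Moore's-Law-style two-year doubling:

```python
# Health data doubling every 73 days means five doublings per year
# (365 / 73 = 5), i.e. a 32x annual growth factor.
doublings_per_year = 365 / 73
annual_factor = 2 ** doublings_per_year

# By comparison, a two-year doubling (a common reading of Moore's Law)
# compounds to only about 1.41x per year.
moore_factor = 2 ** (365 / 730)
```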

By 2020, some of the information technology companies won’t be information technology anymore but Life Technology companies, pointing their search and intelligence engines towards healthcare and life science. This poses a legitimate challenge to the current biopharmaceutical and healthcare companies with their dwindling intellectual property assets and lackluster efficiency of discovery. Interestingly, by Q3 of 2018, Facebook lost its spot to Berkshire Hathaway, a well-known investment company focused on reinsurance of the healthcare and life insurance industry. It is entirely possible that, by 2025, most of these companies that merge technology and healthcare could be mandated by antitrust laws to break up into information technology and life technology corporations.

However, the confluence of big and fast will also turn the tables and disrupt who owns what. Today, big and fast data lies in the hands of the privileged few, many in the top 5 or top 10 of the FT Global. However, in 2025, we will be living in a drastically different world from the one we live in today, and individuals will own their own information, in healthcare or otherwise. The first generation that will thoroughly enjoy their personal privileges of Life Data will be Generation Beta (2025 to 2039). These will be the children of Generation Z (with Z we will have exhausted the Latin alphabet and returned to the Greek).

All information, coupled with accessible calculators, will belong to its rightful owner, who will be free to do with it whatever they want: move it, sell it, share it, donate it, add to it, maintain it, remix it, delete it, and create it again. New personal devices will be designed to collect, store, encrypt, and manage personal and structured medical data on an opt-in basis. As a result, users themselves will become “the edge” and consider themselves nodes. All medical data will be sequenced by then. Hence, all medical information will be encrypted and governed by a “smart contract” (a contract expressed as computer code) that will stipulate when and by whom it can be accessed.
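The “smart contract” idea, access rules evaluated as code, can be sketched as follows. This is a toy illustration, not any real blockchain or health-data standard; the class and its fields are invented for the example:

```python
from datetime import date

# Toy "smart contract": an access policy expressed as code.
# The policy shape (owner, allow-list, start date) is illustrative only.
class DataContract:
    def __init__(self, owner, allowed_parties, not_before):
        self.owner = owner
        self.allowed_parties = set(allowed_parties)
        self.not_before = not_before

    def may_access(self, party, on_date):
        """The owner always has access; listed parties only after the start date."""
        if party == self.owner:
            return True
        return party in self.allowed_parties and on_date >= self.not_before

contract = DataContract("patient-1", {"trial-recruiter-A"}, date(2025, 1, 1))
```

A real implementation would add encryption, auditing, and revocation, but the core idea, "when and by whom" stipulated in executable form, is already visible here.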

New economies will be born, and medical information will become an alternative financial asset, with individuals receiving dividends on their data and due monetary benefit for participation in massive clinical trials and prospective health studies. Individuals will not be passive donors of their information but will be incentivized to be more engaged participants in research, potentially in a planetary medical data project.

Medicine will finally break out of its silos. Individuals will have the choice between “portfolio managers” who will represent their clients’ data to potential patient recruiters in return for a real-time dividend. Individuals will be kept ubiquitously updated with research on their condition. Algorithms, fast data, will continuously monitor new and upcoming trials tailored for the individual, consisting of cohorts with common diagnostics, genetics, environment, family history, and predicted clinical trajectories. The confluence of this data will finally make research understandable to individuals, and elusive etiologies less obtuse and more accessible.
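The cohort-matching step described above is, in essence, a filter over trial criteria. A hypothetical sketch (the trial records and attribute names are invented):

```python
# Hypothetical trial registry: each trial declares simple eligibility criteria.
trials = [
    {"id": "T-1", "diagnosis": "T2D", "min_age": 40, "max_age": 65},
    {"id": "T-2", "diagnosis": "T2D", "min_age": 18, "max_age": 39},
    {"id": "T-3", "diagnosis": "CAD", "min_age": 40, "max_age": 80},
]

def matching_trials(person, trials):
    """Return the IDs of trials whose diagnosis and age window fit the person."""
    return [
        t["id"] for t in trials
        if t["diagnosis"] == person["diagnosis"]
        and t["min_age"] <= person["age"] <= t["max_age"]
    ]

hits = matching_trials({"diagnosis": "T2D", "age": 52}, trials)
```

Real matching would weigh genetics, environment, and predicted trajectories, but each added criterion is just another predicate in the filter.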

Gen B will be born in an always-on world with quantum computing algorithms and their own AIs that harness the advances of big and fast data. Everything around them will be autonomous and smart. They will have edited genomes that will make them no longer our descendants but ancestors in their own right, ready to become the first space-faring civilization.

Walter De Brouwer is cofounder and CEO of doc.ai. He is a Belgian-born technology entrepreneur with over 27 years of experience, a Fellow of the Royal Society of Arts, and served as President of RSA Europe from 2006 to 2008. He is a member of TED, curator of TEDxBrussels, and was a distinguished lecturer at the National Science Foundation in 2013. His article, “How the People Are Taking Over the World,” was among Techonomy’s Most-Read Articles of 2014 and was cited by its editors as “perhaps the most philosophical of Techonomy’s top articles” that year. Prior to doc.ai, he was Founder and CEO of Scanadu Inc., a $57 million venture-backed mobile health company.


Big Data – VentureBeat


Adding Value to Microsoft Dynamics 365 with Document Generation

January 26, 2019   Microsoft Dynamics CRM

Automation has always been a popular concept. Once a staple of science fiction lore, it has now become part of modern daily life. One example is office automation, which began with basic document management. Not long ago, everyone used typewriters and filing cabinets to manage office documents, until PCs emerged in the 1980s. Microsoft Word eventually became the norm, followed by landmark releases such as Microsoft Office in 1990 and Office 2003, and eventually by more sophisticated present-day technology like Dynamics 365.

The Case for Document Generation

There’s no question that a CRM can help your company with sales, marketing, customer service, and more, but what about document generation? This is where Xpertdoc comes into play. When integrated with your CRM, Xpertdoc fully automates the processes for the generation, management, storage, and delivery of business documents. Not only can you design document templates, but business users can also visually model and deploy document workflows that leverage data from any available source (like Dynamics 365), addressing the most complex document scenarios while eliminating the need for technical knowledge or coding skills.

Generating documents in 4 easy steps


The very first step of the document generation process is connecting your data source, such as Microsoft Dynamics 365. Once that is done, you’ll need to create a data set. Not to worry, it’s not as technical as it sounds: a data set is just a collection of business data from a database. Xpertdoc Smart Flows for Dynamics 365 features an updated Data Set Builder with a modern user interface and easy navigation, so no programming skills are required.

Now that you’ve got your data set, it’s time to design your template using our Microsoft Word-based Template Designer (check out a previous blog post on best practices for template design). The ability to combine data sets from multiple sources into a single template allows you to tie up loose ends in one fell swoop. You can create customized templates for different regions, which your sales force can access via a dashboard. With a customized template, you can auto-fill information for repeat customers, which saves time and creates a more efficient customer experience (CX). Once your template is in place, you’re all set to run a document flow.
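The four steps (connect a source, build a data set, design a template, run a flow) can be sketched generically. None of this is Xpertdoc's actual API; it is a minimal illustration using Python's standard `string.Template`:

```python
from string import Template

# 1. "Connect" a data source (here, just an in-memory CRM record; the
#    field names are invented for the example).
crm_record = {"name": "Acme Corp", "region": "EMEA", "renewal": "2025-03-01"}

# 2. Build a data set: select and shape the fields the document needs.
data_set = {"customer": crm_record["name"], "region": crm_record["region"]}

# 3. Design a template with merge fields.
template = Template("Dear $customer, your $region renewal notice is enclosed.")

# 4. Run the "flow": merge the data set into the template.
document = template.substitute(data_set)
```

A production tool adds workflow steps (approval, storage, delivery) around step 4, but the data-to-template merge is the heart of document generation.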


There are more wide-reaching benefits of a document generation system, such as content search capabilities, e-signature support and more. You can archive templates, creating a personal library of enterprise information. The Single Sign-on authentication feature of Xpertdoc Smart Flows eliminates the need to remember passwords, providing a fast and easy connection. And speaking of connecting, you can enjoy flexible deployment options, either onsite or in a public or private cloud.

A modern document generation tool like Xpertdoc provides more automated capabilities, and a greater competitive business advantage, than old-fashioned document management.


To learn more about Xpertdoc Smart Flows for Microsoft Dynamics 365, request a free 30-day trial and follow us on Twitter: @xpertdoc or visit www.xpertdoc.com for more information.


CRM Software Blog | Dynamics 365


A Better, Faster Alternative to SSRS Reports with Xpertdoc Document Generation

December 21, 2018   CRM News and Info

Many business users struggle with Microsoft SQL Server Reporting Services (SSRS), a server-based report-generation system that has been around for a while. In this age of digital transformation and automated report generation, it can make simple reporting tasks much harder and more complicated than they need to be, partly because it is meant for developers. But there is a much more user-friendly, modern alternative: enter Xpertdoc Smart Flows.

What are Xpertdoc Smart Flows?

Xpertdoc Smart Flows for Microsoft Dynamics 365 are easy-to-configure, automated processes for the generation, management, storage and delivery of business documents. With the embedded Smart Flow Builder, business users can visually model and deploy flows that leverage data from any available source, addressing the most complex document scenarios while eliminating the need for technical knowledge or coding skills.

So how do Smart Flows for Dynamics 365 zap the hassle of dealing with SSRS reports?

  1. Anyone can use Xpertdoc Smart Flows: Who likes writing SQL subqueries? Not me! And not you, either, unless you are a developer and do that for your job. Xpertdoc Smart Flows for Dynamics 365 are for business users. You don’t have to know SQL, or any code, for that matter. Instead, you start with the intuitive platform, create your data set, build your template using drag-and-drop functionality, and start your document flow.
  2. Xpertdoc Smart Flows are user-friendly and convenient: Our Smart Flows for Dynamics 365 offer dashboard-style reporting, so you always know where you are at a glance. SSRS has no such thing, as you are expected to have developer-level skills and interpret code for that information. Xpertdoc Smart Flows’ user interface is both web and mobile-ready. SSRS requires more work to run on mobile devices, as the capability isn’t built-in. Specifically, you would have to implement SQL Server Mobile Publisher. This is not for the average bear.
  3. Xpertdoc Smart Flows can combine multiple data sources: SSRS has limitations aggregating and sorting data, which makes getting the data into the documents for report generation problematic. Xpertdoc Smart Flows for Microsoft Dynamics 365, by contrast, can combine multiple data sources, so data can be moved from point A to point B. This functionality is useful when creating document flows, because it ensures data from different sources are routed into the automated document reports without a hitch.
  4. Xpertdoc Smart Flows don’t have upgrade issues: Despite an update in 2016, SSRS is hard to upgrade. So hard that when the subject comes up, most folks start looking around for other options. In addition, the interface, with its outdated visualizations, is probably nearing the end of the line, too; 2016 wasn’t yesterday. Our Smart Flows are new and connected to Microsoft Dynamics 365, so upgrade issues are manageable if they come up at all.
  5. Xpertdoc Smart Flows fit in your system: SSRS can be a space hog, or “resource intensive”, meaning it can use up a lot of server resources. Xpertdoc Smart Flows for Dynamics 365 cause no system strain.
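Point 3, combining multiple data sources, amounts to joining records on a shared key before they reach the template. A generic sketch with invented field names:

```python
# Two hypothetical sources keyed by customer ID: CRM data and billing data.
crm_rows = {101: {"name": "Acme Corp", "owner": "J. Doe"}}
billing_rows = {101: {"balance": 1250.0}}

def merge_sources(crm, billing):
    """Join records that share a key, so one report can draw on both sources."""
    return {key: {**crm[key], **billing.get(key, {})} for key in crm}

combined = merge_sources(crm_rows, billing_rows)
```

Once merged, the combined record flows into a single document template, which is what routes "data from different sources into the automated document reports" in one pass.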

With their ease of use and powerful modern capabilities, our Smart Flows for Microsoft Dynamics 365 are definitely the better alternative for any business user.

To learn more about Xpertdoc Smart Flows for Microsoft Dynamics 365, request a free 30-day trial and follow us on Twitter: @xpertdoc or visit www.xpertdoc.com for more information.


CRM Software Blog | Dynamics 365
