
Tag Archives: Systems

Researchers propose using the game Overcooked to benchmark collaborative AI systems

January 15, 2021   Big Data

Deep reinforcement learning systems are among the most capable in AI, particularly in the robotics domain. However, in the real world, these systems encounter a number of situations and behaviors to which they weren’t exposed during development.

In a step toward systems that can collaborate with humans in order to help them accomplish their goals, researchers at Microsoft, the University of California, Berkeley, and the University of Nottingham developed a methodology for applying a testing paradigm to human-AI collaboration that can be demonstrated in a simplified version of the game Overcooked. Players in Overcooked control a number of chefs in kitchens filled with obstacles and hazards to prepare meals to order under a time limit.

The team asserts that Overcooked, while not necessarily designed with robustness benchmarking in mind, can successfully test potential edge cases in states a system should be able to handle, as well as the partners the system should be able to play with. For example, in Overcooked, systems must contend with scenarios like plates accidentally left on counters and a partner who stays put for a while because they're thinking or away from their keyboard.


Above: Screen captures from the researchers’ test environment.

The researchers investigated a number of techniques for improving system robustness, including training a system with a diverse population of other collaborative systems. Over the course of experiments in Overcooked, they observed whether several test systems could recognize when to get out of the way (like when a partner was carrying an ingredient) and when to pick up and deliver orders after a partner has been idling for a while.
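The paper's edge-case checks behave like unit tests for a policy. The sketch below is a minimal, hypothetical illustration of that idea; the `State` fields, the agent interface, and the idle threshold are assumptions for illustration, not the researchers' actual test suite:

```python
from dataclasses import dataclass

@dataclass
class State:
    partner_idle_steps: int   # how long the partner has been stationary
    plate_on_counter: bool    # a plate was accidentally left on a counter
    partner_carrying: bool    # partner is carrying an ingredient

class ScriptedAgent:
    """Toy agent with hand-written responses to the edge cases."""
    def act(self, state: State) -> str:
        if state.partner_carrying:
            return "move_aside"        # get out of the partner's way
        if state.partner_idle_steps > 10:
            return "deliver_order"     # take over while the partner is idle
        if state.plate_on_counter:
            return "pick_up_plate"
        return "continue_task"

def run_unit_tests(agent) -> float:
    """Return the fraction of edge-case tests the agent passes."""
    cases = [
        (State(0, False, True), "move_aside"),
        (State(25, False, False), "deliver_order"),
        (State(0, True, False), "pick_up_plate"),
    ]
    passed = sum(agent.act(s) == expected for s, expected in cases)
    return passed / len(cases)
```

A learned deep reinforcement learning policy would replace `ScriptedAgent` in practice; the point is that pass rate on hand-picked edge cases gives information that average validation reward does not.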

According to the researchers, current deep reinforcement learning agents aren’t very robust — at least not as measured by Overcooked. None of the systems they tested scored above 65% in the video game, suggesting, the researchers say, that Overcooked can serve as a useful human-AI collaboration metric in the future.


“We emphasize that our primary finding is that our [Overcooked] test suite provides information that may not be available by simply considering validation reward, and our conclusions for specific techniques are more preliminary,” the researchers wrote in a paper describing their work. “A natural extension of our work is to expand the use of unit tests to other domains besides human-AI collaboration … An alternative direction for future work is to explore meta learning, in order to train the agent to adapt online to the specific human partner it is playing with. This could lead to significant gains, especially on agent robustness with memory.”


Big Data – VentureBeat


Teradata Named a Cloud Database Management Leader in the 2020 Gartner Magic Quadrant for Cloud Database Management Systems

January 10, 2021   BI News and Info

Teradata Vantage Also Ranked Highest in Three Out of Four Use Cases in the 2020 Gartner Critical Capabilities for Cloud Database Management Systems for Analytical Use Cases

Teradata (NYSE: TDC), the cloud data analytics platform company, today announced it has been recognized as a Leader in the 2020 Gartner Magic Quadrant for Cloud Database Management Systems, issued November 23, 2020 by analysts Donald Feinberg, Merv Adrian, Rick Greenwald, Adam Ronthal and Henry Cook. Gartner evaluates vendors placed in the Magic Quadrant for Cloud Database Management Systems on completeness of vision and ability to execute following detailed research.

In tandem, Teradata Vantage – the company’s hybrid multi-cloud data analytics software platform – was also recognized with the highest scores in three out of four use cases in the 2020 Gartner Critical Capabilities for Cloud Database Management Systems for Analytical Use Cases, issued November 24, 2020 by analysts Henry Cook, Donald Feinberg, Merv Adrian, Rick Greenwald, and Adam Ronthal. Among the 16 vendors evaluated, Teradata ranked highest in three out of four analytical use cases – Traditional Data Warehouse, Logical Data Warehouse, and Data Science Exploration/Deep Learning – and ranked second in Operational Intelligence.

“Teradata is committed to providing the best enterprise data analytics in the cloud – period. This means offering our customers a modern data analytics platform that can handle the large and complex workloads that Teradata is known for, and flexible deployment options that don’t limit choice or lock them in,” said Steve McMillan, CEO, Teradata. “We’re the only data warehouse and analytics provider with hybrid multi-cloud offerings across the top three public cloud vendors, providing the same software, features, and experience regardless of environment. This recognition from Gartner validates the strength of our cloud position and further underscores the commitment to our customers to meet them wherever they are on their cloud journey.”

Teradata’s unique hybrid and multi-cloud offerings have become critical differentiators in the Cloud Database Management Systems market. According to Gartner, “The capability to work across intercloud, multicloud and hybrid is increasingly important. This will likely become a prerequisite for these systems.”

Gartner defines a Cloud Database Management System (DBMS) as a fully provider-managed public or private cloud software system that manages data in cloud storage. These management systems include specific optimization strategies designed to support traditional transactions and/or analytical processing, covering one or more of the following seven use cases: Traditional and Augmented Transaction Processing; Traditional and Logical Data Warehouse; Data Science Exploration/Deep Learning; Operational Intelligence; and Stream/Event Processing. Data is stored in a cloud storage tier (e.g., cloud object store, HDFS, or other proprietary cloud storage infrastructure) and may use multiple data models: relational, non-relational (document, key-value, wide-column, graph), geospatial, time series, and others.

Teradata Vantage is the leading hybrid multi-cloud data analytics software platform that enables ecosystem simplification by unifying analytics, data lakes and data warehouses. With Vantage, enterprise-scale companies can eliminate silos and cost-effectively query all their data, all the time, regardless of where the data resides – in the cloud, on multiple clouds, on-premises or any combination thereof – to get a complete view of their business. And now, Teradata is the only data warehouse and analytics provider with hybrid multi-cloud offerings across the top three public cloud vendors, offering maximum flexibility and choice to its enterprise customers.

To learn more, visit here.

Gartner Disclaimer
Gartner does not endorse any vendor, product or service depicted in its research publications, and does not advise technology users to select only those vendors with the highest ratings or other designation. Gartner research publications consist of the opinions of Gartner’s research organization and should not be construed as statements of fact. Gartner disclaims all warranties, expressed or implied, with respect to this research, including any warranties of merchantability or fitness for a particular purpose.


Teradata United States


How Cloud Systems Can Help Re-Invent Insurance Brands

August 2, 2020   TIBCO Spotfire


This article was originally published on Insurance-Edge.Net on June 11, 2020.

With global Insurtech revenue expected to top $10 billion by 2025, traditional insurers must start examining ways of modernising their systems to remain relevant for today’s digital-savvy customers. This is no small feat given the complexity of the technology environments of financial services companies, and the reliance of incumbents on monolithic architectures that have primarily remained on-premises despite the appeal of cloud-based solutions.

Legacy Limitations

Complicating a potential migration to more digital-centric environments is the fact that many on-site services integrate with third-party offerings, such as a government’s public vehicle register. Any development or change to the environment, no matter how small, requires significant effort. Insurers are also limited in scaling their solutions horizontally (by adding more machines) or vertically (by adding more computational power or memory to existing machines).

Any additional load will, therefore, require significant investment in more physical and people resources to support it. To continue operating effectively, an organisation’s resources must be allocated around the clock. But this is where the challenge lies; by having to dedicate so much attention to maintenance due to the rigidity and size of existing legacy architecture, an insurer is left with precious few spare resources to focus on and drive innovation.

Typical disadvantages of such a traditionally focused environment include the following:

As engagement with customers increases across insurance touchpoints – think websites, call centres, chatbots, and the like – traditional systems cannot provide the performance improvements required to operate effectively, making the systems difficult to scale.

The inability to fully embrace a multi-channel environment inadvertently places existing systems under strain and requires significant implementation efforts to review these monolithic architectures. As the systems are inflexible, there is limited possibility of allowing for a hybrid approach that connects on-premises sources with cloud-native ones.

Traditional environments have no native support for containers (a standard unit of software that packages code and its dependencies), so container benefits such as faster, more reliable application delivery and dynamic scaling cannot be achieved.

With no native support for DevOps practices, which advocate an agile relationship between development and IT operations, teams aren’t able to act or perform cohesively. There is no possibility of fast prototyping, resulting in a slow time-to-market.

Making the Move

However, by moving to a cloud-native architecture, an insurer opens itself up to a myriad of new possibilities. These born-in-the-cloud architectures are designed to maximise the benefits of a distributed platform. This means that an insurer can focus more on its strategic business objectives and less on maintaining legacy systems.


However, it is not recommended to “rip and replace” an entire legacy system or its applications. When it comes to cloud adoption, insurers should focus on quick wins, delivered in quick sprints. Typically, this approach is more successful than one that seeks to drive massive change over a more extended period with a big-bang approach.

Fortunately, the capabilities of Platform-as-a-Service (PaaS), Infrastructure-as-a-Service (IaaS), and Software-as-a-Service (SaaS) providers mean that it is possible for an insurer to incrementally move new business services to the cloud and still keep its on-premises services running. It also opens the possibility of creating hybrid architectures that enable innovation to be introduced without disrupting the current business environment.

An insurer can, therefore, reinvent itself at its own pace while still being able to deliver an improved customer experience. This is critical when aiming to create differentiation over competitor Insurtechs, who already service customers using the channels they prefer. However, an insurer must leverage its reputation and experience in the industry and couple it with this enhanced customer experience to create a stronger position for itself in the market.

Key Considerations

An insurer should consider commencing a cloud journey by integrating its internal services with an API-led approach. This provides the foundation to develop applications that can be customised more effectively to deliver a faster return than more traditional ones.

New developments should be moved into the cloud by creating microservices that can easily scale with the new load they are expected to handle. For example, a web-based Integration Platform-as-a-Service (iPaaS) development tool speeds up the integration of on-premises services and third-party cloud-native services. And since SaaS providers require extensive integration, a cloud-native integration platform (iPaaS) delivers the much-needed agility.
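The incremental, API-led approach can be pictured as a thin routing facade: endpoints move to cloud microservices one at a time while the on-premises system keeps handling the rest. The sketch below is illustrative only; the endpoint and function names are assumptions, not a TIBCO product API:

```python
def onprem_service(request: str) -> dict:
    # stands in for a call into the legacy on-premises system
    return {"request": request, "handled_by": "on-premises"}

def cloud_microservice(request: str) -> dict:
    # stands in for a call to a new cloud-native microservice
    return {"request": request, "handled_by": "cloud"}

# Endpoints migrate incrementally: flip one route at a time.
ROUTES = {
    "quotes": cloud_microservice,  # already migrated
    "claims": onprem_service,      # still on-premises
}

def handle(endpoint: str, request: str) -> dict:
    """Facade: callers see one API regardless of where the work runs."""
    return ROUTES[endpoint](request)
```

Because callers only see the facade, each route can be flipped (and rolled back) without disrupting the rest of the business environment.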

Some of the advantages this provides include a simplified development approach, unlimited scalability through microservices, a higher resource utilisation thanks to IaaS and PaaS optimisation, and the availability of more dynamic reporting.

A hybrid architecture can also deliver several benefits to the insurer:
  • Overall performance increase
  • Agile, better-performing front ends
  • Unlimited horizontal, load-dependent scaling
  • Multi-channel, API-first and API-led integration environment
  • Frictionless mobile device support
  • Integration with legacy databases
  • Integration with a PaaS data warehouse
  • Flexible deployment
  • On-premises or in-cloud service integration
  • Support for containers (Kubernetes or OpenShift)
  • DevOps support
  • Fast prototyping and time-to-market

The Future is Now

By overhauling their core systems and infrastructure, insurers become better equipped for the requirements and expectations of a digital world. For instance, implementing personal voice assistant capabilities like Amazon Alexa or Google Home, and even leveraging artificial intelligence capabilities to deliver more customised customer solutions, all become not just a possibility, but a reality too.

The digitalisation journey will only accelerate as more businesses and industries wake up to the potential of the cloud. It is happening right now due to the surge of remote workers and the inability of salespeople to pop out and walk customers through their insurance portfolio changes. Even though insurers do face challenges in migrating to such a dynamic new environment, it is critical to begin as quickly as possible. The significant cost savings, customer advantages, and flexibility to scale according to business requirements make for too compelling an argument to ignore.


The TIBCO Blog


New benchmark measures gender bias in speech translation systems

June 12, 2020   Big Data

A preprint paper published by University of Trento researchers proposes a benchmark — MuST-SHE — to evaluate whether speech translation systems fed textual data are constrained by the fact that sentences sometimes omit gender identity clues. The coauthors assert that these systems can and do exhibit gender bias, and that signals beyond text (like audio) provide contextual clues that might reduce this bias.

In machine translation, gender bias is at least partially attributable to the differences in how languages express female and male gender. Those with a grammatical system of gender, such as Romance languages, rely on a copious set of inflection and gender agreement devices that apply to individual parts of speech. That’s untrue of English, for instance, which is a “natural gender” language — it reflects distinction of sex only via pronouns, inherently gendered words (e.g., “boy,” “girl”), and marked nouns (“actor,” “actress”).

AI translation systems that fail to pick up on the nuances threaten to perpetuate under- or misrepresentation of demographic groups. Motivated by this, the researchers created MuST-SHE, a multilingual test set designed to uncover gender bias in speech translation.

MuST-SHE is a subset of TED talks comprising roughly 1,000 audio recordings, transcripts, and translations in English-French and English-Italian pairs from the open source MuST-C corpus, annotated with qualitatively differentiated and balanced gender-related phenomena. It’s subdivided into two categories:


  • Category 1: Samples where the necessary information to disambiguate gender can be recovered from the audio signal, when gender agreement depends only on the speaker’s gender.
  • Category 2: Samples where the disambiguating information can be recovered from the utterance content, where contextual hints such as gender-exclusive words (“mom”), pronouns (“she,” “his”), and proper nouns (“Paul”) inform about gender.

For each reference in the corpus, the researchers created a “wrong” one identical to the original except for the morphological signals conveying gender agreement. The result was a new set of references that are “wrong” compared with the correct ones in regard to the formal expression of gender, the idea being that the difference can be used to measure a speech translation system’s ability to handle gender phenomena.
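The correct/“wrong” reference pairing can be sketched as a scoring function. This is a simplified illustration of the idea with a made-up English-French example, not the authors' actual evaluation code:

```python
def gender_accuracy(output: str, correct: str, wrong: str) -> float:
    """Fraction of gender-marked positions where the system output matches
    the correct reference rather than the gender-swapped one."""
    out, cor, wrg = output.split(), correct.split(), wrong.split()
    # positions where the two references differ are the gender-marked ones
    marked = [i for i, (c, w) in enumerate(zip(cor, wrg)) if c != w]
    if not marked:
        return 0.0
    hits = sum(1 for i in marked if i < len(out) and out[i] == cor[i])
    return hits / len(marked)

# Feminine speaker saying "I am tired": correct French agreement vs. swapped
correct = "je suis fatiguée"
wrong = "je suis fatigué"
print(gender_accuracy("je suis fatigué", correct, wrong))
```

A system biased toward masculine forms scores 0.0 on this sample, while one that translates the gender agreement correctly scores 1.0; averaging over the corpus exposes the bias the difference values are meant to measure.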

In a series of experiments, the researchers created three speech translation systems:

  • End2End, which was trained on the MuST-C and open source Librispeech data sets, augmented by automatically translating the original English transcripts into target languages.
  • Cascade, which shares the same core technology as End2End but which was trained on 70 million sentence pairs for English-Italian and 120 million for English-French from the OPUS repository, after which it was fine-tuned on MuST-C training data.
  • Cascade+Tag, a model identical to Cascade excepting tags added to the training data that indicate a speaker’s gender.

Interestingly, the researchers found that injecting gender information into Cascade didn’t have a measurable effect when evaluated on MuST-SHE. The difference values between the original and “wrong” references in the data set implied that all three systems were biased toward masculine forms.

When it came to the categories, Cascade performed the worst on Category 1 because it couldn’t access the speaker’s gender information it needed for a correct translation. End2End leveraged audio features to accurately translate gender, by contrast, but it exhibited the worst performance on Category 2 data — perhaps because it was trained on a fraction of the data used in Cascade and Cascade+Tag.

“If, like human beings, ‘machine learning is what it eats,’ the different ‘diet’ of machine translation and speech translation models can help them develop different skills,” wrote the researchers. “By ‘eating’ audio-text pairs, speech translation has a potential advantage: the possibility to infer speakers’ gender from input audio signals.”

The paper’s publication comes after Google introduced gender-specific translations in Google Translate chiefly to address gender bias. Scientists have proposed a range of approaches to mitigate and measure it, most recently with a leaderboard, challenge, and set of metrics dubbed StereoSet.


Big Data – VentureBeat


Why Millennials Prefer the Security of Microsoft Dynamics Over Other CRM Systems

May 16, 2020   CRM News and Info

With so many privacy scandals making headlines, the current workforce well understands the importance of security. Choosing a system that provides this protection for employees and customers is essential for Millennials, who make up the majority of modern employees.

Microsoft Dynamics is a leader in global security. Microsoft delivers layered security in all applications to allow workers to do their best work anywhere with full confidence. Consider briefly how Microsoft Dynamics provides the security needed for any company:

Security anywhere

Microsoft Dynamics provides physical and virtual security measures, including access control, encryption, and authentication. These help protect data on all devices, whether mobile phones, tablets, or computers. Role-based security defines each user’s access to system data no matter where they are working.
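Role-based security of this kind is typically modelled as a mapping from roles to the entity/privilege pairs they may exercise. The sketch below is a generic illustration of the pattern, not Dynamics' actual security model; the role and entity names are invented:

```python
# Each role maps to the (entity, action) pairs it is allowed to perform.
ROLE_PRIVILEGES = {
    "sales_rep": {("contact", "read"), ("contact", "write")},
    "sales_manager": {("contact", "read"), ("contact", "write"),
                      ("opportunity", "read"), ("opportunity", "write")},
}

def can_access(role: str, entity: str, action: str) -> bool:
    """Deny by default: access is granted only if the role explicitly
    holds the privilege."""
    return (entity, action) in ROLE_PRIVILEGES.get(role, set())
```

Because checks are keyed on the role rather than the device or location, the same policy applies wherever the employee happens to be working.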

Intelligent security

As security risks continue to rapidly grow, modern workers not only want but expect the systems they work with to be protected. Microsoft Dynamics meets these expectations by using billions of data points globally to engineer techniques and apply intelligence to progressively improve security.

Protect customer data

Employees today want to be confident that the data they collect from customers is secure and fully protected. Microsoft Dynamics keeps this data safe by preventing the disclosure of personal and financial information. This makes it easy to maintain customer loyalty and comply with industry regulations.

Thomas Berndorfer, CEO of Connecting Software, trusts Microsoft Dynamics to be completely secure in handling his company’s sensitive information. He says: “Dynamics 365 provides a robust data security model, and add-on products ensure data protection between D365 and other Microsoft apps.”

Would you like to see how Microsoft Dynamics’ security could benefit your business?

Read the remaining reasons why Millennials prefer Microsoft Dynamics in the workplace by downloading the full eBook “21 Reasons Millennials Prefer Microsoft Dynamics” at www.crmsoftwareblog.com/millennials.

Find a Microsoft Dynamics 365 Partner

By CRM Software Blog Writer, www.crmsoftwareblog.com


CRM Software Blog | Dynamics 365


Microsoft makes Windows ML standalone to support older systems

March 19, 2020   Big Data

Microsoft today announced that Windows ML, the API for running machine learning inferences on Windows devices, will soon make its way to more places. Going forward, it’ll be available as a standalone package that can be shipped with any Windows app, enabling Windows ML support for CPU inference on Windows versions 8.1 and newer and GPU hardware-acceleration on Windows 10 1709 and newer.

That should make it easier for developers to ship AI-imbued Windows apps with feature parity. As for business and consumer users of those apps, the change should translate to improved in-app experiences.

Previously, Windows ML was supported as a built-in Windows component on Windows 10 version 1809 (October 2018 Update) and newer. Microsoft says it’ll continue to update the API with each new version of Windows, but that in the future, there will be a corresponding redistributable Windows ML package with matching new features and optimizations.

“We understand the complexities developers face in building applications that offer a great customer experience, while also reaching their wide customer base,” wrote Windows AI platform senior program manager Nick Geisler in a blog post. “Delivering reliable, high-performance results across the breadth of Windows hardware, Windows ML is designed to make ML deployment easier, allowing developers to focus on creating innovative applications.”

Windows ML lets developers use trained machine learning models in Windows apps that are written in C#, C++, JavaScript, or Python, either locally on a Windows device or on a Windows Server machine. The Windows ML runtime evaluates the trained model and handles the hardware abstraction, allowing developers to target a broad range of silicon including CPUs, graphics cards, and even AI accelerators. This acceleration is built on top of DirectML, a high-performance, low-level API for running ML inferences that is a part of Microsoft’s DirectX family.


Roughly a year into its release, Windows ML has made its way into a number of popular Windows apps. Windows Photos taps Windows ML to help organize photo collections, while Windows Ink leverages it to analyze handwriting, converting ink strokes into text, shapes, lists and more. And Adobe Premiere Pro offers a Windows ML-powered feature that takes videos and crops them to any aspect ratio, all while preserving the important action in each frame.

Microsoft also today revealed its plans to unify its approach with Windows ML, ONNX Runtime, and DirectML. Specifically, it will bring the Windows ML API and a DirectML execution provider to the ONNX Runtime GitHub project, so that developers can choose the API set that works best for their app. (The ONNX Runtime is an inference engine for the Open Neural Network Exchange, which aims to provide machine learning framework interoperability.) The Windows ML and DirectML preview is available as source as of this week, with instructions and samples on how to build it as well as a prebuilt package for CPU deployments.
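One practical consequence of the DirectML execution provider landing in ONNX Runtime is that an app can request GPU acceleration and fall back to CPU. The sketch below shows only the provider-selection logic, stubbed out in plain Python; the provider names follow ONNX Runtime conventions, but the surrounding application code is assumed:

```python
# Preferred order: DirectML GPU acceleration first, CPU as the fallback.
PREFERRED = ["DmlExecutionProvider", "CPUExecutionProvider"]

def pick_providers(available: list) -> list:
    """Return the preferred providers that are actually available on this
    machine, keeping CPU inference as the guaranteed fallback."""
    chosen = [p for p in PREFERRED if p in available]
    return chosen or ["CPUExecutionProvider"]
```

In a real app, the chosen list would be passed when creating the inference session, so the same model runs hardware-accelerated on Windows 10 1709+ and on CPU for older systems down to Windows 8.1.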


Big Data – VentureBeat


Microsoft Partners and CSPs Need Automated Billing Systems – Why Excel Isn’t Enough

November 15, 2019   Microsoft Dynamics CRM

If you’re in the business of generating recurring revenue, like Microsoft CSP Partners, and you’re using Excel to handle your customers’ subscriptions and billing, you’re not alone. Dynamics 365 can easily be used to manage your sales and service, and with Work 365 you can manage recurring revenue and billing automation for subscription sales. While Partner Center and distributor portals are available, most service providers still use spreadsheets to log and record the changes being made to subscriptions and services. These changes, along with a combination of all the different invoices, are used to invoice and bill a customer.

This process can take hours or even days, as teams of sales and accounting staff manually go through customer accounts and match customer data in Dynamics CRM with the appropriate subscription changes in order to create and send out invoices. Most importantly, this process is not scalable.


Automating this process can save a service provider hundreds of employee hours that can instead be focused on value-creating tasks around customer service and sales to grow the business. Billing automation solutions like Work 365, which are built on Dynamics 365, allow Microsoft Partners to grow their recurring revenue with features such as:

  • Bundling products and services into custom packages
  • Automatically creating invoices by calculating prorations, refunds, and renewals
  • Automatically sending invoices and collecting payments
  • Calculating monthly sales commissions and incentives
  • Integrating with accounting systems
  • Enabling self-service scenarios for customers to directly provision and manage their services
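As a rough illustration of the proration arithmetic such a system automates, consider charging only for the unused fraction of the billing period when seats are added mid-cycle. This is an assumed formula for illustration, not Work 365's actual billing engine:

```python
def prorated_charge(monthly_price: float, seats_added: int,
                    days_remaining: int, days_in_period: int = 30) -> float:
    """Charge new seats only for the remaining days of the current
    billing period, rounded to cents."""
    fraction = days_remaining / days_in_period
    return round(monthly_price * seats_added * fraction, 2)

# 5 seats at $10/month, added with 12 of 30 days left in the period:
print(prorated_charge(10.0, 5, 12))  # 20.0
```

Doing this by hand in Excel for every subscription change across hundreds of customer accounts is exactly the work that makes the manual process unscalable.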

Work 365 is built on Dynamics 365 and offers bi-directional synchronization with Partner Center: all subscription changes, billing and payment status updates, and invoicing can be handled directly from Work 365 with a click of a button, saving Partners and CSPs time and sparing them from juggling multiple systems to keep their records in check.

Watch our webinars to see how Work 365 can help Indirect and Direct CSPs automate and reconcile billing and invoicing.


By Ismail Nalwala

I am a Dynamics 365 enthusiast. I enjoy building systems and working with cross-functional teams to solve problems and build processes from lead generation to cash collection. Work 365 is a global developer of the Billing Automation and subscription application for Dynamics, helping companies streamline business processes and scale their recurring revenue.


CRM Software Blog | Dynamics 365


Shopify acquires 6 River Systems for $450 million to expand its AI-powered fulfillment network

September 10, 2019   Big Data

Shopify today announced it is acquiring 6 River Systems, a startup focused on fulfillment automation for e-commerce and retail operations. The deal is valued at approximately $450 million — 60% in cash and 40% in shares.

In June at its Unite partner and developer conference, Shopify announced the Shopify Fulfillment Network, which uses machine learning to ensure timely deliveries and lower shipping costs. Shopify’s fulfillment centers span California, Georgia, New Jersey, Nevada, Ohio, Pennsylvania, and Texas. The network, which is only available in early access, supports merchants that ship between 10 and 10,000 packages per day. The company hopes to eventually support between 3 and 30,000 packages per day. The 6 River Systems acquisition looks like an attempt to speed up that growth.

“Shopify is taking on fulfillment the same way we’ve approached other commerce challenges, by bringing together the best technology to help everyone compete,” Shopify CEO Tobi Lütke said in a statement. “With 6 River Systems, we will bring technology and operational efficiencies to companies of all sizes around the world.”

Robots in the warehouse

6 River Systems is best known for its Chuck autonomous vehicles that can move around packages in warehouses. Indeed, Shopify believes that adding the robots to its fulfillment network “will increase the speed and reliability of warehouse operations, by empowering on-site associates with daily tasks, including inventory replenishment, picking, sorting, and packing.”

While Shopify plans to accelerate the growth of its AI-powered fulfillment network, it promises to keep selling 6 River Systems’ solution for warehouses. The startup’s solution operates in more than 20 facilities in the U.S., Canada, and Europe. It fulfills millions of units each week for companies including Lockheed Martin, CSAT Solutions, ACT Fulfillment, DHL, XPO Logistics, and Office Depot.

Shopify doesn’t expect the transaction to have any material effect on its revenue in 2019. It will, however, increase the company’s expenses for 2019 by $25 million, including $10 million in operating expenses, $8 million in amortization of intangible assets, and $7 million in stock-based compensation. The company does expect 6 River Systems to generate annual billings of approximately $30 million in 2020.

Founded in 2015 by Jerome Dubois, Rylan Hamilton, and Chris Cacioppo, 6 River Systems is based in Waltham, Massachusetts. The startup had raised $46 million to date from Eclipse Ventures, Menlo Ventures, Norwest Venture Partners, and iRobot. Dubois and Hamilton were previously executives at Kiva Systems (Amazon acquired Kiva Systems for $775 million in March 2012). Shopify specifically called that out in its release, noting that the acquisition will give it “experienced leaders from Kiva Systems (now Amazon Robotics).”

The deal has already been approved by 6 River Systems’ stockholders. Included in the $450 million amount is approximately $69 million for the startup’s founders and employees “that will vest subject to certain conditions and will be treated as stock-based compensation.” The acquisition is expected to close in Q4 2019.


Big Data – VentureBeat


Teradata Hires Bob Joyce for New Role of EVP, Teradata Business Systems

August 31, 2019   BI News and Info

New executive brings extensive experience in operational excellence – joins newly hired CRO, Scott Brown, and CHRO, Kathy Cullen-Cote, to strengthen leadership team and boost Teradata’s transformation

Teradata (NYSE: TDC), the industry’s only Cloud First Pervasive Data Intelligence company, today announced that it has appointed Bob Joyce as Executive Vice President of Teradata Business Systems, reporting directly to CEO Oliver Ratzesberger, effective immediately. Joyce will lead the newly created Teradata Business Systems function, which is focused exclusively on driving operational excellence throughout the company. Teradata has been steadily advancing its business transformation, and with the addition of Joyce – alongside CRO Scott Brown and CHRO Kathy Cullen-Cote – the executive team has the talent to deliver on all aspects of the company’s cloud-first strategy and delight customers.
 
“In Bob, we are adding a leader for Teradata Business Systems with fundamental experience driving an organization to be lean, efficient and agile, with a mindset for continuous improvement and growth,” said Oliver Ratzesberger, President and CEO at Teradata. “This hire, and the establishment of a world-class business system, is absolutely critical to our company strategy. With the recent hires of Scott and Kathy, Teradata is now extremely well positioned to continue our leadership in this market and help the world’s leading companies leverage cloud to get answers to their toughest challenges.”
 
Teradata Business Systems, a new business function within Teradata, will drive operational excellence by ensuring everything Teradata does is aligned to its strategy, data-driven, and focused on key metrics that increase the value of the company. Improving operational excellence strengthens the company’s commitment to its customers, improves the ease of doing business, and will strengthen Teradata’s position as a true trusted advisor.
 
“Teradata is in the midst of a truly bold transformation to a cloud-first subscription model and I’m excited to contribute,” said Bob Joyce, EVP Teradata Business Systems. “The market landscape is changing every day, and I believe that the companies – like Teradata – that focus on and invest in continuous improvement will have a significant competitive advantage. Tomorrow’s leaders will cultivate a growth mindset using simplicity and lean principles. This is what I will contribute to Teradata in support of serving our customers with the agility and flexibility they require.”
 
About Bob Joyce
Joyce joins Teradata from Fortive Corporation, where he was Group President over five operating companies in the United States, Europe, and China. Fortive was a spin-off of Danaher, the pioneer in driving growth through a measured business system. Joyce was fundamental in developing the Danaher Business System, a world leader in helping companies track execution and create options for improvement. As part of this effort, Joyce has run billion-dollar businesses, driving consistent double-digit growth, and has successfully led organizations through significant improvements in execution, delivery, and quality. Joyce reports to CEO Oliver Ratzesberger and will be based in San Diego.
 


Teradata United States


A Systems Approach To Innovation: From Guessing Correctly, To Learning Quickly

August 27, 2019   SAP

When we think of innovation, many of us envision a team of experts conducting projects in a systematic manner – following defined steps to track progress, determine outcomes, and plan next steps to design a new technology.

This lends itself well to well-defined problems. Typically, the team follows the engineering method, moving from idea to concept, planning, design, development, and implementation. However, issues often arise when “things don’t go according to plan!” – when the outcome isn’t correct or foreseen – leading to a project stalling or failing altogether.

In the case of undefined or very complex problems, as is almost always the case in business or social issues, this linear approach presents several critical issues. In these domains, most opportunities and challenges emerge from the behavior of a system over time, be it economic, social, physical, or technological.

A system, at its most basic, is the interaction of various discrete components or parts. It’s easy enough to visualize and consider each of these parts in isolation; however, we almost always underestimate the difficulty in predicting the outcomes and behavior of the system during the design stage.

For example, in the case of plastic pollution, key material technology developments in plastics allowed society to address economic issues, such as food availability (by increasing shelf life) and the need for sterilization of certain medical equipment (thanks to single-use plastic consumables). However, plastic became too much of a good thing: UV-stabilized plastic, for example, is largely impervious to breaking down in the natural environment.

Another sustainability initiative aimed to address worsening pollution issues by limiting the use of cars on certain days. Cars with license plates ending in even digits could be driven on certain days of the week, and those ending in odd numbers could be driven on the other days. Since people still needed to commute, they started buying old, second-hand vehicles (typically with worse carbon and pollution emissions) with different license plate numbers to use on the alternate days of the week – increasing, rather than decreasing, pollution.
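The rationing rule itself is simple to state, which makes the unintended outcome easy to demonstrate. A minimal sketch in Python (my own illustration; the plate format, day indexing, and parity convention are assumptions, not from the article):

```python
# Sketch of an odd/even license-plate rationing rule (illustrative only).
# Assumption: a plate may drive on a given day when the parity of its last
# digit matches the parity of the day index (even days: even-ending plates).

def may_drive(plate: str, day_index: int) -> bool:
    """True if a plate whose last character is a digit is allowed to
    drive on the given day under an alternating odd/even scheme."""
    last_digit = int(plate[-1])  # assumes plates end in a digit
    return last_digit % 2 == day_index % 2

# A household that buys a second car with the opposite parity is never
# restricted on any day of the week - the loophole the article describes:
fleet = ["ABC124", "XYZ987"]  # one even-ending plate, one odd-ending plate
assert all(any(may_drive(p, d) for p in fleet) for d in range(7))
```

The final assertion captures why the policy backfired: two plates of opposite parity cover every day, so the rule creates demand for extra (often older, dirtier) vehicles rather than fewer trips.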

While the engineering and business fields have developed several methods (including root-cause analysis, scenario planning, etc.) to try to understand and address these emerging issues, they still rely heavily on teams calculating or forecasting the outcomes of systems. However, even systems with as few as three elements create outcomes that cannot be calculated or predicted. In some cases, existing technology solutions can reduce execution risk in innovation initiatives.

Typical problem-solving techniques, however, require that we correctly guess the outcome of a system’s design or change. From an innovation standpoint, it is more effective to borrow from science and conduct experiments on or within the system to validate ideas cheaply and quickly (learn fast!) to determine what the true outcomes will be.

The mantra of “fail fast” is well known in the innovation community, but the focus is often on rationalizing failure. The real point is that gaining knowledge through quick (ideally cheap!) tests leads to better-informed stakeholders and, eventually, successful outcomes.

Many areas of engineering, particularly in software, address problems in an empirical manner using techniques such as agile methods and test-driven development. These development teams don’t need to correctly guess the outcome of a stage in order to plan an entire project from start to finish; rather, they quickly learn the outcome, which informs their R&D work in an iterative manner.
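As a concrete illustration of that empirical loop (a minimal example of my own, not from the article), test-driven development writes the expected outcome down as a failing test first, then writes just enough code to satisfy it; the pricing rule and function name here are hypothetical:

```python
# Minimal test-driven development sketch: the test states the expected
# behavior first; the implementation is written to satisfy it, and each
# iteration teaches the team the actual outcome instead of a forecast.

def shipping_cost(weight_kg: float) -> float:
    """Hypothetical pricing rule: flat $5 base fee plus $2 per kg."""
    return 5.0 + 2.0 * weight_kg

def test_shipping_cost():
    # The test is the specification; a failure triggers the next
    # iteration rather than a re-plan of the whole project.
    assert shipping_cost(0) == 5.0   # base fee only
    assert shipping_cost(3) == 11.0  # 5 + 2 * 3

test_shipping_cost()
```

The value is not the trivial arithmetic but the loop: the outcome of each stage is learned from a cheap, fast check rather than guessed up front.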

When considering how to approach different problems, key systems-thinking concepts need to be applied – recognizing that feedback, human psychology, and system changes can point to counterintuitive solutions that address a challenge more comprehensively.

A classic example is the now-pervasive “Just-In-Time” manufacturing method. When it was first proposed in the 1980s, business experts questioned it as incredibly counterintuitive: keeping inventory on hand to a minimum created a major risk that component stockouts would bring operations to a standstill. The methodology, however, reduced cash tied up in inventory, contributing to better business performance.

It also allowed businesses to become increasingly agile in meeting ever-changing customer demand by allowing product customization, and it increased product ranges, as manufacturers weren’t limited to only what they had stockpiled. Using information technology to further reduce the stock-out risk, the Just-In-Time method has revolutionized manufacturing.
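The core of the method reduces to a reorder decision made continuously against projected demand rather than against a stockpile. A sketch, assuming a simple reorder-point model with hypothetical numbers (the function, parameters, and figures are my own illustration):

```python
# Illustrative just-in-time reorder trigger (hypothetical model and numbers).
# Stock is ordered only when projected demand over the supplier's lead time
# (plus a safety buffer) would exhaust what is on hand, keeping the cash
# tied up in inventory minimal.

def reorder_qty(on_hand: int, daily_demand: int, lead_time_days: int,
                safety_stock: int = 0) -> int:
    """Units to order now, or 0 if current stock covers lead-time demand."""
    needed = daily_demand * lead_time_days + safety_stock
    return max(0, needed - on_hand)

# 30 units on hand, 20/day demand, 2-day lead time: 40 units are needed,
# so an order of 10 is triggered; with 50 on hand, no order is placed.
assert reorder_qty(30, 20, 2) == 10
assert reorder_qty(50, 20, 2) == 0
```

The counterintuitive part the article highlights sits in `safety_stock`: information technology that improves demand forecasts lets this buffer shrink toward zero without raising the stockout risk.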

SAP Innovation Services helps companies navigate, understand, and address their innovation opportunities and challenges with methodologies to ensure ideas are rapidly validated throughout development, and ultimately able to be delivered at scale. Find out more at SAP Innovation Services.


Digitalist Magazine
