Business Intelligence Info

Tag Archives: Cloud

Teradata Named a Leader in Cloud Data Warehouse Evaluation by Independent Research Firm

April 11, 2021   BI News and Info

Teradata (NYSE: TDC), a leading multi-cloud data warehouse platform provider, today announced that Forrester Research has named Teradata a Leader in “The Forrester Wave™: Cloud Data Warehouse, Q1 2021,” written by VP and Principal Analyst Noel Yuhanna and published on March 24, 2021. Forrester analyzed and scored the top 13 vendors in the cloud data warehouse market against 26 criteria.

“Over the past 12-18 months, Teradata has been laser-focused on our cloud capabilities and the performance of our cloud business,” said Steve McMillan, CEO of Teradata. “From shoring up our cloud credentials with key new executive appointments to significantly increasing our cloud R&D spend, our commitment to and investment in the cloud has successfully positioned Teradata as a modern, relevant cloud platform for our customers. We believe that this recognition from Forrester is another validation that our cloud-first agenda is winning in the market.”

Forrester’s analysis of Teradata in the Cloud Data Warehouse Wave evaluation is based on Vantage — a multi-cloud data warehouse platform that enables ecosystem simplification by connecting analytics, data lakes, and data warehouses. 

According to Forrester’s evaluation, “[Vantage] combines open source and commercial technologies to operationalize insights; solve business problems; enable descriptive, predictive, and prescriptive analytics; and deliver performance for mixed workloads with high query concurrency using workload management and adaptive optimization. Teradata Vantage integrates multiple analytic languages – including SQL, R, Python, SAS, and Java – and supports various data types, including JSON, Avro, Parquet, relational, spatial, and temporal.”

The Forrester report also notes that “Customers like Teradata Vantage’s hybrid cloud platform, reliability, data science, advanced analytics, and ease of management from an infrastructure perspective. Top use cases include BI acceleration, customer intelligence, real-time analytics, embedded data science functions, fraud detection, time-series analysis, data lake integration, data warehouse modernization, and data services.”

Read the complete “The Forrester Wave™: Cloud Data Warehouse, Q1 2021” report here.

With Vantage, enterprise-scale companies can eliminate silos and cost-effectively query all their data all the time. Regardless of where the data resides – in the cloud using low-cost object stores, on multiple clouds, on-premises, or any combination thereof – organizations can get a complete view of their business. And by combining Vantage with first-party cloud services, Teradata enables customers to expand their cloud ecosystem with deep integration of cloud-specific, cloud-native services.


Teradata United States


FanGraphs’ advanced baseball analytics has a new cloud home: MariaDB

April 1, 2021   Big Data



With the 2021 Major League Baseball season opening today, fans will be filling out their scorecards as they return to stadiums for the first time since the COVID-19 pandemic took hold last spring.

Of course, the data that is now regularly made available by the MLB goes well beyond the hits, runs, and errors fans typically record in a scorecard they purchase at a game. MLB has made the Statcast tool available since 2015. It analyzes player movements and athletic abilities. The Hawk-Eye service uses cameras installed at ballparks to provide access to instant video replays.

Fans now regularly consult a raft of online sites that use this data to analyze almost every aspect of baseball: top pitching prospects, players who hit most consistently in a particular ballpark at a specific time of day, and so on.

One of those sites is FanGraphs, which has moved the SQL relational database it relies on to process and analyze structured data to a curated instance of the open source MariaDB database, deployed on Google Cloud Platform (GCP) as part of MariaDB’s SkySQL cloud service.

MariaDB provides IT organizations with an alternative to the open source MySQL database Oracle gained control over when it acquired Sun Microsystems in 2009. MariaDB is a fork of the MySQL database, now managed under the auspices of the MariaDB Foundation, which counts Microsoft, Alibaba, Tencent, ServiceNow, and IBM among its sponsors, alongside MariaDB itself.

FanGraphs uses the data it collects to enable its editorial teams to deliver articles and podcasts that project, for example, playoff odds for a team based on the results of the SQL queries the company crafts. These insights might be of particular interest to a baseball fan participating in a fantasy league, someone who wants to place a more informed wager in a jurisdiction where gambling is legal, or those making baseball video games.
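The kind of aggregation such projections start from can be sketched with a toy query. Everything below (the schema, teams, and results) is invented for illustration, and SQLite from Python's standard library stands in for FanGraphs' MariaDB instance:

```python
import sqlite3

# Hypothetical mini-schema: one row per game, from each team's perspective.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE games (team TEXT, opponent TEXT, won INTEGER)")
conn.executemany(
    "INSERT INTO games VALUES (?, ?, ?)",
    [
        ("NYY", "BOS", 1), ("NYY", "TBR", 0), ("NYY", "BOS", 1),
        ("BOS", "NYY", 0), ("BOS", "TBR", 1), ("BOS", "NYY", 0),
        ("TBR", "NYY", 1), ("TBR", "BOS", 0), ("TBR", "NYY", 1),
    ],
)

# Roll game results up into a winning percentage per team -- the kind
# of base rate a playoff-odds projection would build on.
rows = conn.execute(
    """
    SELECT team,
           COUNT(*) AS games,
           AVG(won) AS win_pct
    FROM games
    GROUP BY team
    ORDER BY win_pct DESC
    """
).fetchall()

for team, games, win_pct in rows:
    print(f"{team}: {games} games, {win_pct:.3f}")
```

A real projection would layer run differentials, schedules, and simulation on top, but the shape of the work is the same: crafted SQL aggregations over structured game data.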

The decision to move from MySQL to MariaDB running on GCP was made after a few false starts involving attempts to lift and shift the company’s MySQL database instance into the cloud, FanGraphs CEO David Appelman said.

Two things attracted FanGraphs to MariaDB, Appelman said: the level of performance it could attain using a database-as-a-service (DBaaS) platform based on MariaDB, and access to a columnstore storage engine that might one day be employed to drive additional analytics.

In addition, MariaDB now manages the underlying database FanGraphs uses. Appelman said he previously handled most of the IT functions for FanGraphs, including crafting SQL queries. Now he will have more time to create SQL queries and monitor their impact on the performance of the overall database. “I like to see where the bottlenecks created by a SQL query are,” he added.
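Bottleneck-hunting of the kind Appelman describes typically starts with the database's query plan. A minimal sketch, using SQLite's EXPLAIN QUERY PLAN from Python's standard library as a stand-in for MariaDB's EXPLAIN (the table and index names are hypothetical, not FanGraphs' schema):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE pitches (pitcher TEXT, velocity REAL, game_date TEXT)")

query = "SELECT COUNT(*) FROM pitches WHERE game_date >= '2021-04-01'"

def plan(sql):
    # The last column of each plan row is the human-readable detail.
    return [row[-1] for row in conn.execute("EXPLAIN QUERY PLAN " + sql)]

before = plan(query)   # full table scan of pitches: the bottleneck
conn.execute("CREATE INDEX idx_game_date ON pitches (game_date)")
after = plan(query)    # the range predicate is now answered from the index

print(before)
print(after)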

FanGraphs plans to eventually take advantage of the data warehouse service provided by MariaDB, Appelman noted.

It’s not likely any of the analytics capabilities provided by FanGraphs and similar sites will one day be able to predict which baseball team will win on any given day. However, the insights they surface do serve to make the current generation of baseball fans a lot more informed about the nuances of the game than Abner Doubleday probably could have imagined.



Big Data – VentureBeat


Is your Dynamics On Premise implementation cloud ready?

March 31, 2021   CRM News and Info


Lots of customers have opted to implement Microsoft Dynamics 365 On Premise rather than in the cloud over the years. We’ve helped several migrate to online as the choice became part of their technology roadmaps. In recent years, Microsoft has moved towards a “Cloud First” methodology. Review our recent blog post “The Benefits of Dynamics 365 Online Versus On-Premise” to learn more about the benefits of Dynamics 365 Online. For those still on premise, this means it’s time to start planning ahead. You may be wondering, “Is my Dynamics On Premise implementation cloud ready?” or “When should I start planning to migrate to the cloud?” Have no fear. Beringer can help!

How do I get to the cloud?

Here at Beringer Technology Group, we have helped several customers migrate Dynamics On Premise implementations to the cloud. We’ll go through a rigorous review process with you to identify the data to be migrated; the reports, workflows, and business processes currently in use; and what works well and what is lacking, in order to create a roadmap for your online migration path. No two are alike. We’ll look at customizations and make personalized recommendations for eliminating code and utilizing new features and functionality available only in the cloud. We’ll help you make decisions to leverage the Unified Interface, as well as plan for the enhancements Microsoft pushes out with each new “wave” release. We can also deliver a training program to help your organization through the transition. We’ll work together to make the move to the cloud as smooth as possible.

When is the best time to move?

So when is the right time to move to the cloud? In short, the sooner, the better. The Mainstream Support end date for Dynamics 365 for Customer Engagement v9.0 is 1/9/2024. Mainstream Support includes new releases, updates, service packs, builds, fixes and patches needed to fix product defects or security vulnerabilities. The Extended Support end date for that version is 1/13/2026, which oddly enough is the same end date as for CRM 2016. Extended Support usually lasts for another five years after Mainstream Support ends. The much shorter two-year period here leads us to believe that Microsoft has other plans for Dynamics 365 On Premise. During this period, Microsoft will continue to provide security updates and bug fixes, but no other updates will be available without a paid support agreement.
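As a quick sanity check on those dates, the Extended Support window really does work out to roughly two years rather than the usual five. A sketch using Python's standard library, with the dates as cited above:

```python
from datetime import date

# Support end dates cited for Dynamics 365 for Customer Engagement v9.0.
mainstream_end = date(2024, 1, 9)
extended_end = date(2026, 1, 13)

# Length of the Extended Support window after Mainstream Support ends.
window = extended_end - mainstream_end
years = round(window.days / 365.25, 1)

print(window.days)  # 735 days
print(years)        # 2.0 -- versus the ~5 years Extended Support usually runs
```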

To keep your Dynamics 365 application safe, secure and up to date, we recommend migrating to the cloud before the end of Mainstream Support for your current version. Is your Dynamics On Premise implementation cloud ready? Contact us today to find out more about how Beringer can help you take the next step to move to the cloud!

Beringer Technology Group, a leading Microsoft Gold Certified Partner specializing in Microsoft Dynamics 365 and CRM for Distribution, also provides expert Managed IT Services, Backup and Disaster Recovery, Cloud Based Computing, Email Security Implementation and Training, Unified Communication Solutions, and Cybersecurity Risk Assessment.


CRM Software Blog | Dynamics 365


Cere Network raises $5 million to create decentralized data cloud platform

March 29, 2021   Big Data



Cere Network has raised $5 million for its decentralized data cloud (DDC) platform, which is launching today for developers. The company’s ambition is to take on data cloud leader Snowflake.

The investment was led by Republic Labs, the investment arm of crowdsourced funding platform Republic. Other investors include Woodstock Fund, JRR Capital, Ledger Prime, G1 Ventures, ZB exchange, and Gate.io exchange. Cere Network previously raised $5 million from Binance Labs and Arrington XRP Capital, among others, bringing its total raised to $10 million.

“Enterprises using Snowflake are still constrained by bureaucratic data acquisition processes, complex and insufficient cloud security practices, and poor AI/ML governance,” Cere Network CEO Fred Jin said in an email to VentureBeat. “Cere’s technology allows more data agility and data interoperability across different datasets and partners, which extracts more value from the data faster compared to traditional compartmentalized setup.”

The Cere DDC platform, which launches to developers today, allows thousands of data queries to be hosted on the blockchain, a transparent and secure digital ledger.

The platform offers a more secure first-party data foundation in the cloud by using blockchain identity and data encryption to onboard and segment individual consumer data. This data is then automated into highly customizable and interoperable virtual datasets, directly accessible in near real time by all business units, partners/vendors, and machine-learning processes.
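The onboard-and-segment idea described above can be illustrated with a toy sketch. To be clear, none of the names or mechanics below reflect Cere's actual API: a salted hash simply stands in for a blockchain-based pseudonymous identity, and plain dict grouping stands in for virtual datasets.

```python
import hashlib

# Hypothetical helpers, for illustration only.
def onboard(record: dict, salt: str) -> dict:
    # Replace the direct identifier with a salted hash, a crude stand-in
    # for a blockchain-style pseudonymous identity.
    pseudonym = hashlib.sha256((salt + record["email"]).encode()).hexdigest()[:16]
    return {"id": pseudonym, **{k: v for k, v in record.items() if k != "email"}}

def segment(records: list, key: str) -> dict:
    # Group onboarded records into "virtual datasets" by an attribute,
    # so downstream consumers never see raw identifiers.
    datasets = {}
    for r in records:
        datasets.setdefault(r[key], []).append(r)
    return datasets

users = [
    {"email": "a@example.com", "tier": "gold", "spend": 120},
    {"email": "b@example.com", "tier": "free", "spend": 0},
    {"email": "c@example.com", "tier": "gold", "spend": 75},
]
onboarded = [onboard(u, salt="demo-salt") for u in users]
by_tier = segment(onboarded, "tier")
print({k: len(v) for k, v in by_tier.items()})  # {'gold': 2, 'free': 1}
```

The production version would add real encryption, decentralized storage, and on-chain identity, but the flow (pseudonymize on onboarding, then expose segmented datasets) is the same.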

Above: Cere Network’s data cloud query.

Image Credit: Cere Network

The Cere token will be used to power its decentralized data cloud and fuel Cere’s open data marketplace that allows for trustless data-sharing among businesses and external data specialists, as well as staking and governance. The public sale of the Cere token will be held on Republic, the first token sale on the platform.

“We’ve been following Cere Network for some time and have been impressed with the team and the market fit – and need – for a decentralized data cloud,” said Boris Revsin, managing director of Republic Labs, in a statement. “We’re very excited to host Cere Network’s token sale on Republic, which will ensure a decentralized network and faster adoption in the enterprise space of blockchain technology. Their DDC improves upon Snowflake using blockchain identity and data encryption to onboard and segment individual consumer data.”

Developers can access the Cere DDC here. The public sale for Cere token is scheduled for March 31 on Republic. The company said it is working with a number of Fortune 1,000 customers.

“There’s a huge amount of opportunities in this rapidly shifting space for the coming years. We don’t plan to take on the likes of Snowflake head on, yet, but rather focus on specific solutions and verticals where we can bring more customization and efficiency. We are ok with chipping away at their lead while doing this,” Jin said. “We are bringing an open data marketplace which will open up data access beyond the limitation of traditional silo’d data ecosystems, which include Snowflake, and the likes of Salesforce.”

VentureBeat

VentureBeat’s mission is to be a digital town square for technical decision-makers to gain knowledge about transformative technology and transact.

Our site delivers essential information on data technologies and strategies to guide you as you lead your organizations. We invite you to become a member of our community, to access:

  • up-to-date information on the subjects of interest to you
  • our newsletters
  • gated thought-leader content and discounted access to our prized events, such as Transform 2021: Learn More
  • networking features, and more

Become a member


Big Data – VentureBeat


IBM’s Rob Thomas details key AI trends in shift to hybrid cloud

March 19, 2021   Big Data



The last year has seen a major spike in the adoption of AI models in production environments, in part driven by the need to drive digital business transformation initiatives. While it’s still early days as far as AI is concerned, it’s also clear AI in the enterprise is entering a new phase.

Rob Thomas, senior vice president for software, cloud, and data platform at IBM, explains to VentureBeat how this next era of AI will evolve as hybrid cloud computing becomes the new norm in the enterprise.

As part of that effort, Thomas reveals IBM has formed a software-defined networking group to extend AI all the way out to edge computing platforms.

This interview has been edited for brevity and clarity.

VentureBeat: Before the COVID-19 pandemic hit, there was a concern AI adoption was occurring slowly. How much has that changed in the past year?

Rob Thomas: We’ve certainly got massive acceleration for things like Watson Assistant for customer service. That absolutely exploded. We had nearly 100 customers that started and then went live in the first 90 days after COVID hit. When you broaden it out, there are five big use cases that have come up over the last year. One is customer service. Second is around financial planning and budgeting. Thirdly are things such as data science. There’s such a shortage of data science skills, but that is slowly changing. Fourth is around compliance. Regulatory compliance is only increasing, not decreasing. And then fifth is AI Ops. We launched our first AI ops product last June and that’s exploded as well, which is related to COVID in that everybody was forced remote. How do we better manage our IT systems? It can’t be all through humans because we’re not on site. We’ve got to use software to do that. I think that was 18 months ago, I wouldn’t have given you those five. I would have said “There’s a bunch of experimentations.” Now we see pretty clearly there are five things people are doing that represent 80% of the activity.

VentureBeat: Should organizations be in the business of building AI or should they buy it in one form or another?

Thomas: I hate to be too dramatic, but we’re probably in a permanent and a secular change where people want to build. Trying to fight that is a tough discussion because people really want to build. When we first started with Watson, the idea was this is a big platform. It does everything you need. I think what we’ve discovered along the way is if you componentize to focus where we think we’re really good, people will pick up those pieces and use them. We focused on three areas for AI. One is natural language processing (NLP). I think if you look at things like external benchmarks, we had the best NLP from a business context. In terms of document understanding, semantic parsing of text, we do that really well. The second is automation. We’ve got really good models for how you automate business processes. Third is trust. I don’t really think anybody is going to invest to build a data lineage model, explainability model, or bias detection. Why would a company build that? That’s a component we can provide. If you want them to be regulatory compliant, you want them to have explainability, then we provide a good answer for that.

VentureBeat: Do you think people understand explainability and the importance of the provenance of AI models and the importance of that yet? Are they just kind of blowing by that issue in the wake of the pandemic?

Thomas: We launched the first version of what we built to address that around that two years ago. I would say that for the first year we got a lot of social credit. This changed dramatically in the second half of last year. We won some significant deals that were specifically for model management explainability and lifecycle management of AI because companies have grown to the point where they have thousands of AI models. It’s pretty clear, once you get to that scale, you have no choice but to do this, so I actually think this is about to explode. I think the tipping point is once you get north of a thousandish models in production. At that point, it’s kind of like nobody’s minding the store. Somebody has to be in charge when you have that much machine learning making decisions. I think the second half of last year will prove to be a tipping point.

Above: IBM senior VP of software, cloud, and data Rob Thomas

VentureBeat: Historically, AI models have been trained mainly in the cloud, and then inference engines are employed to push AI out to where it’d be consumed. As edge computing evolves, there will be a need to push the training of AI models out to the edge where data is being analyzed at the point of creation and consumption. Is that the next AI frontier?

Thomas: I think it’s inevitable AI is gonna happen where the data is because it’s not economical to do the opposite, which is to start everything with a Big Data movement. Now, we haven’t really launched this formally, but two months ago I started a unit in IBM software focused on software-defined networking (SDN) and the edge. I think it’s going to be a long-term trend where we need to be able to do analytics, AI, and machine learning (ML) at the edge. We’ve actually created a unit to go after that specifically.

VentureBeat: Didn’t IBM sell an SDN group to Cisco a long time ago now?

Thomas: Everything that we sold in the ’90s was hardware-based networking. My view is everything that’s done in hardware from a networking at the edge perspective is going to be done in software in the next five to seven years. That’s what’s different now.

VentureBeat: What differentiates IBM when it comes to AI most these days?

Thomas: There are three major trends that we see happening in the market. One is around decentralization of IT. We went from mainframes that are centralized to client/server and mobile. The initial chapter of public cloud was very much a return to a centralized architecture that brings everything to one place. We are now riding the trend that says that we will decentralize again in the world that will become much more about multicloud and hybrid cloud.

The second is around automation. How do you automate feature engineering and data science? We’ve done a lot in the realm of automation. The third is just around getting more value out of data. There was this IDC study last year that 90% of the data in businesses is still unutilized or underutilized. Let’s be honest. We haven’t really cracked that problem yet. I’d say those are the three megatrends that we’re investing against. How does that manifest in the IBM strategy? In three ways. One is we are building all of our software on open source. That was not the case two years ago. Now, in conjunction with the Red Hat acquisition, we think there’s room in the market for innovation in and around open source. You see the cloud providers trying to effectively pirate open source rather than contribute. Everything we’re doing from a software perspective is now either open source itself or it’s built on open source.

The second is around ecosystem. For many years we thought we could do it ourselves. One of the biggest changes we’ve made in conjunction with the move to open source is we’re going to do half of our business by making partners successful. That’s a big change. That’s why you see things like the announcement with Palantir. I think most people were surprised. That’s probably not something we would have done two years ago. It’s kind of an acknowledgment that all the best innovation doesn’t have to come from IBM. If we can work with partners that have a similar philosophy in terms of open source, that’s what we’re doing.

The third is a little bit more tactical. We announced earlier this year that we’ve completely changed our go-to-market strategy, which is to be much more technical. That’s what we’ve heard customers want. They don’t want a salesperson to come in and read them the website. They want somebody to roll up their sleeves and actually build something and co-create.

VentureBeat: How do you size up the competitive landscape?

Thomas: Watson components can run anywhere. The real question is why is nobody else enabling their AI to run anywhere? IBM is the only company doing that. My thesis is that most of the other big AI players have a strategy tax. If your whole strategy is to bring everything to our cloud, the last thing you want to do is enable your AI to run other places because then you’re acknowledging that other places exist. That’s a strategy advantage for us. We’re the only ones that can truly say you can bring the AI to where the data is. I think that’s going to give us a lot of momentum. We don’t have to be the biggest compute provider, but we do have to make it incredibly easy for companies to work across cloud environments. I think that’s a pretty good bet.

VentureBeat: Today there is a lot of talk about MLOps, and we already have DevOps and traditional IT operations. Will all that converge one day, or will we continue to need a small army of specialists?

Thomas: That’s a little tough to predict. I think the reason we’ve gotten a lot of momentum with AI Ops is because we took the stuff that was really hard in terms of data virtualization, model management, model creation, and automated 60-70% of that. That’s hard. I think it’s going to be harder than ever to automate 100%. I do think people will get a lot more efficient as they get more models in production. You need to manage those in an automated fashion versus a manual fashion, but I think it’s a little tough to predict that at this stage.

VentureBeat: There’re a lot of different AI engines. IBM has partnered with Salesforce. Will we see more of that type of collaboration? Will the AI experience become more federated?

Thomas: I think that’s right. Let’s look at what we did with Palantir. Most people thought of Palantir as an AI company. Obviously, they associate Watson with AI. Palantir does something really good, which is a low-code, no-code environment so that the data science team doesn’t have to be an expert. What they don’t have is an environment for the data scientist that does want to go build models. They don’t have a data catalog. If you put those two together, suddenly you’ve got an AI system that’s really designed for a business. It’s got low code, no code, it’s got Python, it’s got data virtualization, a data catalog. Customers can use that joint stack from us and will be better off than had they chosen one or the other and then tried to fix the things themselves. I think you’ll probably see more partnerships over time. We’re really looking for partnerships that are complementary to what we’re doing.

VentureBeat: If organizations are each building AI models to optimize specific processes in their favor, will this devolve into competing AI models simply warring with one another?

Thomas: I don’t know if it’ll be that straightforward. Two companies are typically using very different datasets. Now maybe they’re both joining with an external dataset that’s common, but whatever they have is first-party data or third-party data that is probably unique to them. I think you get different flavors, as opposed to two things that are conflicting or head to head. I think there’s a little bit more nuance there.

VentureBeat: Do you think we’ll keep calling it AI? Or will we get to a point where we just kind of realize that it’s a combination of algorithms and statistics and math [but we] don’t have to necessarily call it AI?

Thomas: I think the term will continue for a while because there is a difference between a rules-based system and a true learning machine that gets better over time as you feed it more data. There is a real distinction.



Big Data – VentureBeat


Enterprises share the challenges of migrating legacy systems to the cloud

March 18, 2021   Big Data



Enterprises hope to achieve the many business, technical, and financial benefits of moving legacy systems to the cloud. However, the road to migrating and running these systems on cloud infrastructure can be rough for companies that fail to plan ahead. That’s the takeaway from a 2020 report published by enterprise technology consultancy Lemongrass, which, in partnership with Upwave, surveyed over 150 IT leaders with annual budgets of at least $1 million about the goals and obstacles of moving systems to the cloud.

The global public cloud computing market is set to exceed $362 billion in 2022, according to Statista. (In 2018 alone, Amazon Web Services earned $26 billion in revenue for parent company Amazon.) IDG reports that the average cloud budget is up from $1.62 million in 2016 to a whopping $2.2 million today. But cloud adoption continues to present challenges for enterprises of any size. A separate Statista survey identified security, managing cloud spend, governance, and a lack of resources and expertise as significant barriers to adoption.

The Lemongrass survey found that IT leaders were motivated to migrate systems by desires to secure data, maintain data access, save money, optimize storage resources, and accelerate digital transformation. IT management systems were the legacy applications most likely to move to the cloud, with security and ecommerce second- and third-most likely, respectively.

But completing these migrations often isn’t easy — particularly as they’re primed to accelerate in a post-pandemic world.

Security and compliance were listed by 59% of IT leaders as the top challenge facing enterprises when moving legacy systems to the cloud, while 43% of respondents said the migrations took too long. Thirty-eight percent said costs were too high and 33% said a lack of in-house skills was the top complicating factor.

Cloud migrations can cost between $100,000 and $250,000 and rarely come in under budget, IT leaders told Lemongrass. Sixty-eight percent of respondents said it was hard to find people with the proper skills, and 40% said migrations took at least seven months to complete.

IT leaders said that the top lessons they learned while migrating systems to the cloud were:

  1. Dedicate sufficient financial resources
  2. Allow for sufficient time
  3. Ensure the right people and skills are in-house
  4. Ensure the outlined business goals are achieved

Successful migrations to the cloud can lead to a wealth of benefits, surveys show. For example, according to OpsRamp, the average savings from cloud migration come to around 15% of all IT spending. Small and medium businesses benefit the most, spending 36% less on IT as a result. Moreover, 59% of companies report an increase in productivity after migrating apps and services to the cloud, Microsoft says.

“The survey findings are very consistent with feedback we receive from our customers,” Lemongrass CTO Vince Lubsey said. “Enterprises … understand there are challenges but the benefits far outweigh the obstacles. The key to success is following best practices, proper training and time management. It also helps to have the guidance of an experienced partner to create the required cloud operating model.”


Big Data – VentureBeat

Teradata Adds Cloud Talent, Appoints Barry Russell as SVP and GM, Cloud

March 12, 2021   BI News and Info

New executive strengthens cloud expertise and capabilities, accelerating Teradata’s cloud strategy, momentum and growth

Teradata (NYSE: TDC), a leading multi-cloud data warehouse platform provider, today announced that Barry Russell has been appointed Senior Vice President, Business Development and General Manager of Cloud, effective immediately. In this newly created position, Russell is responsible for accelerating Teradata’s cloud and SaaS growth strategy. Russell, who has joined the Company’s executive leadership team, reports directly to Steve McMillan, President and CEO.

“Cloud-first is the cornerstone of Teradata’s strategic transformation and there is no better leader than Barry to ensure we are moving our cloud strategy forward with agility and purpose,” said McMillan. “He will serve as an advisor to all organizations across the company, driving alignment on our cloud growth strategies, advancing our execution and scaling our cloud business. A strong and strategic partner ecosystem with leading cloud service providers is critical to our growth as well, and Barry will also focus on fostering high-impact relationships that bring customers choice. I am confident that Barry’s expertise will advance our multi-cloud platform and drive an even stronger value proposition for our customers and our shareholders alike.”

Russell is a skilled and recognized technology leader with an extensive background driving cloud transformations. Prior to joining Teradata, he led Qumulo’s cloud business unit, and before that, he held senior leadership positions at F5 and Amazon Web Services. At Teradata, Russell will be leading the development and execution of Teradata’s cloud programs, as well as advancing collaboration with leading cloud service providers, including Amazon Web Services, Google Cloud, and Microsoft Azure.

“I am passionate about building and transforming organizations using the cloud to do exceptional things for customers,” said Russell. “Teradata has evolved quickly utilizing the cloud, enabling us to innovate fast on the cloud data analytics needs of enterprises worldwide. Our advantages in today’s multi-cloud world are already helping customers migrate data workflows to the cloud and create data intelligence real-time across their business, and I am ready to help Teradata grow quickly with our customers and partners.”

About Barry Russell
Russell most recently served as Senior Vice President and General Manager of Cloud at Qumulo, responsible for the company’s cloud business including strategy, revenue, and go-to-market execution. Previously, he was Vice President of F5’s global cloud revenue and alliances business. Before that, Russell was GM at Amazon Web Services (AWS), where he built and ran the AWS Marketplace and AWS Service Catalog business, growing AWS Marketplace from scratch into a $1B+ business with responsibility for its business development and operations, marketing, field sales, and solution architecture. Prior to that, he spent nearly six years at Microsoft in business development and enterprise sales positions.

Russell is a member of the Advisory Board for Women in Cloud, Tribute, and the Fred Hutch Computational/Data Intensive Research.

Teradata United States

Make the Most of Your Existing IT and Business Resources With Cloud Integration

February 21, 2021   TIBCO Spotfire

Reading Time: 2 minutes

In 2021, many businesses are working to reset and readjust to new market conditions. This readjustment can involve any number of goals, including returning to revenue growth, cutting costs, boosting innovation, or adapting to new business models. In all of these cases, modernizing a business’s digital platform is a key element to accomplishing any of these goals. 

Regardless of industry, most IT teams must achieve these aims while working within a set budget. As a result, it is essential that they find ways to extract more value from existing business assets and resources. 

This is where TIBCO Cloud Integration comes in. It maximizes flexibility and choice so you can make the most efficient use of your investments.

Accelerate the Creation of Integrations

Your IT team probably has more integration requests than people to service them, leading to long wait times. Yet the average business user, given the right tools, is capable of creating integrations, and these users often have a deeper understanding of business needs and of what an effective integration requires. TIBCO Cloud Integration provides low-code tools that enable people outside the IT department to create integrations in a self-service environment. This reduces the IT bottleneck by spreading integration work across the business, allowing the IT department to focus on complex, critical integrations.

Make More Efficient Use of Compute Resources

With TIBCO Cloud Integration, you can develop event-driven integrations that consume 50x fewer compute resources than alternative Java or Node.js approaches, making them ideal for deployment to lightweight edge devices and to serverless environments such as function-as-a-service (FaaS). This allows you to make the most efficient use of your compute resources, and it increases your scalability by reducing the start-up and shut-down time of your applications.

Reduce the Risk of Business Disruption 

You can further minimize disruption to your critical business processes with tools that monitor and evaluate the performance of the integrations that distribute information across your business. The centralized monitoring dashboard within TIBCO Cloud Integration lets you easily evaluate your integrations’ performance to ensure that critical applications are running smoothly. It also helps you find opportunities to improve application performance and investigate sudden changes that need to be addressed quickly. For example, TIBCO Cloud Integration can easily compare versions of an application to rapidly identify recent changes that may be causing issues.

For more information on the value TIBCO Cloud Integration can provide, check out this short video. You can also head to our YouTube channel for product demos and more, including a detailed demonstration of the TIBCO Cloud monitoring dashboard.

The TIBCO Blog

The Hertz Corporation Selects Teradata Vantage in the Cloud for Scalable & Elastic Analytics Environment

February 10, 2021   BI News and Info

One of the World’s Largest Vehicle Rental Companies Selects Vantage on Amazon Web Services to Modernize Data Analytics Ecosystem

Teradata (NYSE: TDC), the cloud data analytics platform company, today announced that The Hertz Corporation is moving its entire Teradata Global Data Warehouse to Teradata Vantage in the cloud – delivered as-a-service, on Amazon Web Services. Continuing its longstanding relationship with Teradata, Hertz selected the company’s cloud offering to modernize its business without sacrificing the unmatched scale, speed, and analytic sophistication of the Vantage platform.

The Hertz Corporation, a subsidiary of Hertz Global Holdings, Inc., is one of the largest global rental companies, and the Hertz brand is one of the most recognized in the world. The company operates the Hertz, Dollar and Thrifty vehicle rental brands throughout North America, Europe, the Caribbean, Latin America, Africa, the Middle East, Asia, Australia and New Zealand.

“As part of our ongoing business transformation to modernize our technology and reduce costs, we are migrating our legacy analytics environment to the cloud to take advantage of the elasticity and scalability that it provides,” said Manish Agarwal, Vice President, Data & Analytics at The Hertz Corporation. “We selected Teradata Vantage on AWS as a fast and low-risk path to the cloud. We were pleased that the migration effort was timely, on budget, and that performance has exceeded expectations. This move fits well with our strategy and complements our other ongoing efforts to migrate to the cloud.”

With Vantage delivered as-a-service on AWS, The Hertz Corporation can holistically manage its complex, global inventory to ensure optimal customer service and financial performance and derive business-critical insights across its organization at speed and scale. By comprehensively tracking and analyzing transaction data across its global locations, Hertz is also able to create value-added programs that drive revenue and profitability, and result in technology savings.

“We are thrilled that Hertz decided to migrate their on-premises legacy Global Data Warehouse to Teradata Vantage in the Cloud as part of the company’s enterprise-wide modernization efforts – but we’re even prouder that this complex migration was such a success from a budget, timing and performance perspective,” said Ashish Yajnik, SVP Vantage Cloud at Teradata. “The ability for Hertz to cost-effectively query data from across their global operations, at the speed and scale they’ve come to expect from the Vantage platform, is a validation of this modernized technology platform.”

“The technical success of this initiative was due in large part to an effective partnership between Hertz leadership, Teradata and the broader Hertz implementation team — all of whom were committed to the success of this project, as well as the on-time, on-budget delivery,” said Opal Perry, Executive Vice President and Chief Information Officer at Hertz. 

Teradata Vantage is the leading multi-cloud data analytics software platform that enables ecosystem simplification by unifying analytics, data lakes and data warehouses. With Vantage, enterprise-scale companies can eliminate silos and cost-effectively query all their data, all the time, regardless of where the data resides – in the cloud using low-cost object stores, on multiple clouds, on-premises or any combination thereof – to get a complete view of their business. And by combining Vantage with first-party cloud services, Teradata enables customers to expand their cloud ecosystem with deep integration of cloud-specific, cloud-native services.

Teradata United States

Move Your High-performance Computing Environment to the Cloud with TIBCO and Microsoft

February 5, 2021   TIBCO Spotfire

Reading Time: 2 minutes

We’ve got a New Year’s resolution for you: Begin modernizing your grid applications. Running on Microsoft Azure, the TIBCO GridServer® solution provides performance, speed, and analytics benefits without the time, effort, and cost associated with maintenance, software upgrades, facility overhead, and personnel.

Key Benefits of TIBCO GridServer on Microsoft

TIBCO GridServer is a market-leading infrastructure platform for grid and elastic computing—and the backbone of businesses operating in the world’s most demanding markets. By combining Microsoft Azure and GridServer software, your organization can burst workloads to the cloud when the demand arises and reap the following benefits:

  • Accelerated computation-intensive applications: GridServer on Microsoft enables mission-critical risk management, complex pricing, and other computation-intensive applications to execute quickly. With this pairing, computation-intensive workloads are processed in minutes or seconds rather than hours.
  • Elastic autoscaling, monitoring, and security: On Microsoft Azure, TIBCO GridServer infrastructure provides powerful, elastic grid computing and scalability. It can run millions of tasks using tens of millions of data points in parallel, allowing for better performance, quicker results, and more accurate analyses.
  • Improved market risk management systems: GridServer infrastructure also provides the power and speed required for risk calculations and risk-related tasks. Potential use cases include asset pricing, high-frequency trading, trader limit management, and risk reporting.

TIBCO GridServer and Microsoft Azure in Action

One European financial institution with hundreds of software engineers uses GridServer infrastructure extensively when making risk and trading decisions. The engineering team deploys 200,000 virtual cores in multiple grids with many versions of Windows and Linux, providing massive compute power for real-time quantitative analysis and recommendations. Using TIBCO GridServer infrastructure on Microsoft Azure, the bank runs millions of calculations in half the time at 60 percent of the cost.


Register Now to Learn More

If modernizing your high-performance computing (HPC) environment in the cloud is one of your 2021 resolutions, then joining TIBCO and Microsoft for an upcoming session on March 4th is a great way to start off the year. Technology experts will demonstrate how banks and insurance companies can quickly execute mission-critical services such as risk management, complex pricing, and other computation-intensive applications to decrease time-to-results and increase performance levels. Register now!

The TIBCO Blog