
Tag Archives: Needs

Why Healthcare Needs New Data and Analytics Solutions Before the Next Pandemic

February 28, 2021   TIBCO Spotfire


Despite an increase in healthcare funds, hospitals are still unprepared for major global crises. To improve patient care and better weather the next pandemic, healthcare providers can invest in better data and analytics. 

Healthcare Data: An Untapped Opportunity

According to the World Economic Forum, hospitals on average produce 50 petabytes of data per year, which equates to 50 million gigabytes or the space of 50 million CDs. That’s a lot of data. Yet hospitals waste about 97 percent of this data when they could transform it to drive better healthcare outcomes. 
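As a back-of-the-envelope check of those figures (assuming the decimal convention of 1 petabyte = 1,000,000 gigabytes):

```python
# Back-of-the-envelope check of the World Economic Forum figures cited above,
# using the decimal convention (1 PB = 1,000,000 GB).
PB_TO_GB = 1_000_000

annual_data_pb = 50       # petabytes produced per hospital per year, on average
wasted_fraction = 0.97    # share of that data that goes unused

annual_data_gb = annual_data_pb * PB_TO_GB
used_pb = annual_data_pb * (1 - wasted_fraction)

print(f"{annual_data_gb:,} GB per year")   # 50,000,000 GB
print(f"~{used_pb:.1f} PB actually used")  # ~1.5 PB
```

In other words, of 50 petabytes generated, only about 1.5 petabytes ever inform care.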

How can healthcare providers unlock this data through predictive models and unified data platforms so they can improve patient care and save more lives? 

What are some of these data-related opportunities?

  • With patient data spanning numerous diagnostic and care systems, greater data alignment and sharing can give doctors the complete picture of the patient they require. This can accelerate diagnosis and treatment and yield healthier outcomes.
  • With better access and insights from their data, providers can better personalize patient care, saving everyone time and money.
  • And to stay ahead of outbreaks, healthcare analysts can leverage data and analytics to predict and prepare for future healthcare situations.

Intercepting Medical Emergencies Before They Happen 

One example of data at work is the University of Chicago Medicine’s (UCM) predictive models. UCM implemented an eCART solution that uses streaming analytics and a predictive algorithm to predict when a cardiac arrest is most likely to occur. Before a patient goes into cardiac arrest, a doctor is notified and can administer care to avoid a serious complication. UCM has been able to reduce the number of cardiac arrests in the hospital by an estimated 15 to 20 percent. UCM also uses TIBCO solutions to streamline its data for better data management between healthcare and insurance providers.
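The article does not describe eCART's internals, but the pattern it follows is a score computed over streaming vital signs that triggers a notification above a threshold. Purely as an illustration of that pattern, here is a minimal sketch; the weights, thresholds, and field names are invented and do not reflect the actual eCART algorithm:

```python
# Illustrative only: a toy early-warning score over streaming vital signs.
# The weights, thresholds, and field names are invented for this sketch
# and do NOT reflect the actual eCART algorithm.

def risk_score(vitals: dict) -> int:
    """Score one reading; higher means greater deterioration risk."""
    score = 0
    if vitals["heart_rate"] > 130 or vitals["heart_rate"] < 40:
        score += 3
    if vitals["resp_rate"] > 28:
        score += 2
    if vitals["systolic_bp"] < 90:
        score += 3
    return score

ALERT_THRESHOLD = 4  # notify a clinician at or above this score

def should_alert(stream) -> bool:
    """Return True as soon as any reading in the stream crosses the threshold."""
    return any(risk_score(v) >= ALERT_THRESHOLD for v in stream)

readings = [
    {"heart_rate": 88,  "resp_rate": 16, "systolic_bp": 118},
    {"heart_rate": 135, "resp_rate": 30, "systolic_bp": 85},  # deteriorating
]
print(should_alert(readings))  # True
```

The value of the streaming approach is that the alert fires on the first reading that crosses the threshold, hours before a crisis, rather than after a retrospective review.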

Predictive Healthcare Analytics Saves Lives

Post-surgical infections cause rehospitalization, often leading to increased healthcare costs, complications, and patient deaths. Using accessible analytics, the University of Iowa Hospitals and Clinics reduced surgical infections by 74 percent. Combining patient care and historical data, the hospital developed a predictive model in a real-time environment, enabling faster, more accurate decision-making and reducing costs by $2.2 million.

Predictive analytical models like the ones featured above are critical during a mass pandemic. Because of COVID-19, many segments of the population undergo monthly testing, which is time-sensitive and costly. The estimated annual cost of COVID-19 diagnostic tests was at least $6 billion, and in a mass-testing scenario, the cost jumps to $25 billion. By predicting the rates and likelihood of infection, hospitals can better prepare testing sites where they are needed, reducing the number of unneeded tests and the associated costs.

Currently, COVID-19 testing is not always efficient or accurate enough for optimal diagnosis and care. According to a recent study published in the Annals of Internal Medicine, the median false-negative rate was 38 percent on the day of symptom onset. It decreased to 20 percent on the third day and then rose to 66 percent in the following days. False negatives result in sick individuals infecting more of the population. More data and better analytical models can help improve testing efficacy before the next health crisis.
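Using the rates reported in that study, a rough calculation shows why test timing matters so much at scale:

```python
# Rough illustration using the median false-negative rates cited above:
# expected missed infections per 10,000 infected people tested, by day.
false_negative_rate = {
    "symptom onset": 0.38,
    "day 3":         0.20,
    "later days":    0.66,
}

tested = 10_000
for day, fnr in false_negative_rate.items():
    missed = round(tested * fnr)
    print(f"{day}: ~{missed:,} infections missed")
# symptom onset: ~3,800 infections missed
# day 3:         ~2,000 infections missed
# later days:    ~6,600 infections missed
```

Shifting when people are tested, not just how many are tested, changes the number of missed infections by thousands per ten thousand cases.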

Transforming Data Ahead of Future Global Crises

Serving seemingly endless demand for high-quality care will always be top of the list for healthcare providers. Investments in more unified patient data and more intelligent analytics can significantly improve patient care, every day and in times of crisis. 


To learn more about TIBCO’s role in making greater data-driven insight a reality for healthcare providers, check out this recent webinar. And to learn more about data’s role in patient care, stay tuned for an upcoming blog from Dennis MacLaughlin.


The TIBCO Blog


New Customer Experience Needs and Commerce Trends for 2021

February 24, 2021   CRM News and Info

By Jack M. Germain


As consumers get comfortable with their newfound digital wallets and gift cards, marketers must continue to adapt their strategies to changes in shopping behavior to better finesse the customer experience.

Both consumers and vendors have had to adjust non-stop. Lockdowns and social distancing requirements accelerated the adoption of new technologies. Commerce trends that were on the horizon pre-COVID-19 were suddenly adopted at a brisk pace. Online food ordering, curbside pickup, and BOPIS (buy online, pick up in store) are presenting new challenges to store owners and brand marketers.

Commerce analysts do not see consumers shedding their newfound buying options in a post-pandemic marketplace. Concerns about health safety, social distancing, and remote working will remain center stage in the daily lives of millions of shoppers.

So brands must continue to assess how best to respond to the dramatically changing commerce landscape. How brands deliver customer experience (CX) will determine where and how consumers continue to shop.

Four trends about customer experience and the new commerce will define 2021 and beyond, according to Jennifer Conklin, sector lead of unified commerce at Capgemini North America. Contactless customer experience, omnichannel shopping, personalization and changing customer journeys, and voice commerce will power the customer experience engine going forward.

“Consumer shopping and spending behavior have significantly shifted since the pandemic began in March 2020. Recent Capgemini research showed that 48 percent of holiday season purchases were for essential items, with consumers prioritizing clothing (36 percent), beauty/personal care products (21 percent), and electrical items (21 percent),” Conklin told CRM Buyer.

As for luxury products, Capgemini research showed that 47 percent of consumers expected a decrease in spend over the holidays while 29 percent predicted an increase in luxury purchases, she noted.

New Normal Sales Tools

Not all analysts are confident that consumers will ever return to brick-and-mortar stores as their primary shopping venues. Conklin, however, is sure the four commerce drivers she identified have staying power. Her reasons make sense.

A contactless customer experience is one of the main demands indicated by consumers. Retailers that rolled out simple curbside options during the pandemic will put high-tech BOPIS and curbside offerings in place. Many shoppers still do not want to linger and browse in-store.

Omnichannel shopping has proven its value to shoppers looking for reliable delivery and better pricing options. Merchants who demonstrate that they are able to quickly get products to consumers, resolve issues with customer service, and provide fast delivery and returns will be the ones that thrive.

Personalization and changing customer journeys are the new sales tools. As brands look to understand new customer journeys, they must get creative online to improve engagement and increase customer loyalty, Conklin suggested.

Voice commerce is emerging as a shopping tool just as voice commands are finding new uses in smart homes and electronic gadgets. Retailers will try to figure out how they can use voice to make the customer experience even more engaging.

Safety is still at the forefront of consumer concerns, noted Conklin. Last year, Capgemini research revealed that 77 percent of consumers expect to increase their use of touchless technologies to avoid interactions that require physical contact.

Her company’s research found that 59 percent of consumers prefer to use voice interfaces in public places during the pandemic. Researchers do not expect that percentage to shrink in a post-pandemic era.

“If this is not on merchants’ 2021 digital road maps, it needs to be added,” she urged.

The Journey Counts

To better understand the changing directions of shopping journeys, merchants should invest more effort in personalization, according to Conklin. This helps customers feel known and valued as they make their purchasing decisions.

“While there are several degrees of personalization capabilities, merchants can start small by incentivizing customers to create account profiles and fine-tune their segmentation efforts so the organization can reach out to the customer with the right message at the right time,” she offered.

Data also plays an integral role when it comes to the success of personalization and omnichannel efforts. Companies need to ensure they are working with one central view of their customer across the organization from sales, service, marketing, and commerce, Conklin said.

“No matter who in the company is communicating with the customer, they need to have the relevant data at their fingertips to be successful in their role. This is also critical in order to deliver a consistent, seamless experience to the customer across every touchpoint in the customer journey,” she explained.

This data and direct customer feedback can influence product sets as well. That enables retailers to further refine their inventory strategies cross-channel/cross-market, she added.

What’s Ahead

Consumers’ modes of interaction and habits have changed and are continuing to do so as we adapt to the “new normal,” according to Durk Stelter, CRO of Linc, a CX automation platform provider in Sunnyvale, Calif. This year will bring further transition and change to retail. Shoppers’ expectations continue to rise for anywhere, anytime interactions with brands.

“As the fixed boundary between workplace and home has eroded, so has the divide between daytime computer use and leisure time on mobile. Amidst overlapping worlds, digital shopping has become omnipresent and around the clock — shifting among devices and following shoppers around their homes, into their cars, and on their cautious forays into the outside world,” Stelter told CRM Buyer.

These trends will stick for now. However, as stores reopen, the trends will likely evolve, creating more cohesion between the online and in-store experience, suggested Shelly Socol, co-founder of 1R, a digital commerce and retail strategy agency in New York City. The online buying experience will continue to evolve and grow so it is inevitable that the trends will morph.

“However, the trends we are seeing today are ahead of their time due to the pandemic. Both merchants and consumers have progressed by leaps and bounds over the past year,” Socol told CRM Buyer.

Merchants have been forced to build more robust shopping experiences and offer high-touch customer service. Consumers, on the other hand, have had to get used to shopping online more often, she described.

“What might have been once foreign and uncomfortable for them has become a standard, and it is likely consumers will not revert back to shopping only in-store even when they are fully open,” predicted the 1R co-founder.

Differing CX Realities

Managing CX is becoming different now for in-store commerce versus e-commerce, according to Capgemini’s Conklin. In-store traffic remains at an all-time low. But e-commerce channels have invested heavily in robust customer experience capabilities.

“Since customers do not want to browse and shop in-store, the online digital experience needs to mimic the in-store experience. This means intuitive navigation, detailed product pages with full imagery, and personalized technology to foster loyalty,” advised Conklin.

Brands and retailers will also start to invest more in immersive technologies to bring products to life and embed this functionality into their sites, she noted. This will enable customers to configure products using a 3D configurator, augmented reality, or virtual photography.

“Once the pandemic subsides, the in-store shopping experience will return and likely be more immersive than ever before. Stores will likely carry less inventory and allocate space to be utilized for unique and engaging experiences such as product demonstrations, classes, spa treatments, cafes, and so much more where customers can spend time in-store,” she predicted.

CX in general has dramatically changed since the pandemic started. Customers expect 24/7 individualized personalization support on everything from pre-purchase information, to order support, returns, and loyalty and membership information, observed Linc’s Stelter.

“The rising degree of difficulty for customer service interactions requires organization-wide responsiveness and flexibility. As brands increasingly turn to automated solutions to help manage the volume of inquiries, the quality of digital-human interactions is crucial,” he said.

Create Seamless Shopping Experiences

A primary consideration is changing how marketers use chatbots, as Stelter sees it. To meet the challenges of 2021, digital interactions must be adaptive and empower the consumer to drive the conversation.

In order to improve their customer experience, merchants need to focus on accessibility, noted Meghan Brophy, retail and e-commerce analyst at Fit Small Business. That is one aspect of online shopping that has been neglected for too long.

“To truly offer a great customer experience, merchants need to make online shopping accessible to all. Simple changes like labeling form fields, adding alt text to images, and not using strikethroughs to show sale prices can make a big difference,” she told CRM Buyer.
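The changes Brophy lists lend themselves to automated auditing. As a minimal sketch using only Python's standard-library HTML parser (the two rules checked, images needing alt text and form inputs needing associated labels, mirror her examples, but the checker itself is our illustration, not a tool she or Fit Small Business provides):

```python
# A toy accessibility linter for two of the issues mentioned above:
# images missing alt text and form inputs with no associated <label>.
# Standard library only; real audits should use a dedicated tool.
from html.parser import HTMLParser

class A11yChecker(HTMLParser):
    def __init__(self):
        super().__init__()
        self.issues = []
        self.labeled_ids = set()
        self.input_ids = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "img" and not attrs.get("alt"):
            self.issues.append("img missing alt text")
        elif tag == "label" and "for" in attrs:
            self.labeled_ids.add(attrs["for"])
        elif tag == "input":
            self.input_ids.append(attrs.get("id"))

    def report(self):
        # Flag any input whose id never appeared in a <label for="...">.
        for input_id in self.input_ids:
            if input_id not in self.labeled_ids:
                self.issues.append(f"input {input_id!r} has no label")
        return self.issues

checker = A11yChecker()
checker.feed('<img src="sale.png"><label for="qty">Qty</label>'
             '<input id="qty"><input id="email">')
issues = checker.report()
print(issues)  # ['img missing alt text', "input 'email' has no label"]
```

Even a crude pass like this surfaces the kind of low-cost fixes Brophy describes before they reach customers.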

A seamless shopping experience is more important than ever. Shoppers are starting and completing buying journeys using a mix of channels, and they all need to work together smoothly, explained Brophy.

For example, a customer might start on a brand’s Instagram page and add items to the cart. Then the customer visits the website later to complete the purchase and picks up the order in-store.

Many options exist for brands to maximize the online customer experience. Helpful and fast customer service is key, in addition to free shipping and easy returns. SMS is also a rising form of communication with consumers and is becoming a must, offered 1R’s Socol.

Brands should also build and utilize flexible landing pages populated with both content and products. These pages can form the foundation for marketing purposes to drive traffic. Brands can create storytelling experiences that complement the website and allow the brand to produce unique content for different target audiences.


Jack M. Germain has been an ECT News Network reporter since 2003. His main areas of focus are enterprise IT, Linux and open-source technologies. He is an esteemed reviewer of Linux distros and other open-source software. In addition, Jack extensively covers business technology and privacy issues, as well as developments in e-commerce and consumer electronics. Email Jack.


CRM Buyer


What your business needs to know about CPRA

November 8, 2020   Big Data


Having won a narrower-than-expected mandate of 56 percent on November 3, the California Privacy Rights Act (CPRA) has now passed. This new act overhauls the preexisting California Consumer Privacy Act (CCPA) and marks a landmark moment for consumer privacy.

In essence, the CPRA closes some potential loopholes in the CCPA – but the changes are not uniformly more stringent for businesses (as I’ll show in a moment). It also moves California’s data protection laws closer to the EU’s GDPR standard. When the CPRA becomes legally enforceable in 2023, California residents will have a right to know where, when, and why businesses use their personally identifiable data. With many of the world’s leading tech companies based in California, this act will have national and potentially global repercussions.

The increased privacy is undoubtedly good news to consumers. But the act’s passage is likely to create concern among businesses that depend on customer data. With stricter enforcement, harsher penalties, and more onerous obligations, many companies are likely to wonder whether this new law will make operating more difficult.

While many of the finer details of the CPRA are likely to change before it becomes enforceable, here’s what your business needs to know right now.

Will you be subject to the CPRA?

The preexisting CCPA law applied only to businesses that:

1) had more than $25 million in gross revenue,

2) derived 50% or more of their annual revenue from selling consumers’ personal information, or

3) bought, sold, or shared for commercial purposes the personal information of 50,000 or more consumers, households, or devices.

The CPRA keeps most of these requirements intact but makes a few changes. First, the revenue requirement (point 1 above) is now clearer: A company must have made $25 million in gross revenue in the previous calendar year to become subject to the law.

Second, when it comes to personal information (point 2), sharing is now considered the same as selling. While the CCPA applied to businesses that made more than half their revenue from selling data, the CPRA now also applies to companies that make half their revenue from sharing personal information with third parties.

Finally, point 3 is now more lenient, with the threshold for personal information-based businesses raised from 50,000 consumers, households, or devices to 100,000.
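The three revised thresholds reduce to a simple predicate. The sketch below encodes the rules exactly as described above; it is an illustration, not legal advice, and the statute has more nuance than three comparisons:

```python
# Sketch of the CPRA applicability thresholds described above.
# Illustration only -- the statute has more nuance; not legal advice.
def subject_to_cpra(prior_year_revenue: float,
                    share_of_revenue_from_selling_or_sharing_pi: float,
                    consumers_whose_pi_is_bought_sold_or_shared: int) -> bool:
    return (
        prior_year_revenue > 25_000_000                            # (1) prior calendar year
        or share_of_revenue_from_selling_or_sharing_pi >= 0.5      # (2) selling OR sharing
        or consumers_whose_pi_is_bought_sold_or_shared >= 100_000  # (3) raised from 50,000
    )

# A data broker with modest revenue but 120,000 consumers' records:
print(subject_to_cpra(5_000_000, 0.1, 120_000))  # True
# A small firm under every threshold (it WOULD have met the old 50,000 cutoff):
print(subject_to_cpra(5_000_000, 0.1, 60_000))   # False
```

The second example shows the leniency change in point 3: a business handling 60,000 consumers' records was covered by the CCPA but falls outside the CPRA.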

For businesses wondering if they can avoid regulations for sister companies under the same brand, the CPRA has clarified what the term “common branding” means: “a shared name, service mark, or trademark, such that the average consumer would understand that two or more entities are commonly owned.”

It also specifies that a sister business will fall under the CPRA if it has “personal information shared with it by the CPRA-subject business.” In practical terms, this means that two related businesses (one of which is subject to the CPRA) that share a trademark but are distinct legal entities will be subject to the CPRA only if they share data. The same joint responsibility for consumer information also applies to partnerships where a shared interest of more than 40 percent exists, regardless of branding.

So with the CPRA, some businesses are now more likely to become subject to data protection legislation while others may no longer fall under the Californian legislation.

For organizations that operate multiple legal entities, it is still ideal to have a one-size-fits-all approach to consumer data privacy. By allowing non-subject businesses to self-certify that they are compliant, the CPRA also gives companies an opportunity to be transparent with their customers about data usage even if they do not necessarily need to be.

Consumers have a right to know why you’re collecting their ‘sensitive personal information’

The CPRA will give consumers additional rights to determine how businesses use their data. As well as gaining the right to correct their personal information and to know how long a company might store it, consumers will be able to opt out of geolocation-based ads and of allowing their sensitive personal information to be used.

The concept of “sensitive personal information” is itself a new legal definition created by the CPRA. Race/ethnic origin, health information, religious beliefs, sexual orientation, Social Security number, biometric/genetic information, and personal message contents all fall under this definition.

Businesses also need to be careful when it comes to dealing with data they have already collected. Suppose a company plans to reuse a customer’s data for a purpose that is “incompatible with the disclosed purposes for which the personal information was collected.” In that case, the customer needs to be informed of this change.

Similarly to the CCPA, employee data now falls under the CPRA. While this won’t be legally enforceable until 2023, one stipulation of the CPRA is that businesses will need to be transparent with their staff regarding data collection.

Businesses will soon need to give consumers more comprehensive opt-out abilities whenever they interact with them, but it may still take a while before unified standards around these procedures become commonplace. Undoubtedly there will be more than one way to communicate consumer requirements within the CPRA framework. Besides opt-out forms, businesses may increase their use of the Global Privacy Control standard, a browser add-on that simplifies opt-out processes. However, as geolocated targeting becomes more legally problematic, companies may need to reconsider reliance on some forms of targeted advertising.
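On the technical side, the Global Privacy Control proposal signals the opt-out as a `Sec-GPC: 1` request header (with a matching `navigator.globalPrivacyControl` property in the browser). A server-side check can be as simple as the following sketch, written against the header name in the GPC draft specification:

```python
# Minimal check for a Global Privacy Control opt-out signal.
# Per the GPC draft specification, participating browsers send the
# request header "Sec-GPC: 1"; anything else means no signal was sent.
def gpc_opt_out(headers: dict) -> bool:
    """Return True if the request carries a GPC opt-out signal."""
    return headers.get("Sec-GPC") == "1"

print(gpc_opt_out({"Sec-GPC": "1", "User-Agent": "ExampleBrowser"}))  # True
print(gpc_opt_out({"User-Agent": "ExampleBrowser"}))                  # False
```

A real deployment would route a `True` result into the same downstream handling as an explicit opt-out form submission.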

There will be fines for data breaches

The CPRA stipulates that “businesses should also be held directly accountable to consumers for data security breaches.” As well as requiring businesses to “notify consumers when their sensitive information has been compromised,” the CPRA sets out financial penalties. Companies that allow customer data to be leaked will face fines of up to $2,500 or $7,500 (for data belonging to minors) per violation. The newly formed California Privacy Protection Agency will be authorized to enforce these fines.
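Because the fines apply per violation, exposure scales with the number of affected records. A rough illustration using the per-violation amounts above (the breach sizes are invented for the example):

```python
# Rough maximum-exposure estimate using the per-violation fines above:
# up to $2,500 per violation, $7,500 where the data belongs to a minor.
FINE_ADULT = 2_500
FINE_MINOR = 7_500

def max_breach_exposure(adult_records: int, minor_records: int) -> int:
    return adult_records * FINE_ADULT + minor_records * FINE_MINOR

# A hypothetical breach exposing 10,000 adult and 500 minor records:
print(f"${max_breach_exposure(10_000, 500):,}")  # $28,750,000
```

Even a mid-sized breach can therefore carry a theoretical ceiling in the tens of millions of dollars, which is the financial-risk shift the next paragraph describes.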

While in the short term a relatively limited budget means the agency is likely to undertake only a few large-scale legal actions, every business will face increased financial risk related to data breaches. As the CPRA raises the stakes for businesses regarding data protection, threat actors are likely to be further emboldened. In the EU, the GDPR has been linked to an increased incidence of ransomware, as hackers use the threat of fines as leverage to extract larger ransoms from their victims.

In this respect, compliance will mean adopting stronger organizational security postures through increased multi-factor authentication use and zero trust protocols. It is likely to drive up the costs of cybersecurity business insurance as well.

You have until 2023 but shouldn’t delay

While the CPRA will not become law until January 1, 2023, its regulations will apply to all information collected from January 1, 2022, onwards. So, as of now, you have over two years to prepare. However, as seen in polls from earlier this year, the vast majority of businesses have yet to comply even with the currently enforceable CCPA legislation.

The timeline for compliance with CPRA is relatively generous. As both regulators and businesses rush to catch up with their new obligations, it is unlikely that companies will face a torrent of legal action in the short term.

Nevertheless, in the longer term, the CPRA is likely to drive further legislation across the US. This law may be the beginning of a push towards federal-level data protection regulations, which will have similar rules, requirements, and penalties for businesses, regardless of where their customers are. Companies should start preparing for a future where customer data is legally protected now.

Rob Shavell is a cofounder and CEO of online privacy company Abine / DeleteMe and has been a vocal proponent of privacy legislation reform, including as a public advocate of the California Privacy Rights Act (CPRA).





Big Data – VentureBeat


AI needs systemic solutions to systemic bias, injustice, and inequality

July 18, 2020   Big Data



At the Diversity, Equity, and Inclusion breakfast at VentureBeat’s AI-focused Transform 2020 event, a panel of AI practitioners, leaders, and academics discussed the changes that need to happen in the industry to make AI safer, more equitable, and more representative of the people to whom AI is applied.

The wide-ranging conversation was hosted by Krystal Maughan, a Ph.D. candidate at the University of Vermont, who focuses on machine learning, differential privacy, and provable fairness. The group discussed the need for higher accountability from tech companies, inclusion of multiple stakeholders and domain experts in AI decision making, practical ways to adjust AI project workflows, and representation at all stages of AI development and at all levels — especially where the power brokers meet. In other words, although there are systemic problems, there are systemic solutions as well.

Tech company accountability

The old Silicon Valley mantra “move fast and break things” has not aged well in the era of AI. It presupposes that tech companies exist in some sort of amoral liminal space, apart from the rest of the world where everything exists in social and historical contexts.

“We can see all around the world that tech is being deployed in a way that’s pulling apart the fabric of our society. And I think the reason why is because … tech companies historically don’t see that they’re part of that social compact that holds society together,” said Will Griffin, chief ethics officer at Hypergiant.

Justin Norman, vice president of data science at Yelp, agreed, pointing out the power that tech companies wield because they possess tools that can be incredibly dangerous. “And so not only do they have an ethical responsibility, which is something they should do before they’ve done anything wrong, they also have a responsibility to hold themselves accountable when things go wrong.”

But, Norman added, we — all of us, the global community — have a responsibility here as well. “We don’t want to simply accept that any kind of corporation has unlimited power against us, any government has unlimited power over us,” he said, asserting that people need to educate themselves about these technologies so when they encounter something dubious, they know when to push back.

Both Griffin and Ayodele Odubela, a data scientist at SambaSafety, pointed out the strength of the accountability that communities can bring to bear on seemingly immovable institutions. Griffin called Black Lives Matter activists “amazing.” He said, “Those kids are right now the leaders in AI as well, because they’re the ones who identified that law enforcement was using facial recognition, and through that pressure on institutional investors — who were the equity holders of these large corporations — it forced IBM to pull back on facial recognition, and that forced Microsoft and Amazon to follow suit.” That pressure, which surged in the wake of the police killing of George Floyd, has apparently also begun to topple the institution of law enforcement as we know it by amplifying the movement to defund the police.

Odubela sees the specter of law enforcement’s waning power as an opportunity for good. Defunding the police actually means funding things like social services, she argues. “One of the ideas I really like is trying to take some of these biased algorithms and really repurpose them to understand the problems that we may be putting on the wrong kind of institutions,” she said. “Look at the problems we’re putting on police forces, like mental illness. We know that police officers are not trained to deal with people who have mental illnesses.”

These social and political victories should ideally lead to policy changes. In response to Maughan’s question about what policy changes could encourage tech companies to get serious about addressing bias in AI, Norman pulled it right back to the responsibility of citizens in communities. “Policy and law tell us what we must do,” he said. “But community governance tells us what we should do, and that’s largely an ethical practice.”

“I think that when people approach issues of diversity, or they approach issues of ethics in the discipline, they don’t appreciate the challenge that we’re up against, because … engineering and computer science is the only discipline that has this much impact on so many people that does not have any ethical reasoning, any ethical requirements,” Griffin added. He contrasted tech with fields like medicine and law, which have made ethics a core part of their educational training for centuries, and where practitioners are required to hold a license issued by a governing body.

Where it hurts

Odubela took these thoughts a step beyond the need for policy work by saying, “Policy is part of it, but a lot of what will really force these companies into caring about this is if they see financial damages.”

For businesses, their bottom line is where it hurts. One could argue that it’s almost crass to think about effecting change through capitalist means. On the other hand, if companies are profiting from questionable or unjust artificial intelligence products, services, or tools, it follows that justice could come by eliminating that incentive.

Griffin illustrated this point by talking about facial recognition systems that big tech companies have sold, especially to law enforcement agencies — how none of them were vetted, and now the companies are pulling them back. “If you worked on computer vision at IBM for the last 10 years, you just watched your work go up in smoke,” he said. “Same at Amazon, same at Microsoft.”

Another example Griffin gave: A company called Practice Fusion digitizes electronic health records (EHR) for smaller doctors’ offices and medical practices and runs machine learning on those records, as well as other outside data, and helps provide prescription recommendations to caregivers. AllScripts bought Practice Fusion for $100 million in January 2018. But a Department of Justice (DoJ) investigation discovered that Practice Fusion was getting kickbacks from a major opioid company in exchange for recommending those opioids to patients. In January 2020, the DoJ levied a $145 million fine in the case. On top of that, as a result of the scandal, “AllScripts’ market cap dropped in half,” Griffin said.

“They walked themselves straight into the opioid crisis. They used AI really in the worst way you can use AI,” he added.

He said that although that’s one specific case that was fully litigated, there are more out there. “Most companies are not vetting their technologies in any way. There are land mines — AI land mines — in use cases that are currently available in the marketplace, inside companies, that are ticking time bombs waiting to go off.”

There’s a reckoning growing on the research side, too, as in recent weeks both the ImageNet and 80 Million Tiny Images data sets have been called to account over bias concerns.

It takes time, thought, and expense to ensure that your company is building AI that is just, accurate, and as free of bias as possible, but the “bottom line” argument for doing so is salient. Any AI system failures, especially around bias, “cost a lot more than implementing this process, I promise you,” Norman said.

Practical solutions: workflows and domain experts

These problems are not as intractable as they may seem. There are practical solutions companies can employ, right now, to radically improve the equity and safety of the ideation, design, development, testing, and deployment of AI systems.

A first step is bringing in more stakeholders to projects, like domain experts. “We have a pretty strong responsibility to incorporate learnings from multiple fields,” Norman said, noting that adding social science experts is a great complement to the skill sets that practitioners and developers possess. “What we can do as a part of our own power as people who are in the field is incorporate that input into our designs, into our code reviews,” he said. At Yelp, they require that a project passes an ethics and diversity check at all levels of the process. Norman said that as they go, they’ll pull in a data expert, someone from user research, statisticians, and those who work on the actual algorithms to add some interpretability. If they don’t have the right expertise in-house, they’ll work with a consultancy.

“From a developer standpoint, there actually are tools available for model interpretability, and they’ve been around for a long time. The challenge isn’t necessarily always that there isn’t the ability to do this work — it’s that it’s not emphasized, invested in, or part of the design development process,” Norman said. He added that it’s important to make space for the researchers who are studying the algorithms themselves and are the leading voices in the next generation of design.
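Norman’s point that interpretability tooling already exists can be made concrete. The toy sketch below is a hypothetical example (not Yelp’s process or any particular library): it implements permutation importance from scratch by shuffling one feature’s values and measuring how much the model’s error grows. Features the model ignores score near zero.

```python
import random

# Toy "model": a fixed linear scorer over three features.
# In practice this would be any trained model's predict function.
def predict(row):
    w = [2.0, 0.5, 0.0]  # feature 2 is irrelevant by construction
    return sum(wi * xi for wi, xi in zip(w, row))

def mse(model, X, y):
    return sum((model(x) - yi) ** 2 for x, yi in zip(X, y)) / len(X)

def permutation_importance(model, X, y, feature, rng):
    """Error increase when one feature's values are shuffled."""
    baseline = mse(model, X, y)
    col = [x[feature] for x in X]
    rng.shuffle(col)
    X_perm = [x[:feature] + [v] + x[feature + 1:] for x, v in zip(X, col)]
    return mse(model, X_perm, y) - baseline

rng = random.Random(0)
X = [[rng.random() for _ in range(3)] for _ in range(200)]
y = [predict(x) for x in X]  # labels generated by the same scorer

scores = [permutation_importance(predict, X, y, f, rng) for f in range(3)]
print(scores)  # the irrelevant feature scores zero; heavier weights score higher
```

In practice, teams typically reach for established implementations of this and related techniques rather than rolling their own, which is exactly Norman’s point: the tools exist, and the missing piece is making room for them in the development process.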

Griffin said that Hypergiant has a heuristic for its AI projects called “TOME,” for “top of mind ethics,” which they break down by use case. “With this use case, is there a positive intent behind the way we intend to use the technology? Step two is where we challenge our designers, our developers, [our] data scientists … to broaden their imaginations. And that is the categorical imperative,” he said. They ask what the world would look like if everyone in their company, the industry, and in the world used the technology for this use case — and they ask if that is desirable. “Step three requires people to step up hardcore in their citizenship role, which is [asking the question]: Are people being used as a means to an end, or is this use case designed to benefit people?”

Yakaira Núñez, a senior director at Salesforce, said there’s an opportunity right now to change the way we do software development. “That change needs to consider the fact that anything that involves AI is now a systems design problem,” she said. “And when you’re embarking upon a systems design problem, then you have to think of all of the vectors that are going to be impacted by that. So that might be health care. That might be access to financial assistance. That might be impacts from a legal perspective, and so on and so forth.”

She advocates to “increase the discovery and the design time that’s allocated to these projects and these initiatives to integrate things like consequence scanning, like model cards, and actually hold yourself accountable to the findings … during your discovery and your design time. And to mitigate the risks that are uncovered when you’re doing the systems design work.”

Odubela brought up the issue of how to uncover the blind spots we all have. “Sometimes it does take consulting with people who aren’t like us to point these [blind spots] out,” she said. “That’s something that I’ve personally had to do in the past, but taking that extra time to make sure we’re not excluding groups of people, and we’re not baking these prejudices that already exist in society straight into our models — it really does come [down] to relying on other people, because there are some things we just can’t see.”

Núñez echoed Odubela, noting that “As a leader you’re responsible for understanding and reflecting, and being self aware enough to know that you have your biases. It’s also your responsibility to build a board of advisors that keeps you in check.”

“The key is getting it into the workflows,” Griffin noted. “If it doesn’t get into the workflow, it doesn’t get into the technology; if it doesn’t get into the technology, it won’t change the culture.”

Representation

Not much of this is possible, though, without improved representation of underrepresented groups in critical positions. As Griffin pointed out, this particular panel comprises leaders who have the decision-making power to implement practical changes in workflows right away. “Assuming that [the people on this panel] are in a position to flat-out stop a use case, and say ‘Listen, nope, this doesn’t pass muster, not happening’ — when developers, designers, data scientists know that they can’t run you over, they think differently,” he said. “All of a sudden everyone becomes a brilliant philosopher. Everybody’s a social scientist. They figure out how to think about people when they know their work will not go forward.”

But that’s not the case within enough companies, even though it’s critically important. “The subtext here is that in order to execute against this, this also means that you have to have a very diverse team applying the lens of the end user, the lens of those impacted into that development lifecycle. Checks and balances have to be built in from the start,” Núñez said.

Griffin offered an easy-to-understand benchmark to aim for: “For diversity and inclusion, when you have African Americans who have equity stakes in your company — and that can come in the form of founders, founding teams, C-suite, board seats, allowed to be investors — when you have diversity at the cap table, you have success.”

And that needs to happen fast. Griffin said that although he’s seeing lots of good programs and initiatives coming out of the companies whose boards he sits on, like boot camps, college internships, and mentorship programs, they’re not going to be immediately transformative. “Those are marathons,” he said. “But nobody on these boards I’m with got into tech to run a marathon — they got in to run a sprint. … They want to raise money, build value, and get rewarded for it.”

But we are in a unique moment that portends a wave of change. Griffin said, “I have never in my lifetime seen a time like the last 45 days, where you can actually come out, use your voice, have it be amplified, without the fear that you’re going to be beaten back by another voice saying, ‘We’re not thinking about that right now.’ Now everybody’s thinking about it.”


Big Data – VentureBeat


Everyone needs to see this.

June 14, 2020   Humor

Posted by Krisgo


About Krisgo

I’m a mom who has worn many different hats in this life; from scout leader, camp craft teacher, parents group president, colorguard coach, member of the community band, and stay-at-home-mom to full-time worker, I’ve done it all – almost! I still love learning new things, especially creating and cooking. Most of all I love to laugh! Thanks for visiting – come back soon 🙂



Deep Fried Bits


The AI community says Black Lives Matter, but more work needs to be done

June 6, 2020   Big Data

This week, as thousands of protestors marched in cities around the U.S. to bring attention to the death of George Floyd, police brutality, and abuses at the highest levels of government, members of the AI research community made their own small gestures of support. NeurIPS, one of the world’s largest AI and machine learning conferences, extended its technical paper submission deadline by 48 hours. And researchers pledged to match donations to Black in AI, a nonprofit promoting the sharing of ideas, collaborations, and discussion of initiatives to increase the presence of black people in the field of AI.

“NeurIPS grieves for its Black community members devastated by the cycle of police and vigilante violence. [We] mourn … for George Floyd, Breonna Taylor, Ahmaud Arbery, Regis Korchinski-Paquet, and thousands of black people who have lost their lives to this violence. [And we stand] with its black community to affirm that, today and every day, black lives matter,” the NeurIPS board wrote in a statement announcing its decision.

In a separate, independent effort aimed at spurring mentors to reach out to black researchers as they finalize their NeurIPS submissions, Google Brain scientist Nicolas Le Roux and Google AI lead Jeff Dean pledged to contribute $1,000 to Black in AI for every person who receives assistance.

For the AI community, acknowledgment of the movement is a start, but research shows that it — much like the rest of the tech industry — continues to suffer from a lack of diversity. According to a survey published by New York University’s AI Now Institute, as of April 2019, only 2.5% of Google’s workforce was black, while Facebook and Microsoft were each at 4%. This lack of representation is problematic on its face, but it also risks replicating or perpetuating historical biases and power imbalances, like image recognition services that make offensive classifications and chatbots that channel hate speech. In something of a case in point, a National Institute of Standards and Technology (NIST) study last December found that facial recognition systems misidentify black people more often than white people.


“Despite many decades of ‘pipeline studies’ that assess the flow of diverse job candidates from school to industry, there has been no substantial progress in diversity in the AI industry. The focus on the pipeline has not addressed deeper issues with workplace cultures, power asymmetries, harassment, exclusionary hiring practices, unfair compensation, and tokenization that are causing people to leave or avoid working in the AI sector altogether,” the AI Now Institute report concluded. “[AI] bias mirrors and replicates existing structures of inequality in the [industry and] society.”

Some solutions proposed by the AI Now Institute and others include greater transparency with respect to salaries and compensation, harassment and discrimination reports, and hiring practices. Others are calling for targeted recruitment to increase employee diversity, along with commitments to bolster the number of people of color, women, and other underrepresented groups at leadership levels of AI companies.

But it’s an uphill battle. An analysis published in Proceedings of the National Academy of Sciences earlier this year found that women and people of color in academia produce scientific novelty at higher rates than white men, but those contributions are often “devalued and discounted” in the context of hiring and promotion. And Google, one of the largest and most influential AI companies on the planet, reportedly scrapped diversity initiatives in May over concern about a conservative backlash.

As my colleague Khari Johnson recently wrote, many AI companies pay lip service to the importance of diversity. That was never acceptable, particularly considering that venture capital for AI startups reached record levels in 2018. But at this juncture, as Americans are forced to come to terms with systemic racism, it seems downright inexcusable.


Big Data – VentureBeat


Data Virtualization Needs to be Part of your Data Integration Toolbox

May 10, 2020   TIBCO Spotfire

Reading Time: 3 minutes

Any craftsman worth their salt has a deep toolbox, knowing that the right specialty tool at hand makes all the difference when completing a high-quality job in a timely manner.

The same can be said for IT, and especially data engineers, responsible for providing data to business consumers. To perform their work, quickly and well, they need to have all the right tools in their data integration toolbox.

Right Tool, Right Job

A broad toolbox with the right combination of data integration capabilities can be extremely valuable when responding rapidly to data requests. It can mean the difference between frustrating delays or punctual satisfaction and better business results.

But a variety of data integration tools are available today, and it is important to select the right one if you want to get the job done successfully. To help data engineering leaders ensure their team’s toolbox has everything it needs, here are the seven recognized styles of data delivery:

●  Bulk/batch (ETL, ELT)

●  Replication

●  Messaging

●  Data services orchestration

●  Synchronization

●  Data virtualization

●  Streaming data integration

Contrary to what some may believe, it’s not about selecting just one of the above tools and assuming it will solve your data integration needs. Many companies will need a combination of data delivery styles to find success—a data integration toolbox that meets their various business goals. According to Gartner, “By 2021, more than 80 percent of organizations will use more than one data delivery style to execute their data integration use cases.”(1)

Why Include Data Virtualization in Your Toolbox

Of all those different data delivery styles, data virtualization is one of the most crucial ones and should be included in your toolbox. Data virtualization is a very useful tool that augments the existing data integration tools that you may already have. It provides a pragmatic integration approach that organizations are increasingly adopting to use along with their favorite ETL/ELT and replication tools.

In fact, according to Gartner’s 2018 Market Guide for Data Virtualization, “Through 2022, 60 percent of all organizations will implement data virtualization as one key delivery style in their data integration architecture…In 2011, only 11% of surveyed organizations reported that they were utilizing data virtualization in a focused set of use cases.” (2)

Cloud Data Sharing is Driving Data Virtualization Demand

But let’s face it, most people try to get by with as few tools as possible. So even when they know about power saws, they won’t buy one until they are faced with a “big job” that requires hundreds of cuts. Cloud data sharing is proving to be just that kind of “big job.”

Why is that? Well, while data and workloads are rapidly moving to the cloud, many legacy systems remain on-premises. This creates a number of new cloud data integration patterns including:

●  Hybrid: Integrating on-premises and cloud data

●  Multi-instance: Integrating multiple data sources within a single cloud provider

●  Multi-cloud: Integrating data across multiple cloud providers

●  All of the above: Integrating on-premises, multi-instance, and multi-cloud 

With this “big job” of cloud data sharing, comes a need for the right tool to get the job done. Data engineering teams have found that data virtualization is the right integration tool across all these cloud integration patterns. Regardless of the on-premises and cloud topology, data virtualization provides complete, consistent, business-friendly data wherever it happens to live, without moving or replicating it. And even better, its virtualized, metadata-driven data views adapt quickly as your cloud topology changes.
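The underlying idea, a federated query over data left in place, can be sketched with nothing but Python’s standard library. In this hypothetical example, SQLite’s ATTACH plays the role of the virtualization layer spanning a stand-in “on-premises” store and a stand-in “cloud” store; the tables and data are invented, and real data virtualization products operate across heterogeneous systems, not two local database files.

```python
import os
import sqlite3
import tempfile

# Two separate databases stand in for an on-premises system and a cloud store.
tmp = tempfile.mkdtemp()
onprem_path = os.path.join(tmp, "onprem.db")
cloud_path = os.path.join(tmp, "cloud.db")

with sqlite3.connect(onprem_path) as db:
    db.execute("CREATE TABLE customers (id INTEGER, name TEXT)")
    db.execute("INSERT INTO customers VALUES (1, 'Acme'), (2, 'Globex')")

with sqlite3.connect(cloud_path) as db:
    db.execute("CREATE TABLE orders (customer_id INTEGER, total REAL)")
    db.execute("INSERT INTO orders VALUES (1, 120.0), (1, 80.0), (2, 40.0)")

# A "virtual" view spans both sources: the join runs against the data
# where it lives, and no table is copied into a central store.
conn = sqlite3.connect(onprem_path)
conn.execute("ATTACH DATABASE ? AS cloud", (cloud_path,))
rows = conn.execute("""
    SELECT c.name, SUM(o.total)
    FROM customers AS c JOIN cloud.orders AS o ON o.customer_id = c.id
    GROUP BY c.name ORDER BY c.name
""").fetchall()
print(rows)  # [('Acme', 200.0), ('Globex', 40.0)]
```

The consumer of `rows` never needs to know which source held which table, which is the business-friendly abstraction data virtualization provides at enterprise scale.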

To learn more about how data virtualization can help complete your data integration toolbox and improve your cloud data sharing, check out these educational data virtualization resources we’ve curated for you.

And stay tuned for my next blog on who can most benefit from data virtualization and how to get started!

(1) Gartner, Magic Quadrant for Data Integration Tools, Ehtisham Zaidi, Eric Thoo, Nick Heudecker, 1 August 2019
(2) Gartner, Market Guide for Data Virtualization, Sharat Menon, Mark Beyer, Ehtisham Zaidi, Ankush Jain, 16 November 2018


The TIBCO Blog


Congress needs a subcommittee dedicated to contact tracing – here’s why

May 4, 2020   Big Data

Millions around the world are still suffering through the COVID-19 pandemic and, while experts think the worst will be behind us in 12-18 months, there’s no telling how long we will be feeling the effects of this health crisis. We’re in uncharted territory.

As a nation, we were wildly unprepared for this catastrophe and, by necessity, we are now forced to play catch-up as we try to contain it. As we try to stem the tide of infections, health and tech experts are turning to the idea of contact tracing to stymie the spread and help society return to a new normal even as the virus remains active.

Contact tracing at its core relies on sensitive personal data like location and health status, in turn making it extremely susceptible to violations of privacy by both the public and private sectors. To ensure consumer privacy standards are met and to create a program that is transparent, consensual, and successful, the United States federal government needs to act fast to create a temporary subcommittee in the Senate to regulate and oversee any contact tracing platform that is seeded to the general public.

Establishing a democratic system

Creating and managing a committee on something as crucial and private as personal data doesn’t need to be as daunting as it sounds. Congress has many helpful resources at its fingertips that could be tapped to quickly and efficiently organize an overseeing body that has consumers’ best interests in mind.


In fact, the Senate Judiciary Committee already has a Subcommittee on Privacy, Technology and the Law with members who are well versed in laws and policies regarding the collection, protection, use, and dissemination of commercial information by the private sector, including online privacy issues. Given this committee’s background and expertise, these bipartisan members could be used to make up a temporary Subcommittee on Contact Tracing in the Senate.

This subcommittee, in addition to any contact tracing program, should only last for the duration of the pandemic and would be decommissioned when a successful vaccine has been widely administered and the virus has been effectively suppressed. It could be funded as a part of a congressional coronavirus relief bill as a means to protect citizens and provide a viable solution to getting society back to a state of normalcy.

The recommended members would be tasked with upholding and enforcing certain principles for any proposed contact tracing program to ensure personal data is not mistreated or abused. For example, citizens would need to be able to see exactly what information is being collected on them and who it is being shared with, such as government officials or corporations, and this data could not be shared unknowingly with any other organization — public or private.

Additionally, these contact tracing technologies should be restricted to public health uses only and must be carried out in a decentralized approach. Any centralized database of precise person-to-person interactions or geo-location data is susceptible to abuse or data breach that could quickly put the general public in danger.
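To picture what a decentralized approach can look like, here is a hypothetical sketch loosely inspired by published exposure-notification designs (it is not any deployed protocol): each device derives short-lived pseudonymous tokens from a key it never shares, broadcasts only the tokens, and performs all matching locally.

```python
import hashlib
import hmac
import secrets

def rolling_token(daily_key: bytes, interval: int) -> bytes:
    """Derive a short-lived pseudonymous token from a device-local key.
    Observers who collect tokens cannot link them without the key."""
    return hmac.new(daily_key, interval.to_bytes(4, "big"),
                    hashlib.sha256).digest()[:16]

# Each device keeps its daily key private and broadcasts only tokens.
alice_key = secrets.token_bytes(32)
broadcast = [rolling_token(alice_key, i) for i in range(144)]  # ~10-min slots

# Bob's device stores tokens it heard nearby; no server ever sees them.
heard_by_bob = {broadcast[42], broadcast[43]}

# If Alice tests positive she publishes only her daily key; Bob's device
# re-derives her tokens locally and checks for overlap on-device.
rederived = {rolling_token(alice_key, i) for i in range(144)}
exposed = bool(heard_by_bob & rederived)
print(exposed)  # True: Bob was near Alice, with no central contact database
```

Because only a positive user’s key is ever published, there is no centralized store of locations or person-to-person interactions to breach, which is exactly the property the proposed principles would demand.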

Finally, this program would need to be voluntary and fully transparent. If a citizen opts out, any information collected on them must be deleted immediately and all data collection must stop from that point forward.

To uphold and regulate these principles on a national scale, any organization creating a contact tracing program must work with the subcommittee to gain special approval through an application process before releasing their technology to consumers.

The subcommittee would have the authority of the federal government to administer clearly spelled out repercussions for any organization or individual that violates these principles or moves forward without explicit approval. For example, if a technology does not comply, the subcommittee would work with major technology providers such as Apple and Google and telecom companies like Verizon and AT&T to immediately disable it. The subcommittee would also have authority to administer any fines or sanctions on companies or individuals that break said principles or violate any terms of consumer privacy.

Encouraging adoption and finding a solution

For any contact tracing program to actually be successful, it needs to be widely adopted by the general public. To encourage this, the Subcommittee for Contact Tracing would also be tasked with working with public health experts — such as the Centers for Disease Control and Prevention (CDC) — to create a general public awareness campaign that would clearly communicate what contact tracing is, why it’s necessary to contain the virus, and what options individuals have to opt in or opt out. Similar to current CDC guidelines available online and on TV, any informational materials on the program would be widely distributed and consistently shared by state leaders nationwide to advocate for extensive participation.

With a transparent, well-defined, and voluntary program overseen by a Senate subcommittee like the one described above, Americans would not only have more trust in any contact tracing platform but would also feel more empowered to participate in order to improve the public’s health and safety. Additionally, any organizations developing these technologies would be held accountable if they put consumer privacy at risk or do not comply with the principles set in place to protect the general public.

In times of global crisis, we must put everything on the table for the greater good of humanity, but we must do so with the proper safeguards to protect the rights of American citizens.

Jeremy Tillman is President of Ghostery.


Big Data – VentureBeat


The Main Street Lending Program: What Your Business Needs to Know

April 13, 2020   NetSuite

By Justin Biel, trends editor at Grow Wire

The Federal Reserve Board on April 9 released new details of the Main Street Lending Program, designed to deploy loans to “small and mid-sized businesses” — firms with up to 10,000 employees and less than $2.5 billion in annual revenue — affected by COVID-19.

Unlike the Small Business Administration’s Economic Injury Disaster Loan and the Paycheck Protection Program, both of which are currently open for application and aimed at smaller businesses, the Main Street program will be the first federal relief program also aimed at larger companies. (Note: If you already applied for a loan with the Paycheck Protection Program, you can also apply for a Main Street loan.)

The Federal Reserve Board first announced the program in a March 23 press release. However, unlike the relief programs aimed at smaller businesses, the program has yet to accept loan applications.

This article covers what we know so far. We’ll update it with new developments as they arise.

 What is the Main Street Lending Program?

After Congress allotted $454 billion to the Department of the Treasury to support the Fed’s programs in the wake of COVID-19, a number of lending programs developed to support businesses across the U.S. and stabilize the economy.

The Fed developed one such program, the Main Street lending program, to provide relief for both small and middle-market companies.

Mary Daly, president of the Federal Reserve Bank of San Francisco, explained the rationale behind the program in an interview with Yahoo Finance.

“There is this whole swath of institutions that don’t access capital markets very easily and don’t qualify for small business lending,” she said. “So if you think about it, that’s a part of Main Street that still needs to be treated and directly employs millions of workers in the United States.”

 Who is eligible for the Main Street Lending Program? 

Per the Fed’s latest guidelines announced on Thursday, a business must meet the following requirements to qualify for a Main Street loan:

  • Have up to 10,000 employees or revenues of less than $2.5 billion. (Previous reports said businesses would need to have a minimum of 500 employees in order to apply, yet the Fed’s new guidelines don’t mention that.)
  • “Commit to make reasonable efforts to maintain payroll and retain workers.” (Again, previous reports provided more detail, saying that businesses who get loans would need to retain at least 90% of their workforces, at full compensation and benefits, until Sept. 30, 2020.)
  • “Follow compensation, stock repurchase, and dividend restrictions that apply to direct loan programs under The CARES Act,” the federal stimulus package that sparked the Main Street Lending Program. (Law firms have more information on those restrictions.)

Neither the Fed nor banks have issued exact details yet regarding applicants and how to apply. Earlier reports based on analysis of The CARES Act stated that businesses that get Main Street loans will likely be required, at least on good faith, to rehire at least 90% of any employees laid off since January 2020.

Also, according to the Act, those that get loans will “make a good faith certification” including to-dos like not outsourcing jobs while repaying the loan and for two years afterward.

How much money can I get from the Main Street lending program?

CNBC reports that loans would have a minimum of $1 million and a maximum of either $25 million or an amount that “when added to the Eligible Borrower’s existing outstanding and committed but undrawn debt, does not exceed four times the Eligible Borrower’s 2019 earnings before interest, taxes, depreciation, and amortization,” whichever is less.
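Read as arithmetic, the reported cap works out as follows. This is an illustrative interpretation only, not financial guidance; it ignores the $1 million minimum and any other program conditions, and the figures are invented.

```python
def max_main_street_loan(ebitda_2019: float, existing_debt: float) -> float:
    """Lesser of $25M or the amount that brings total debt to 4x 2019 EBITDA."""
    leverage_cap = 4 * ebitda_2019 - existing_debt
    return min(25_000_000, leverage_cap)

# $10M EBITDA with $15M of existing debt: both limits land at $25M.
print(max_main_street_loan(10_000_000, 15_000_000))  # 25000000

# $5M EBITDA with $12M of existing debt: the leverage test, not the
# $25M ceiling, is what binds.
print(max_main_street_loan(5_000_000, 12_000_000))   # 8000000
```
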

What are the terms of these loans?

CNBC reports that interest rates will be just 0.01%, plus 250-400 basis points with a four-year maturity. Principal and interest payments will be deferred for one year, announced the Fed.

Who serves as lenders for the Main Street lending program? 

Lenders for the Main Street program will be financial institutions, such as large banks. The government will purchase 95% of the loan while the financing institution will hold the other 5%.

How will I apply for a loan?

Businesses will apply directly with the lender, though neither the Fed nor banks have released details about the process. It’s usually recommended to apply for a loan at the bank you currently work with.

“The application process will certainly vary by bank – larger banks may leverage a portal to take in applications, while smaller banks may require an application and supporting documentation to be submitted to a specific branch or through email,” said Brandon Koeser, senior manager and financial services senior analyst with RSM US LLP, a middle-market consulting firm.

If your bank is using a portal to collect applications for the Paycheck Protection Program, it’s “more likely they’ll use a portal” for Main Street loan applications, he added.

When will I be able to apply?

The Fed’s much-anticipated Thursday announcement provided new details about the Main Street Lending Program, but it didn’t outline an application process.

Koeser acknowledged that “we are still in wait-and-see mode” with regard to a program launch date. It could be later rather than sooner, he added, referencing the fact that many banks weren’t ready to accept loan applications when the Paycheck Protection Program launched last week.

“In light of how the rollout of the Paycheck Protection Program has gone with the SBA and participating lenders, it is possible that the Fed will take all of those weeks to craft the necessary rules and lay out the application and funding process,” he said.

Is there anything I can do now to prepare?

Koeser expects the Main Street loan application to require similar documents to the Paycheck Protection Program.

To get prepared now, he recommends gathering:

  • Payroll filings reported to the IRS for 2019
  • Payroll records supporting your compensation expenses and total workforce numbers.

“The important thing to remember is that this stimulus is designed to get into the hands of those who need it,” Koeser added.

So, while explicit details of the paperwork required aren’t released yet, he recommends preparing any and all information related to payroll that supports your business’s need for a loan.

“The more detail you can pull together on your payroll costs, including IRS filings and payroll reports, the better it will be when it comes time to complete the application,” he added. “Information-gathering could prove to be the most time-consuming part of the application process, so gathering what information you have could save time when it really counts.”

Posted on Thu, April 9, 2020
by Barney Beal


The NetSuite Blog


What Every Business Leader Needs to Know About Python

January 27, 2020   TIBCO Spotfire

Reading Time: 4 minutes

In this blog, written for logistics and operations managers, marketing professionals, and other business leaders, you’ll learn why a scripting language, Python, may be invaluable for your analysts and data scientists, and how it can help them get more meaningful insights faster.

What’s Python, and why does it matter to your data analysts and data scientists—and to you?

Python is one of the world’s most popular programming languages, with wide adoption that’s growing at 27% year over year. Fueling this growth are data scientists, and the tools—many now leveraging Python—they rely on. With its rich libraries and ease of use, Python makes it easier to focus on solving problems rather than writing code. Because Python is open source, its community of users can contribute directly to making it better. 

Here’s why it matters for you. Python works so well for data science and machine learning (a key to automating processes and insights) in part because it can rapidly scale up or down, making it possible to quickly build, test, and iterate models. This enables data scientists to iterate for insights faster, and to build out predictive (“what will happen”) and prescriptive (“what would be best to happen”) analytics. These insights are the highest value use cases of analytics for the business, but they are also the most difficult insights to gain.
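To make “predictive” concrete at its very simplest, here is a hypothetical sketch: fit a least-squares trend line to descriptive data (the monthly sales figures below are invented) and project the next period. Real data science models are far richer, but the shape is the same: learn from what happened, then estimate what will happen.

```python
# Monthly sales (descriptive: what happened).
sales = [110, 125, 118, 140, 152, 149, 165, 178]

# Predictive: fit a least-squares trend and project the next month.
n = len(sales)
xs = list(range(n))
x_mean = sum(xs) / n
y_mean = sum(sales) / n
slope = (sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, sales))
         / sum((x - x_mean) ** 2 for x in xs))
intercept = y_mean - slope * x_mean
forecast = intercept + slope * n  # one step beyond the observed data
print(round(forecast, 1))  # 184.0
```

A data scientist’s job is to decide when a simple trend like this is enough and when richer models, and tooling like Python’s scientific libraries, are needed to capture what is really driving the numbers.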

Moving from descriptive to prescriptive analytics yields a more significant competitive advantage and turns information into innovation.

Faster insights lead to innovation and competitive advantage

In the following two examples, learn how the right tooling—including Python capability—provided data analysts and data scientists what they needed to accelerate time to insight and drive competitive advantage.

Anadarko, an oil and gas exploration and production company, wanted to increase value to its stakeholders by lowering operating costs and improving efficiency through new technology. According to Data Scientist Dingzhou Cao of Anadarko, “With open source, it would have taken at least 10 to 12 months to build a similar system…[now we can] see live data coming in, and we have plug and play analytics models to process the live data and generate results. Without having to make a phone call, the drillers get the message, ‘Please act in this manner.’…We believe this is going to give us an advantage, and profit for our shareholders.” With these new enhanced, interactive models, Anadarko was able to significantly speed up time to insight, directly improving operations and benefiting the company’s stakeholders. 

AA Ireland, specializing in home, motor, and travel insurance, wanted to get real-time predictions to identify the most profitable customer types, and chose to step beyond industry standard software for “something that was powerful and future-proof,” said Colm Carey, Chief Analytics Officer. “You don’t sit in an IT queue for a year and a half. You build a model yourself and generate a lot of revenue for the company. It’s that power.” As Carey noted, “Insurance has always had predictive models, but we would build something, and in three months, update it. [Now] data comes in and goes out to models seamlessly without disruption, basically providing real-time predictability.” Accelerating the building of these predictive models allowed analysts to think in real time and increase company profits. 

In both cases, combining advanced visual analytics with Python expressions and data science drove faster, more insightful decision models. Increasing the agility of their data analytics and data science teams was key to getting insights and business benefits faster.

Build a framework with your data science teams to optimize and innovate

When business leaders seek insights through data analytics and data science, a shared understanding across teams is necessary for insights and innovation to arise. Orchestrating data science, IT operations, and business operations to go beyond information into transformation requires: 

  • A regular cadence of communication
  • A shared understanding of high-value business goals
  • A culture that welcomes experimentation

Ask your data scientists what they need to slash time to insights, and what obstacles they face in helping your business outperform and outcompete. Listen to their perspective and challenge them to help you find the highest value insights—those that can make the impossible possible. 

And invite them to join us for a special webinar with Neil Kanungo, Data Scientist at TIBCO, as he shares how data scientists can use the power of Python and its open-source libraries to speed time to insights, walking through the setup, execution, and governance of Python for advanced analytics needs.

The TIBCO Blog
