
Tag Archives: plant

Plant Seeds for Future Growth: What to Expect at TIBCO Analytics Forum 2021

March 27, 2021   TIBCO Spotfire

Reading Time: 3 minutes

With spring comes the promise of new beginnings and the blossoming of new life. This year, embrace the spirit of spring at the TIBCO Analytics Forum (TAF) 2021 by learning about new analytics and data management technologies and approaches, and how they can help foster growth in the coming years.

You can take the insights and knowledge harvested from TAF 2021 back to your organization and plant them as seeds for future growth.  

Plan ahead with the agenda at a glance

At this two-day virtual event, you’ll hear all about the latest innovations and developments currently shaping the future of analytics and how industry leaders are applying these solutions to overcome today’s most pressing data challenges. Attend TAF 2021 for engaging presentations on successful applications across industries, including energy, manufacturing, healthcare and life sciences, consumer goods, retail, transportation and logistics, and more. 

Explore the Agenda at a Glance for a breakdown of the two days awaiting you, including: 

  • Keynotes with TIBCO and inspiring guest speakers 
  • Breakout sessions with customers and partners on how to succeed across industries
  • Product training sessions for beginners and advanced users on data management, visual analytics, data science, and streaming analytics
  • Virtual networking with peers and industry experts 
  • Spotfire deep dives and hackathon to test your knowledge and win exciting prizes 

For more on what to expect, check out the highlights from TAF 2019 on the TAF community homepage. And get a head start on upping your analytics knowledge by exploring the TIBCO Community Blog and Spotfire demo gallery.

Customize your experience with curated tracks

To make sure you get the most out of your TAF 2021 experience, you can even customize your own event calendar along one of the following content tracks:

  • Real-world Success: Learn how customers, partners, and product experts use analytics and data management solutions to turn data into insights that transform organizations and achieve results. Case studies and sessions will span industries such as oil and gas, renewable energy, manufacturing, pharmaceuticals, healthcare, and more. Use cases discussed will include digital transformation, business reinvention, customer intimacy, and operational excellence.
  • Tech Deep Dives: Learn “how the magic happens” through a technical lens. Product capabilities, customer and partner success, and architectural approaches will be in the spotlight. Products discussed will include TIBCO Spotfire®, TIBCO® Data Science, TIBCO® Streaming, TIBCO® Data Virtualization, TIBCO EBX™, TIBCO WebFOCUS®, and more. 
  • The Future of Analytics: Learn about the opportunities provided by the changing face of analytics. Sessions will cover how to connect best practices in governance with ethical artificial intelligence (AI) and ways to immerse yourself in the open subsurface data universe (OSDU) conversation and be part of an open future. Also, learn more about the “better together” vision for TIBCO and ibi products, and sharpen your focus on the product roadmaps that matter most to you.
  • Product Training and Demos: Build up your skills with TIBCO and ibi products. Learn how to use TIBCO EBX, TIBCO Data Virtualization, TIBCO Spotfire, TIBCO Streaming, TIBCO Data Science, and TIBCO WebFOCUS through lectures and product demonstrations. Gain deeper knowledge of the advanced features and functions of Spotfire, including IronPython scripting and Spotfire Mods.

Spring forward: register now!

Take a leap forward this spring. So much compelling content awaits you at TAF 2021. Join us and invest in growing the value of your analytics and data management programs. Register Now!

Plus, we are offering an academic discount of 50 percent off the full registration price! Just select “Government/Education” from the drop-down menu when you register to ensure you receive the discount.*


*To receive the discount, make sure to enter your educational institution email address. If you don’t have one, please reach out to events@tibcomeetings.com and provide proof of your involvement with an educational institution or send us a copy of your student ID. 


The TIBCO Blog


The Future Plant: Current Challenges In Asset Performance

May 12, 2020   SAP

Part 1 of a three-part series

When Horst started his work as a machine technician at a manufacturing plant 20 years ago, asset management looked very different from how it looks today. Having climbed the career ladder to become an asset manager, Horst has created a modern maintenance environment that tackles many of the major problems German manufacturing companies are concerned with.

Horst no longer has to do a daily tour of the plant to note downed-machine issues or check maintenance due dates. Instead, he uses asset management software that provides him with a constant overview of all assets, right from his desk. Every asset is represented by its digital twin and can be continuously monitored via a visual display.

By continuously collecting relevant data, designated devices automatically enrich an asset’s digital twin with information about its current performance and condition. Data analytics algorithms can use this information to generate a set of relevant KPIs throughout each asset’s entire lifecycle.

For Horst, it is crucial to always be prepared for any possible machine breakdown. Therefore, he is especially interested in knowing an asset’s mean time to failure (MTTF) or mean time between failures (MTBF), as well as the frequency of these incidents.

Knowing particular failures, and how often they typically occur with certain assets, helps Horst map machine problems to common failure modes and understand when a failure is likely to happen. It also supports him in grouping his assets into risk categories depending on how often failures occur, how severe they are, and how easily they can be detected. This entire process is called Failure Mode Analytics – an important analysis for strategic asset management, strongly enabled by the ability to monitor each asset’s performance.
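
The post does not spell out how those risk categories are derived. One common convention, borrowed from failure mode and effects analysis (FMEA), is to score each failure mode for severity, occurrence, and detectability and multiply the scores into a risk priority number (RPN). The sketch below is purely illustrative; the FailureMode record and the thresholds are invented for the example, not taken from any vendor's software.

```python
from dataclasses import dataclass

@dataclass
class FailureMode:
    """One observed failure mode of an asset (illustrative record only)."""
    name: str
    severity: int    # 1 (negligible) .. 10 (catastrophic)
    occurrence: int  # 1 (rare)       .. 10 (very frequent)
    detection: int   # 1 (easily detected) .. 10 (almost undetectable)

def risk_priority_number(fm: FailureMode) -> int:
    # Classic FMEA convention: RPN = severity * occurrence * detection (range 1..1000).
    return fm.severity * fm.occurrence * fm.detection

def risk_category(fm: FailureMode) -> str:
    # Example thresholds only; real maintenance programs tune these to their risk appetite.
    rpn = risk_priority_number(fm)
    if rpn >= 200:
        return "high"
    if rpn >= 80:
        return "medium"
    return "low"

bearing_wear = FailureMode("bearing wear", severity=7, occurrence=5, detection=4)
print(risk_category(bearing_wear))  # RPN = 140 -> "medium"
```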

Two other important KPIs become relevant once a predicted failure occurs: mean time to repair (MTTR) and mean downtime. As the main measures of machine availability, these KPIs should be kept as low as possible to enable a maximum level of production continuity.

Following the principles of lean management, Horst is constantly engaged in putting appropriate measures in place to reduce the time a machine is down for repair. In this context, the associated breakdown costs also play a meaningful role in managing asset performance.

Last, but not least, an asset’s overall performance can be evaluated with the Overall Equipment Effectiveness (OEE) KPI. This KPI indicates the percentage of time in which an asset is producing only good parts (quality), as fast as possible (performance), with no stop time (availability). Combining quality, performance, and availability makes this measure a very powerful tool for Horst in assessing his assets and gaining data-based knowledge about his overall plant productivity.
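
The article names the three OEE factors but gives no formulas. As a rough illustration, using the standard textbook definitions with made-up numbers rather than anything from the post, availability can be derived from MTBF and MTTR, performance from cycle times, quality from good-part counts, and OEE is simply their product:

```python
def availability(mtbf_hours: float, mttr_hours: float) -> float:
    # Inherent availability: share of time the asset is up.
    return mtbf_hours / (mtbf_hours + mttr_hours)

def performance(ideal_cycle_time_s: float, total_count: int, run_time_s: float) -> float:
    # How close actual throughput comes to the theoretical maximum.
    return (ideal_cycle_time_s * total_count) / run_time_s

def quality(good_count: int, total_count: int) -> float:
    # Share of produced parts that are good on the first pass.
    return good_count / total_count

def oee(a: float, p: float, q: float) -> float:
    # Overall Equipment Effectiveness combines all three factors.
    return a * p * q

a = availability(mtbf_hours=190.0, mttr_hours=10.0)                               # 0.95
p = performance(ideal_cycle_time_s=2.0, total_count=12_000, run_time_s=27_000.0)  # ~0.89
q = quality(good_count=11_640, total_count=12_000)                                # 0.97
print(f"OEE = {oee(a, p, q):.1%}")                                                # roughly 82%
```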

This variety of KPIs makes it possible to have continual, real-time insight into all assets and their performance. For Horst, who always needs a thorough overview of his assets’ current state, this makes life much easier. More importantly, the asset performance software equips him with a reliable basis for decision-making.

While in the past, most decisions were made based on gut feeling, today the digital twin and its KPIs serve as the source for making machine diagnoses and determining asset maintenance routines. Also, standardized KPIs allow comparisons between several groups of assets or across different plants. This makes processes more transparent and more reliable, therefore helping Horst achieve the best possible asset operation.

By enabling technologies for the smart factory, companies are achieving Mission Unstoppable: making facilities management a transparent, manageable process.


Digitalist Magazine


Beyond Limits inks $25 million strategic deal with Xcel to build AI-controlled power plant

July 10, 2019   Big Data

Beyond Limits, a Glendale, California-based company developing explainable AI for a range of industries, today announced a $25 million strategic partnership with Xcel Energy to build what it’s calling the world’s first AI-controlled power plant. Beyond Limits CEO AJ Abdallat said it’ll be designed from the ground up, and that it’ll embed intelligence into the “entire operation” to make the plant “safer” and maximize efficiency, productivity, and environmental friendliness.

“We’re excited about it,” said Abdallat at VentureBeat’s Transform 2019 conference in San Francisco. “State of the art today is predicting whether a device is going to fail. We do all of that, and that’s wonderful, but [we also have a] layer on top [that allows us to] align with the objectives of the company managing [the facility] so they can build a system that oversees everything going on.”

It’s an approach akin to that taken by Foxconn-backed Carbon Relay, which aims to cut datacenter carbon emissions with tools that draw on thousands of sensors to make predictions about electrical usage. Similarly, Google parent company Alphabet’s DeepMind said in February that it’s using AI to forecast the performance of wind turbines 36 hours in advance, and last August, DeepMind revealed that it had turned over management of one of Google’s datacenter cooling controls to an AI-powered recommender system.

“When you make a change in one portion of [a facility], whether you like it or not, you’re going to change things throughout the system. You can’t ever hit pause to try and figure out what’s going on,” said Beyond Limits technical head of industrial IoT Kristin Lennox, who demoed the company’s system onstage as applied to an oil refinery. “The stakes are high. If you make a bad decision, there are potentially dire consequences in terms of health and safety.”

That’s why Beyond Limits champions what it calls “cognitive AI,” a paradigm that entails ingesting data from monitoring devices to create a picture of what’s going on at a given asset. A machine learning system detects anomalies in part with the aid of a second layer — a “cognitive” layer — that includes facility goals, and it provides a full trace of all data and all steps used to arrive at its conclusions. Simultaneously, the system evaluates situations to identify where it’s not achieving baseline goals.
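
Beyond Limits has not published how its system is implemented, so the following is only a toy sketch of the pattern the article describes: a statistical anomaly detector on sensor data, plus an explicit "goal" layer whose conclusions come with a human-readable trace. All names, thresholds, and goal values here are hypothetical.

```python
import statistics

def detect_anomalies(readings, window=20, z_threshold=3.0):
    """Flag readings that deviate strongly from the recent baseline (toy detector)."""
    flags = []
    for i in range(window, len(readings)):
        baseline = readings[i - window:i]
        mean, stdev = statistics.mean(baseline), statistics.stdev(baseline)
        z = (readings[i] - mean) / stdev if stdev else 0.0
        if abs(z) > z_threshold:
            flags.append({"index": i, "value": readings[i], "z_score": round(z, 2)})
    return flags

# Hypothetical "cognitive" layer: facility goals expressed as explicit rules,
# so every conclusion carries a human-readable explanation.
GOALS = {
    "max_inlet_temp_c": 85.0,
    "min_throughput_bph": 900.0,
}

def evaluate_goals(state):
    trace = []
    if state["inlet_temp_c"] > GOALS["max_inlet_temp_c"]:
        trace.append(f"inlet_temp_c={state['inlet_temp_c']} exceeds goal "
                     f"{GOALS['max_inlet_temp_c']}: recommend reducing feed rate")
    if state["throughput_bph"] < GOALS["min_throughput_bph"]:
        trace.append(f"throughput_bph={state['throughput_bph']} below goal "
                     f"{GOALS['min_throughput_bph']}: flag as unmet baseline")
    return trace or ["all facility goals met"]

temps = [70.0 + 0.1 * i for i in range(40)] + [95.0]
print(detect_anomalies(temps))
print(evaluate_goals({"inlet_temp_c": 91.0, "throughput_bph": 940.0}))
```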

“In addition to constantly evaluating operations and trying to figure out how we make those as good as they can be, we’re simultaneously … reporting back to engineers the factors that are driving [shortcomings] and the ways they [can be addressed],” said Lennox.

Beyond Limits was cofounded in 2014 by Abdallat and Mark James as a full-stack engineering company creating AI solutions deployable via the cloud, on-premises, or embedded in devices at the edge. In 2012, the California Institute of Technology granted the company a license to update and commercialize 40 of the AI programs developed through NASA’s Deep Space program and operated out of Caltech’s Jet Propulsion Laboratory (JPL).

One of the aforementioned programs is Spacecraft Health Inference Engine (SHINE), a software-defined knowledge-based system designed to be efficient enough to operate in real-time environments and used by non-LISP apps written in programming languages like C and C++. It’s used at JPL, where it’s been applied to AI research and specialized tools running on distributed systems.

“When you’re operating in space, you’re operating in a very harsh and dynamic environment,” explained Abdallat. “You don’t have access to the computing power — you don’t have access to Amazon’s, Microsoft’s, or Google’s amazing cloud. And you’re going to have gaps in the communication — you’re going to have issues with the data. As a result, [NASA] created a world-class organization to focus on how to … allow these missions to [succeed] on [long] journeys.”

Today, Beyond Limits’ suite includes a management adviser for energy operators that integrates with platforms for physics-based, geological, and policy elements, and health care products that combine conventional AI techniques with symbolic reasoning to analyze patient data, lab results, chart notes, and drug interactions and proactively alert medical staff about deteriorating patients. Meanwhile, in the automotive sector, the company’s systems can predict whether drivers have become distracted from safe driving and intervene.

To date, Beyond Limits has raised over $25 million in venture capital from BP Ventures and other backers. BP, along with Mitsubishi, is among its clients.


Big Data – VentureBeat


Plant Tours In The 21st Century: How Procurement Executives Can Stay Connected

June 16, 2019   SAP

Part four of a six-part blog series based on 30+ years’ experience collaborating on innovation with complex, discrete machinery manufacturers. 

In the third blog in this series, I discussed how Industry 4.0 and the Industrial Internet of Things (IIoT) have facilitated greater insight for manufacturing executives. Now I want to spend some time examining how procurement executives can gain greater insight into internal operations and partner with suppliers to achieve true innovation.

If you’ve been following this series (thank you!), you know that my motivation for writing these blogs was an article in the Harvard Business Review, “Why (and How) to Take a Plant Tour,” written by David M. Upton, professor of operations management at the University of Oxford’s Saïd Business School. I found it fascinating to see how much things have changed since the time he wrote the article in May 1997.

Professor Upton wrote:

“Plant visits allow managers to review a supplier’s qualifications, to share best practices with a partner, or to benchmark performance and practices …. The main purpose of assessment tours is not to acquire new knowledge. Rather, it is to use what visitors already know to evaluate a plant. There are a number of different types of assessment tours. Some aim to determine whether a plant can fill a particular role. For example, a customer may visit a potential supplier to assess quality, or a corporate planner may visit a plant to decide if it could develop the ability to fill orders quickly enough to support the company’s new strategy. Other assessment tours focus less on a plant’s existing capabilities and more on how that plant might be changed to perform better or differently in the future.”

During the earliest days of industry, manufacturers found comfort in doing business with the shop across town. They believed that doing business with a long-time, trusted local source was the way to ensure both quality and cost, as well as to build a stronger community. In time, however, it became not only possible but financially desirable to procure supplies from around the globe, which in turn required those plant tours that Professor Upton was talking about.

In 1997, the term “procurement” was virtually synonymous with “purchasing,” and procurement managers generally worked under the direction of manufacturing engineers, supply chain executives, and finance officers. At the time, the best way to evaluate a supplier, as Professor Upton noted, was probably boots on the ground.

By contrast, procurement today is a far more complex and dynamic process, and it’s corporate malfeasance to do business with someone just because they’re the shop across town, a fraternity brother, or a golfing buddy. Procurement managers now may have to work with hundreds of suppliers from around the world, adding the complexity of ever-shifting concerns of shipping, international law, tariffs, currency exchange, and more.

Procurement specialists no longer work at the direction of others, but rather with them to find the most suitable suppliers. It’s not uncommon for the procurement specialist to initiate an innovation and bring it to the attention of the engineers or supply chain executives.

Procurement managers are also crucial in a crisis. Whether it’s a product recall, a humanitarian crisis, civil unrest, or a natural disaster, it may be critical that procurement responds without flinching. It could be that a supplier is unexpectedly unable to fulfill orders, or there could be an unanticipated surge in demand that must be quickly accommodated. Procurement specialists may have to identify and contract with new suppliers urgently. Better yet, they can proactively identify the lowest risk suppliers, such as those with a resilient and sustainable infrastructure, to avoid a crisis in the first place. Being able to accommodate that level of procurement requires a truly connected intelligent enterprise, one that reaches out to engineering, supply chain, finance, manufacturing, legal, marketing, and the highest levels of management.

Another benefit of digital procurement is the ability to turn a profit. It’s ironic that a business division whose central purpose is to spend money could be profitable, but it’s true. Procurement can sell data back to suppliers, who could in turn use this additional information to create superior, cost-efficient products. While it’s intuitive that suppliers should seek the maximum amount of profit from their customers, smart suppliers understand that when their customers succeed, they succeed.

Without question, there are obvious risks and liabilities in sharing information with suppliers. Not only are there competitive concerns, but legal issues as well – for example, who actually owns the data? Nonetheless, most procurement executives agree that working closely with suppliers to share information and innovate may be well worth the risk.

However, as the old saying goes, if you want to have a good friend, you need to be a good friend. According to the Deloitte Global Chief Procurement Officer Survey 2018:

“Last year, 86% of procurement leaders aspired to being ‘excellent’ as a strategic business partner in the future. In 2018, only 24% consider themselves as excellent: although this is a slight improvement from 2017, it highlights the need for further improvement in business partnering by procurement teams.”

Only the best-run businesses will be excellent strategic business partners, and the best way of winning the confidence of your suppliers is to have a thoroughly integrated procurement solution. It must have an efficient digital process for sourcing both indirect and direct materials, it must work for all spend categories, and it must manage the entire source-to-contract process from integrating new bills of materials from product lifecycle management systems through sourcing, contract management, and manufacturing execution integration. It is possible. To get a glimpse of how it works, take a look at this short video.

Back in 1997, Professor Upton could not have envisioned the strategic importance procurement plays today, but it’s remarkable that not long ago the most thorough way to assess a supplier was to take a quick tour of the plant. On the other hand, Professor Upton did foresee the challenge of hiring and retaining qualified industrial manufacturing talent, and that’s the topic of my next blog.

For more insight about plant tours in the 21st century, stay tuned to this six-part blog series based on my 30+ years’ experience collaborating on innovation with complex, discrete machinery manufacturers.


Digitalist Magazine


Medieval depictions of the mandrake plant

December 31, 2017   Humor

[unable to retrieve full-text content]


A Historian Walks into a Bar . . .


Xiaomi Builds Second Manufacturing Plant In India

April 3, 2017   Mobile and Cloud

Chinese smartphone maker Xiaomi has partnered with Foxconn to build its second Indian manufacturing plant, located in Sri City, Andhra Pradesh.

Xiaomi entered the Indian market in 2014 and started local production one year later. Xiaomi’s first plant in India is not exclusive; Foxconn also makes phones there for other brands. This new plant, however, is exclusively for Xiaomi.

Xiaomi did not disclose financial details about the project, but said it is able to produce one phone per second in India. Manu Jain, head of Xiaomi India, said that the two plants in India will offer 5,000 jobs in total and that 90% of the employees are women.

Xiaomi’s two Indian plants will be able to meet 95% of demand in the local market. However, the company still needs to import parts and other high-end components, such as those for the Mi 5, from China.

Statistics from the market research firm IDC show that by the end of 2016, Xiaomi held a 10.7% market share in India, second only to the Korean smartphone maker Samsung, which held 25%.


ChinaWirelessNews.com



Hampton Creek’s data scientists team up with chefs to find the holy grail of plant proteins

November 29, 2014   Big Data

At Facebook and LinkedIn, data scientists analyze site usage and develop features that take user information into consideration. Inside the cramped office of Hampton Creek, a 3-year-old San Francisco startup, data scientists are doing something wholly different, and arguably more impactful.

They’re weeding out billions of proteins from hundreds of thousands of plants to figure out what could form the basis of a vegan equivalent of an egg. They propose a given protein for testing. Then, glorified chefs test it out with a few other ingredients. Drawbacks get sent upstairs to the data scientists, who block out the protein in question and others related to it. And Hampton Creek thereby inches closer to its noble goal.

The startup has succeeded in landing its plant-based “mayo” and cookie dough in Walmart and Whole Foods stores all over the U.S., and it has considered international expansion. As such, Hampton Creek has caught attention alongside other startups with products that people can buy and eat, like Beyond Meat and Soylent.

But that ignores Hampton Creek’s potential. It could weave its products into huge food-product operations, which could lower their carbon emissions — and their sourcing costs to boot.

That’s where the computers come in.

“Our slug is, we apply deep machine learning to plant biological data,” Lee Chae, Hampton Creek’s head of research and development, explained to VentureBeat in an interview at the startup’s office. Indeed, Chae said, Hampton Creek has explored the use of deep learning, a type of artificial intelligence, to meet its objectives.

The recipe calls for servers

Chae believes that, with its data scientists, food scientists, biochemists, chefs, and its own on-premises servers, the startup can perform an extraordinary amount of research, especially given its size.

“We’ve talked to major food companies out there that have multi-, multimillion-dollar research budgets,” Chae said. Oftentimes, he said, the work revolves around product development, which can bring in revenue.

“There’s less to the ‘R’ side, the research side,” of R & D, or research and development, Chae said.

Hampton Creek, for its part, has the feel of a genuine lab, where rows of workbenches with sophisticated machinery and ingredient containers take up much of the office space.

Follow these 4 steps

Everything happens in accord with an elaborate process that involves humans and machines alike:

  • Run proteins through assays.
  • Quantify molecular and food-related properties of the proteins.
  • Put the proteins into a food-model system and get metrics of how they will perform — like whether they will produce foam or bind with water, for instance.
  • Determine the performance of related proteins with predictive models.

The predictive models in particular represent a sort of strategic secret sauce for Hampton Creek.

“That way, we learn what properties are really meaningful for performance, and we can reduce that search space by just focusing on those properties and increase our hit rate,” Chae said. “Then we can search that space intelligently and efficiently — unlike any other company out there that has not developed this technology.”
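
Hampton Creek’s actual models are proprietary, but the idea Chae describes, learning which measured properties predict food-model performance and then using that model to shrink the candidate search space, can be illustrated with a small, entirely hypothetical sketch. Feature names, scores, and data are invented, and scikit-learn stands in for whatever tooling the company uses.

```python
from sklearn.ensemble import RandomForestRegressor

FEATURES = ["solubility", "molecular_weight_kda", "hydrophobicity", "foaming_index"]

# Proteins already run through assays and the food-model system (invented values).
tested_properties = [
    [0.82, 45.0, 0.31, 0.70],
    [0.40, 120.0, 0.55, 0.20],
    [0.91, 30.0, 0.28, 0.85],
    [0.35, 95.0, 0.60, 0.15],
    [0.77, 52.0, 0.33, 0.66],
]
emulsification_score = [0.78, 0.22, 0.90, 0.18, 0.71]  # measured performance

model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(tested_properties, emulsification_score)

# Which properties appear "really meaningful for performance"?
ranked = sorted(zip(FEATURES, model.feature_importances_), key=lambda kv: -kv[1])
print("property importance:", ranked)

# Score untested candidates and keep only the most promising ones,
# shrinking the search space before any chef touches a pan.
candidates = [[0.88, 40.0, 0.30, 0.80], [0.30, 110.0, 0.58, 0.12]]
predictions = model.predict(candidates)
shortlist = [c for c, p in zip(candidates, predictions) if p > 0.6]
print("shortlisted candidates:", shortlist)
```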

Enter chefs

But making emulsions with proteins and ultimately putting each of them in a pan — which employees can do several times per day — will remain an essential step, because, after all, people will be preparing food with these proteins, not just staring at numbers about them in a spreadsheet.

And that’s why there are real chefs, like Ben Roche and Chris Jones, who bear the titles research and development chef and director of culinary innovation, respectively. Their work is manual, but it does get turned into data.


Jones, who spent several years as chef de cuisine at the Moto Restaurant in Chicago before moving to San Francisco to work for Hampton Creek, prefers good old notebooks to store his observations about proteins. Jones also writes out brief strings of letters and numbers about specific proteins in marker on a whiteboard, which the company had thrown a dark cover over when I stopped by. The real-world figures eventually get transferred into databases.

Jones knows full well that his impressions of proteins are not very precise. It would be more accurate to describe them as subjective, or maybe informed, following years of experience tasting foods. But Jones knows his routine. He doubts his taste buds will get thrown off very much. And so his very human mouth remains a crucial component of an otherwise very high-tech startup.

Judging by one indicator, the balance between servers and chefs might be just right.

“There’s a reason why all these food companies are dropping us inquiries about partnering to use our technology,” Chae said.




VentureBeat » Big Data News | VentureBeat


The world’s largest solar power plant is now up and running

November 28, 2014   BI News and Info

The world’s largest solar power plant is now up and running
Jon Fingas, engadget.com

Solar power just hit one of its biggest milestones, in more ways than one. First Solar recently finished building Topaz, a 550-megawatt plant that represents the largest active solar farm on the planet. And we do mean large — the installation’s n…


A Smarter Planet
