Tag Archives: into

Bridge Morphs Into Trampoline


Unexpected bouncing joy.

“It will be fun,” they said.
Image courtesy of https://imgur.com/gallery/DHnNz8D


Quipster

Why Pour Your Tax Savings Into Digital Transformation?


Part 2 of a 2-part series. Read Part 1.

In my last blog, I discussed the ways in which corporations are responding to the increased after-tax cash flow resulting from the new tax law here in the United States. These responses run the gamut from dividend increases and share buybacks to one-time employee bonuses and debt paydown.

But have you considered the longer term? If your goal is to generate lasting value for your business, digital transformation can be the right move.

But what, exactly, is digital transformation?

Think of digital transformation as business reinvention. The goal is to improve business performance and transform the way you serve your customers using emerging digital technologies – such as Big Data, analytics, cloud computing, Internet of Things (IoT), machine learning, blockchain, and more.

The crux of any digital transformation is data. But data has been around forever. What’s new in the digital economy is the ability to use data to connect everything – people, devices, and business networks – in real time. With powerful technologies and platforms to manage these connections, we can now develop new services, new products, and new business models that deliver better outcomes for customers.

A nondisruptive framework for disruption

But, as I asked in my last blog, how do you move forward? The problem that most companies face when it comes to change is how to keep the lights on while transforming. If your goal is to disrupt an industry or two with leading-edge innovation, how do you do it without disrupting your own company?

The first step, frankly, is to accept the fact that change is always a constant – as clichéd as it may sound. Make your peace with change. Expect it. Encourage it. No digital transformation effort is a one-and-done scenario. Rather, it’s an ongoing effort. In fact, the platforms we build as part of our transformation efforts are themselves designed to facilitate change. Change is built into the design. The best response is to embrace it.

The second step is to start thinking in terms of two-lane IT (also called “bimodal IT”), a nondisruptive framework for disruption. In one lane, you have core IT systems to support core operations. Sometimes this is referred to as your system of record, which is critical to maintain. Part of a digital transformation journey should focus on optimizing the processes and operations around the system of record.

In the other lane, you establish a system of innovation, which gives you the flexibility to develop new digital applications – and new business models – quickly. They range from IoT-enabled devices that track customer usage and charge per usage (rather than selling the machines as products) to machine learning algorithms that tell you more about your customers. Because most new applications in this lane will need to plug into data maintained in the other lane, the two are integrated. This system of innovation uses critical core data from operations to deliver new value for your business and your customers.

Think big but start small

As we dive into digital transformation, we should all think big. But when the rubber hits the road, most of us will start small with easy wins in the slower lane and move up the value chain. One model goes like this:

  • Better IT: Focus on cloud capabilities, getting right with your data, and optimizing systems.
  • Better decisions: Increase transparency, improve decision support, and automate decision-making where possible.
  • Better processes: Collapse cycle times, increase process flexibility, increase collaboration, build predictive capabilities.
  • Better products and services: Digitalize and optimize your offerings – and create new offerings that were previously untenable.
  • Better business models: Innovate with flexibility and devise outcome-based business models that help you enter new markets, attract new customers, and deliver the value that helps retain customers.

One study by the authors of “Leading Digital” (George Westerman, Didier Bonnet, and Andrew McAfee) finds that companies with stronger digital capabilities are better at driving revenue with their physical assets. Strong digital companies are 26% more profitable than their industry peers and generate 9% more revenue from their assets. Beginners in the realm of digital transformation trail their competitors by 4% in revenue-generation efficiency and 24% in profitability.

While it might make sense for companies to use their increased after-tax cash flow for a wide range of laudable purposes, here at SAP, we’re dedicated to helping those that want to achieve long-term benefits through digital transformation. There’s a lot of opportunity out there – and companies with the capacity to change can seize it.

Gather more insight on Why Strategic Plans Need Multiple Futures.

Follow SAP Finance online: @SAPFinance (Twitter) | LinkedIn | Facebook | YouTube


Digitalist Magazine

Expert Interview (Part 1): Human Centered Design and Elise Roy on Transforming Disability into Innovation

Elise Roy says that losing her hearing when she was 10 years old has been one of the greatest gifts she’s ever received.

Early on, she viewed her loss as something she had to deal with and overcome. That perspective has shifted, though.

“My disability has become an asset,” Elise says. “Rather than something I have to deal with, it’s a tool.”


A tool Elise has leveraged in just about every job she’s taken on – as one of the country’s few deaf lawyers, as an artist and designer, and as a human rights activist.

Most recently, she’s started working as a consultant, using her unique perspective to help organizations take a different approach to their design practices. Her goal is to show the groups she works with that incorporating a deeper understanding of how the disabled navigate the world will lead to extraordinary innovation and results.

“I believe that these unique experiences that people with disabilities have is what’s going to help us make and design a better world … both for people with and without disabilities,” she shared in her TED talk.

She consults through the lens of Human Centered Design: trying to develop the best product by defining problems and understanding constraints, observing people in real-world situations, asking questions, and then prototyping to test ideas quickly and cheaply – all while keeping the end users, the customers, in focus.

Elise learned first-hand how effective this method of problem-solving is back when she was taking a fabrication class in art school. The tools she was using for woodworking would sometimes kick back at her. Generally, they would emit a sound before doing this, but because of her hearing loss, Elise wasn’t able to hear it. In response, she developed a pair of safety goggles that give a visual warning when the pitch of the machine changes. The product can help protect both those who are hearing impaired and those with no hearing loss.

She points to other widely used inventions that were initially created for people with a disability, too. Email and text messaging, for instance, were designed for deaf users.

The OXO potato peeler was designed to help individuals with arthritis but was adopted by the general population because of how comfortable it is to use. And there are tech companies currently developing apps and websites that look to people with dyslexia and intellectual disabilities for inspiration on simplifying design and offering an easier-to-use interface for everyone.

Check back for Part 2, where Elise goes more in depth into what she is doing with Human Centered Design.

Also, we have a new eBook focused on Strategies for Improving Big Data Quality available for download.


Syncsort Blog

How IKEA Builds Sustainable Innovation Into Its Business Model To Improve Lives


Most people think of a discarded plastic bottle as waste. But at IKEA, it’s a resource.

The IKEA brand has a set of commandments for doing business, and not wasting resources is one of them.

For example, every year about 100 billion plastic water bottles are used worldwide, but only 30 percent are recycled; the rest end up in landfills or pollute the ocean. The furniture giant is committed to proving that recycled plastic can be used in the large-scale production of household goods. One example is a new line of kitchen fronts made of recycled plastic and reclaimed industrial wood. The result is a product line that’s not only durable and beautiful but also sustainable.

Thanks to some of IKEA’s other business commandments, such as thinking differently and taking responsibility, the company is showing the world how a circular economy can function at scale in every part of their business.

“We’re even looking into circular solutions for our hardware equipment so it won’t end up in a landfill,” says Kristin Grimsdottir. As sustainability manager for Operations & Shared Services at IKEA Group, she is responsible for a team that runs and enables sustainable IT solutions for IKEA Group.

Democratic design

When asked to describe IKEA’s vision for the future at the recent ThinkX event in Stockholm co-sponsored by SAP and Singularity University, Kristin Grimsdottir responds with passion.

“We are not merely a home furnishing company; we focus on life at home and how we can make it better for people. For instance, we’re already helping customers generate their own energy with home solar panels and battery storage options and exploring the area of urban organic farming so you can grow your own food in your kitchen,” she explains.

It is one of IKEA’s core beliefs that everyone has a right to a better everyday life. IKEA’s business idea is to offer well-designed furniture at an affordable price for the many people. One of the big movements going forward is about becoming even more affordable so that many more people can enjoy a better life at home – without compromising on sustainability, quality, or design.

This is possible thanks to the company’s principles of Democratic design. For every new product, the design team first sets the price and then works from there to create functional, attractive, high-quality items from sustainable materials.

Purpose-driven growth

A circular IKEA that reuses or recycles all materials is one way to prepare for the future; another is to drive efficiency through digitalization.

Clearly, there is no lack of innovation in the company. What’s missing is the seamless experience for customers that is a must in the digital world.

While IKEA is actively rolling out its e-commerce solution, Grimsdottir admits that there is still room for improvement on the e-commerce front. But IKEA is embracing digital technologies elsewhere too. “For example, we’ve implemented IKEA Place, an augmented reality app that helps you decorate your home virtually,” she says.

For IKEA, continued growth requires the transformation of business and IT to implement a more modern IT landscape, develop advanced analytics capabilities, and implement more efficient end-to-end processes. But more importantly, it also requires full buy-in from employees.

“Change is the new normal,” says Grimsdottir, “so it’s important that all of us try to embrace it. We are not implementing automation technology/AI in order to get rid of people but to streamline processes in order to reduce waste and increase efficiency and precision. It is important to be better to meet our customers’ expectations. And that gives our co-workers the opportunity to grow and develop more human, less robotic skills that are believed to be even more critical in the future.”

Every company has a purpose. For IKEA, it’s about creating a better everyday life for the many people without compromising on price, form, function, quality, or the environment.

After all, as IKEA founder Ingvar Kamprad said, “To design a desk which may cost $1,000 is easy for a furniture designer, but to design a functional and good desk which shall cost $50 can only be done by the very best.”

This story also appears on SAP Innovation Spotlight.


Digitalist Magazine

WATCH: SPIDER-MAN: INTO THE SPIDER-VERSE, Starring Shameik Moore, Hailee Steinfeld, And Mahershala Ali

The official trailer for Sony Pictures Animation’s SPIDER-MAN: INTO THE SPIDER-VERSE, starring Shameik Moore, Hailee Steinfeld, Mahershala Ali, and more, has been released!
Watch the trailer below:



The Humor Mill

Bringing Errant Kitten Back Into The Fold


Time out until back in the crib.

“Parenting is hard.”
Image courtesy of https://imgur.com/gallery/YjEOVCp.


Quipster

Importing JSON Collections into SQL Server

It is fairly easy to import JSON collections of documents into SQL Server if there is an underlying ‘explicit’ table schema available to them. If each of the documents has a different schema, then you have little chance. Fortunately, schema-less data collections are rare.

In this article we’ll start simply and work through a couple of examples before ending by creating a SQL Server database schema with ten tables, constraints, and keys. Once those are in place, we’ll import a single JSON document, filling the ten tables with the data of 70,000 fake records.

Let’s start this gently, putting simple collections into strings which we will insert into a table. We’ll then try slightly trickier JSON documents with embedded arrays and so on. We’ll start with the example of sheep-counting words, collected from many different parts of Great Britain and Brittany. The simple aim is to put them into a table. I don’t use sheep-counting words because they are of general importance but because they can represent whatever data you are trying to import.

You will need access to SQL Server 2016 or later, or Azure SQL Database or Azure SQL Data Warehouse, to play along, and you can download the data and code from GitHub.

Converting Simple JSON Arrays of Objects to Table-sources

We will start off by creating a simple table that we want to import into.
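
A minimal sketch of such a table, with assumed names:

    CREATE TABLE dbo.SheepCountingWords
      (
      Region NVARCHAR(40) NOT NULL, -- where the word was collected
      Number INT NOT NULL,          -- the number being counted
      Word NVARCHAR(40) NOT NULL,   -- the counting word itself
      CONSTRAINT PK_SheepCountingWords PRIMARY KEY (Region, Number)
      );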

We then choose a simple JSON format.
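
An array of objects whose keys mirror the columns will do; the sample values here are illustrative:

    DECLARE @JSON NVARCHAR(MAX) = N'
    [
      {"Region": "Lincolnshire", "Number": 1, "Word": "Yan"},
      {"Region": "Lincolnshire", "Number": 2, "Word": "Tan"}
    ]';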

We can very easily use OpenJSON to create a table-source that reflects the contents.
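
A sketch of the idea, consuming the string declared above:

    SELECT Region, Number, Word
      FROM OpenJSON(@JSON)
        WITH
          (
          Region NVARCHAR(40) '$.Region',
          Number INT '$.Number',
          Word NVARCHAR(40) '$.Word'
          );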

Once you have a table source, the quickest way to insert JSON into a table will always be the straight insert, even after an existence check. It is a good practice to make the process idempotent by only inserting the records that don’t already exist. I’ll use the MERGE statement just to keep things simple, though the left outer join with a null check is faster. The MERGE is often more convenient because it will accept a table-source such as a result from the OpenJSON function. We’ll create a temporary procedure to insert the JSON data into the table.
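
A sketch of such a procedure, using the assumed names from above:

    CREATE PROCEDURE #SaveJSONToTable @JSON NVARCHAR(MAX)
    AS
    MERGE dbo.SheepCountingWords AS target
    USING
      (
      SELECT Region, Number, Word
        FROM OpenJSON(@JSON)
          WITH (Region NVARCHAR(40), Number INT, Word NVARCHAR(40))
      ) AS source
    ON target.Region = source.Region AND target.Number = source.Number
    WHEN NOT MATCHED THEN -- only insert words that aren't already there
      INSERT (Region, Number, Word)
      VALUES (source.Region, source.Number, source.Word);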

Now we try it out. Let’s assemble a couple of simple JSON strings from a table-source.

Now we can EXECUTE the procedure to store the sheep-counting words in the table:
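
For example (the words for each region are illustrative):

    EXECUTE #SaveJSONToTable @JSON = N'
    [
      {"Region": "Lincolnshire", "Number": 1, "Word": "Yan"},
      {"Region": "Lincolnshire", "Number": 2, "Word": "Tan"},
      {"Region": "Derbyshire", "Number": 1, "Word": "Yain"},
      {"Region": "Derbyshire", "Number": 2, "Word": "Tain"}
    ]';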

Check to see that they were imported correctly by running this query:
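
A sketch along these lines:

    SELECT Region, Number, Word
      FROM dbo.SheepCountingWords
      ORDER BY Region, Number;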


Converting to Table-source JSON Arrays of Objects that have Embedded Arrays

What if you want to import the sheep-counting words from several regions? So far, what we’ve been doing is fine for a collection that models a single table. However, real life isn’t like that. Not even Sheep-Counting Words are like that. A little internalized Chris Date will be whispering in your ear that there are two relations here, a region and the name for a number.

Your JSON for a database of sheep-counting words will more likely look like this (I’ve just reduced it to two numbers in the sequence array rather than the original twenty). Each JSON document in our collection has an embedded array.
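
A sketch, assuming the embedded array is keyed Sequence:

    DECLARE @JSON NVARCHAR(MAX) = N'
    [
      {"Region": "Lincolnshire",
       "Sequence": [{"Number": 1, "Word": "Yan"}, {"Number": 2, "Word": "Tan"}]},
      {"Region": "Derbyshire",
       "Sequence": [{"Number": 1, "Word": "Yain"}, {"Number": 2, "Word": "Tain"}]}
    ]';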

After a bit of thought, we remember that the OpenJSON function actually allows you to put a JSON value in a column of the result. This means that you just need to CROSS APPLY each embedded array, passing to the ‘cross-applied’ OpenJSON function the JSON fragment representing the array, which it will then parse for you.
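
A sketch of the CROSS APPLY technique against the document above:

    SELECT Regions.Region, Words.Number, Words.Word
      FROM OpenJSON(@JSON)
        WITH (Region NVARCHAR(40), Sequence NVARCHAR(MAX) AS JSON) AS Regions
      CROSS APPLY OpenJSON(Regions.Sequence)
        WITH (Number INT, Word NVARCHAR(40)) AS Words;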

I haven’t found the fact documented anywhere, but you can leave out the path elements from the column declaration of the WITH statement if the columns are exactly the same as the JSON keys, with matching case.

The ability to drill into sub-arrays by cross-joining OpenJSON function calls allows us to easily insert a large collection with a number of documents that have embedded arrays. This is looking a lot more like something that could, for example, tackle the import of a MongoDB collection as long as it was exported as a document array with commas between documents. I’ll include, with the download on GitHub, the JSON file that contains all the sheep-counting words that have been collected. Here is the updated stored procedure:
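
A sketch under the same assumptions (CREATE OR ALTER needs SQL Server 2016 SP1 or later):

    CREATE OR ALTER PROCEDURE #SaveJSONToTable @JSON NVARCHAR(MAX)
    AS
    MERGE dbo.SheepCountingWords AS target
    USING
      (
      SELECT Regions.Region, Words.Number, Words.Word
        FROM OpenJSON(@JSON)
          WITH (Region NVARCHAR(40), Sequence NVARCHAR(MAX) AS JSON) AS Regions
        CROSS APPLY OpenJSON(Regions.Sequence)
          WITH (Number INT, Word NVARCHAR(40)) AS Words
      ) AS source
    ON target.Region = source.Region AND target.Number = source.Number
    WHEN NOT MATCHED THEN
      INSERT (Region, Number, Word)
      VALUES (source.Region, source.Number, source.Word);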

We can now very quickly ingest the whole collection into our table, pulling the data in from the file. We include this file with the download on GitHub, so you can try it out. There are thirty-three different regions in the JSON file.
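
Pulling the file in might look like this, with an illustrative path (SINGLE_NCLOB expects a UTF-16 file):

    DECLARE @JSON NVARCHAR(MAX) =
      (
      SELECT BulkColumn
        FROM OPENROWSET (BULK 'C:\data\SheepCountingWords.json', SINGLE_NCLOB) AS document
      );
    EXECUTE #SaveJSONToTable @JSON;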

We can now check that it is all in and correct
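
A quick sanity check could count the words collected for each region:

    SELECT Region, COUNT(*) AS WordsCollected
      FROM dbo.SheepCountingWords
      GROUP BY Region
      ORDER BY Region; -- thirty-three regions expected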


Just as a side note, this data was collected for this article from various places on the internet, but mainly from Yan Tan Tethera. Each table was pasted into Excel and tidied up. The JSON code was created by using three simple functions: one for the cell-level value, one for the row value, and a final summation. This allowed simple adding, editing, and deleting of data items. The technique is only suitable where columns are of fixed length.

Importing a More Complex JSON Data Collection into a SQL Server Database

We have successfully imported the very simplest JSON files into SQL Server. Now we need to consider those cases where the JSON document or collection represents more than one table.

In any relational database, we can take two approaches to JSON data: we can accommodate it, meaning we treat it as an ‘atomic’ unit and store the JSON unprocessed, or we can assimilate it, meaning that we turn the data into a relational format that can be easily indexed and accessed.

  • To accommodate JSON, we store it as a CLOB, usually NVARCHAR(MAX), with extra columns containing the extracted values for the data fields with which you would want to index the data. This is fine where all the database has to do is to store an application object without understanding it.
  • To assimilate JSON, we need to extract all the JSON data and store it in a relational form.

Our example represents a very simple customer database with ten linked tables. We will first accommodate the JSON document by creating a table (dbo.JSONDocuments) that merely stores, in each row, the reference to the customer, along with all the information about that customer, each aspect (addresses, phones, email addresses and so on) in separate columns as CLOB JSON strings.

We then use this table to successively assimilate each JSON column into the relational database.

This means that we need to parse the full document only once.

To be clear about the contents of the JSON file, we will be cheating by using spoof data. We would never have unencrypted personal information in a database or a JSON file. Credit Card information would never be unencrypted. This data is generated entirely by SQL Data Generator, and the JSON collection contains 70,000 documents. The method of doing it is described here.

We’ll make other compromises. We’ll have no personal identifiers either. We will simply use the document order. In reality, the JSON would store the surrogate key of person_id.

The individual documents will look something like this.
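
Along these lines, with every key name and value illustrative (the card number is a standard test number):

    {
      "FullName": "Dorothy Vallens",
      "Addresses": [
        {"TypeOfAddress": "Home", "Start": "2015-06-01",
         "AddressLine1": "12 High Street", "City": "Lincoln", "PostCode": "LN1 3DY"}
      ],
      "Cards": [
        {"CardNumber": "4111111111111111", "ValidFrom": "2017-01-01", "ValidTo": "2020-01-01"}
      ],
      "EmailAddresses": [
        {"EmailAddress": "dorothy@example.com", "StartDate": "2015-06-01"}
      ],
      "Notes": [
        {"Note": "Overdue invoice payment", "InsertionDate": "2018-02-10"}
      ],
      "Phones": [
        {"TypeOfPhone": "Mobile", "DiallingNumber": "07700 900123",
         "Dates": [{"Start": "2015-06-01", "End": null}]}
      ]
    }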

We will import this into a SQL Server database designed like this:

[Figure: the database schema diagram of the ten linked customer tables]

The build script is included with the download on GitHub.

So, all we need now is the batch to import the JSON file that contains the collection and populate the tables with the data. We will now describe individual parts of the batch.

We start out by reading the customersUTF16.json file into a variable.
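
A sketch, with an illustrative path; SINGLE_NCLOB suits the UTF-16 encoding that the filename suggests:

    DECLARE @JSON NVARCHAR(MAX) =
      (
      SELECT BulkColumn
        FROM OPENROWSET (BULK 'C:\data\customersUTF16.json', SINGLE_NCLOB) AS document
      );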

The next step is to create a table at the document level, with the main arrays within each document represented by columns. (In some cases, there are sub-arrays; the phone numbers, for example, have an array of dates.) This means that this initial slicing of the JSON collection needs to be done only once. In our case, the columns are as follows, with a sketch of the table after the list:

  • The details of the Name,
  • Addresses,
  • Credit Cards,
  • Email Addresses,
  • Notes,
  • Phone numbers
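
A sketch of such a document-level table, with assumed names:

    CREATE TABLE dbo.JSONDocuments
      (
      Document_id INT NOT NULL PRIMARY KEY, -- the document order doubles as person_id
      FullName NVARCHAR(100) NULL,          -- root data, extracted for convenience
      Addresses NVARCHAR(MAX) NULL,         -- each remaining column holds a JSON array
      Cards NVARCHAR(MAX) NULL,
      EmailAddresses NVARCHAR(MAX) NULL,
      Notes NVARCHAR(MAX) NULL,
      Phones NVARCHAR(MAX) NULL
      );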

We fill this table via a call to OpenJSON. By doing this, we have the main details of each customer available to us when slicing up embedded arrays. The batch is designed so that it can be rerun and should be idempotent. This means that there is less of a requirement to run the process in a single transaction.

Now we fill this table with a row for each document, each row representing the entire data for a customer. Each item of root data, such as the id and the customer’s full name, is held as a column. All other columns hold JSON. This table is an ‘accommodation’ to the JSON data: each row represents a customer, but each JSON document in the collection is shredded to provide JSON strings that represent the attributes and relations of that customer.
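
A sketch of the fill, shredding each document once and keeping the arrays as JSON fragments:

    INSERT INTO dbo.JSONDocuments
      (Document_id, FullName, Addresses, Cards, EmailAddresses, Notes, Phones)
    SELECT CONVERT(INT, [key]) + 1,           -- the document order
           JSON_VALUE(value, '$.FullName'),   -- root data becomes a column
           JSON_QUERY(value, '$.Addresses'),  -- the rest stay as JSON strings
           JSON_QUERY(value, '$.Cards'),
           JSON_QUERY(value, '$.EmailAddresses'),
           JSON_QUERY(value, '$.Notes'),
           JSON_QUERY(value, '$.Phones')
      FROM OpenJSON(@JSON)
      WHERE NOT EXISTS (SELECT * FROM dbo.JSONDocuments); -- crude guard for reruns

We can now assimilate this data step by step.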

First we need to create an entry in the person table if it doesn’t already exist, as that has the person_id. We need to do this first because otherwise the foreign key constraints will protest.
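
A sketch, assuming a dbo.Person table keyed on person_id:

    INSERT INTO dbo.Person (person_id, FullName)
    SELECT document.Document_id, document.FullName
      FROM dbo.JSONDocuments AS document
      WHERE NOT EXISTS -- don't re-insert people on a rerun
        (SELECT * FROM dbo.Person AS p WHERE p.person_id = document.Document_id);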

Now we do the notes. We’ll do these first because they are a bit awkward: there is a many-to-many relationship between notes and people, because the same standard note can be associated with many customers (an overdue invoice payment, for example). We’ll use a table variable to allow us to guard against inserting duplicate records.
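
A sketch of the approach, assuming a dbo.Note table (note_id, Note) and a dbo.NotePerson linking table:

    DECLARE @Notes TABLE (person_id INT, Note NVARCHAR(MAX), InsertionDate DATETIME2);
    INSERT INTO @Notes (person_id, Note, InsertionDate)
      SELECT document.Document_id, note.Note, note.InsertionDate
        FROM dbo.JSONDocuments AS document
        CROSS APPLY OpenJSON(document.Notes)
          WITH (Note NVARCHAR(MAX), InsertionDate DATETIME2) AS note;
    -- add any note text that isn't already stored (notes are shared between people)
    INSERT INTO dbo.Note (Note)
      SELECT DISTINCT Note FROM @Notes AS new_notes
        WHERE NOT EXISTS (SELECT * FROM dbo.Note WHERE dbo.Note.Note = new_notes.Note);
    -- then link each person to the relevant note
    INSERT INTO dbo.NotePerson (person_id, note_id, InsertionDate)
      SELECT new_notes.person_id, n.note_id, new_notes.InsertionDate
        FROM @Notes AS new_notes
          JOIN dbo.Note AS n ON n.Note = new_notes.Note
        WHERE NOT EXISTS
          (SELECT * FROM dbo.NotePerson AS np
            WHERE np.person_id = new_notes.person_id AND np.note_id = n.note_id);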

Addresses are complicated because they involve three tables. There is the address, which is the physical place; the abode, which records when and why the person was associated with the place; and a third table that constrains the type of abode. We create a table variable to support the various queries without any extra shredding.
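
A sketch of the shredding into the table variable; the inserts into the three tables then follow the same duplicate-guarded pattern as the notes:

    DECLARE @Addresses TABLE
      (person_id INT, TypeOfAddress NVARCHAR(40), Start DATE, [End] DATE,
       AddressLine1 NVARCHAR(60), City NVARCHAR(40), PostCode NVARCHAR(20));
    INSERT INTO @Addresses
      SELECT document.Document_id, address.TypeOfAddress, address.Start, address.[End],
             address.AddressLine1, address.City, address.PostCode
        FROM dbo.JSONDocuments AS document
        CROSS APPLY OpenJSON(document.Addresses)
          WITH (TypeOfAddress NVARCHAR(40), Start DATE, [End] DATE,
                AddressLine1 NVARCHAR(60), City NVARCHAR(40), PostCode NVARCHAR(20)) AS address;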

Credit cards are much easier since they are a simple sub-array.
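
A sketch, assuming a dbo.CreditCard table:

    INSERT INTO dbo.CreditCard (person_id, CardNumber, ValidFrom, ValidTo)
    SELECT document.Document_id, card.CardNumber, card.ValidFrom, card.ValidTo
      FROM dbo.JSONDocuments AS document
      CROSS APPLY OpenJSON(document.Cards)
        WITH (CardNumber NVARCHAR(20), ValidFrom DATE, ValidTo DATE) AS card
      WHERE NOT EXISTS
        (SELECT * FROM dbo.CreditCard AS cc
          WHERE cc.person_id = document.Document_id
            AND cc.CardNumber = card.CardNumber);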

Email Addresses are also simple. We’re on the downhill slopes now.
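
Much the same, assuming a dbo.EmailAddress table:

    INSERT INTO dbo.EmailAddress (person_id, EmailAddress, StartDate)
    SELECT document.Document_id, email.EmailAddress, email.StartDate
      FROM dbo.JSONDocuments AS document
      CROSS APPLY OpenJSON(document.EmailAddresses)
        WITH (EmailAddress NVARCHAR(80), StartDate DATE) AS email
      WHERE NOT EXISTS
        (SELECT * FROM dbo.EmailAddress AS ea
          WHERE ea.person_id = document.Document_id
            AND ea.EmailAddress = email.EmailAddress);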

Now we add these customers’ phones. The various dates for the start and end of the use of each phone number are held in a subarray within the individual phone objects. That makes things slightly more awkward.
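
The nested dates just need a second CROSS APPLY; a sketch, assuming a dbo.Phone table:

    INSERT INTO dbo.Phone (person_id, TypeOfPhone, DiallingNumber, Start, [End])
    SELECT document.Document_id, phone.TypeOfPhone, phone.DiallingNumber,
           dates.Start, dates.[End]
      FROM dbo.JSONDocuments AS document
      CROSS APPLY OpenJSON(document.Phones)
        WITH (TypeOfPhone NVARCHAR(40), DiallingNumber NVARCHAR(20),
              Dates NVARCHAR(MAX) AS JSON) AS phone
      CROSS APPLY OpenJSON(phone.Dates)
        WITH (Start DATE, [End] DATE) AS dates
      WHERE NOT EXISTS -- the same duplicate guard as the previous inserts
        (SELECT * FROM dbo.Phone AS ph
          WHERE ph.person_id = document.Document_id
            AND ph.DiallingNumber = phone.DiallingNumber
            AND ph.Start = dates.Start);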

Conclusion

JSON support in SQL Server has been the result of a long wait, but now that we have it, it opens up several possibilities.

No SQL Server developer or admin needs to rule out using JSON for ETL (Extract, Transform, Load) processes to pass data between JSON-based document databases and SQL Server. The features that SQL Server has are sufficient, and far easier to use than SQL Server’s XML support.

A typical SQL Server database is far more complex than the simple example used in this article, but it is certainly not an outrageous idea that a database could have its essential static data drawn from JSON documents: these are more versatile than VALUES statements and more efficient than individual INSERT statements.

I’m inclined to smile on the idea of transferring data between the application and database as JSON. It is usually easier for front-end application programmers, and we database folks can, at last, do all the checks and transformations to accommodate data within the arcane relational world, rather than insist on the application programmer doing it. It will also decouple the application and database to the extent that the two no longer need to shadow each other in terms of revisions.

JSON collections of documents represent an industry-standard way of transferring data. It is today’s CSV, and it is good to know that SQL Server can support it.


SQL – Simple Talk

Lessons from Amazon’s Entry into B2B

Posted by Kristin Swenson, Wholesale Distribution Industry Marketing Lead

Amazon’s rise to power is no secret – but recently, Amazon has proven that its influence goes beyond just the B2C world. Through Amazon Business, the company has successfully entered the B2B marketplace. This move is having a profound impact on manufacturers, distributors and retailers alike.

In a recent webinar, “From B2C to B2B: Amazon’s Entry into the B2B World,” NetSuite hosted Colin Puckett, Amazon Business’s Head of Seller Marketing, along with distribution industry leader Scott Costa, publisher of tED Magazine, and ecommerce expert Brian Beck, SVP of Ecommerce and Omnichannel Strategy at Guidance, a retail services business, for a discussion on Amazon Business. Here are the key takeaways.

Three Approaches to Amazon Business

The panelists categorized manufacturers, distributors and retailers into three groups as it relates to Amazon Business:

  • Those who are willing to partner with Amazon Business to expand their reach into new markets or acquire new customers using Amazon’s established channels.
  • Those who view Amazon Business as a direct competitor – this group tends to lag behind when it comes to an ecommerce strategy: either they do not have their own website, or their online experience is difficult to navigate.
  • Those who recognize the impact of Amazon Business and are choosing to compete by establishing their own website and competitive online buying experience.

Rather than taking a hard stance on whether to partner with Amazon Business, Scott and Brian agreed that Amazon Business’ biggest impact is in driving the need for an ecommerce strategy. The rise of Amazon in both the B2C and B2B space shows that buyer preferences and habits have shifted. Consider the startling stat Beck shared – 50 percent of product searches now start on Amazon. Now more than ever it is important to recognize the importance of establishing and executing on an ecommerce strategy. Refusing to do so means falling behind in a dynamic world.

Strategically Partner with Amazon Based on Your Business Model 

For those who are considering, or choose to partner with Amazon Business, it is important to recognize the differences in partnership options available.

Vendor Central/ 1st Party/ 1P: Amazon sells your products through a wholesale relationship. You send your inventory to Amazon, they control your pricing, and your listing displays as “Ships from and sold by Amazon.com”.

Seller Central/ 3rd Party/ 3P: You sell your products on Amazon’s marketplace.

  • With this, you either have Amazon fulfill your orders from their fulfillment centers (Fulfillment by Amazon, FBA) or you can fulfill orders from your own warehouse or 3rd party warehouses (Fulfillment by Merchant, FBM).

Each option comes with pros and cons and will ultimately depend on the level of control and responsibility management is looking for. Many companies choose to partner with an agency to help them identify the best Amazon strategy, as well as to directly manage their Amazon partnership.

Amazon has shown that it is not shying away from competing in the B2B marketplace. As a business, it is up to you to determine whether partnering with Amazon Business is the right strategy for your organization. No matter your decision, it is important to recognize the influence Amazon Business is having on the B2B industry – and make the necessary adjustments to your business model now.

Access the “From B2C to B2B: Amazon’s Entry into the B2B World” webinar recording here.

Posted on Tue, May 22, 2018, by NetSuite.


The NetSuite Blog

GDPR-friendly Web Forms that Feed into Dynamics 365

With the deadline to meet the GDPR (General Data Protection Regulation) fast approaching, we want to make it easier for our customers to collect the consent they need from their clients, and track that in Dynamics 365. We’re going to be using the PowerWebForm add-on, which allows users to quickly and easily create a web form to post on their website, and feed the web form submissions directly into Dynamics 365. The GDPR-friendly form can be accomplished in just a few easy steps.

1. Import the PowerWebForm add-on if you don’t already have it in your system, and register for a free 30-day trial.

2. Create two new fields on your lead and contact entities: Consented and Consented Date (or whatever you’d like to call them). The “Consented” field should be a Two Option type field, with the values Yes and No. The “Consented Date” field should be a Date type field.

3. Build the web form. To do this, navigate to the web form entity under the PowerPack area and click New. Fill in the required information and Save (more details on exactly what each field does can be found in the user guide). We’re choosing to create leads with our web form submissions, but you can create whatever you’d like. The cool thing about PowerWebForm is that duplicate detection is built in – for multiple record types! – so if the lead (or contact) already exists in the system, it won’t create a duplicate.


4. Create fields for the web form by scrolling down to the form fields section of the web form record and clicking the + sign on the grid. Make sure to create a Consented checkbox, which will be mapped to the field you created above. If you make this a required field, the person filling out the form will have to check the checkbox in order to submit their inquiry. We’re going to leave ours as not required.


5. (Optional) If you’d like to set up duplicate detection for not just leads (which is what the web form above is currently checking on) but also for contacts, navigate to the related Duplicate Check entity, and create a Duplicate Check for contacts as well.


To recap, here’s what we’ve done so far: we created a web form that, when a form is submitted, looks for a duplicate lead in the system by comparing the email address entered on the form with the email address on the lead. If a duplicate is not found, it will then look for a duplicate contact, based on the email address on the contact. If a duplicate is still not found, a new lead will be created, and a web form activity will be created and associated to the newly created lead. If a duplicate lead OR contact is found, a web form activity will be created and associated to the existing record. (Note: the information on the lead/contact will NOT be updated.)

The next step is to create a workflow that updates the “Consented Date” and “Consented” fields on the newly created or existing lead or contact, if they’ve checked the “Consented” checkbox on the web form. This workflow will only update those two fields if the person submitting the form DID consent. So, we’ll need a parent workflow that checks the value of the form submission, and then a child workflow that updates the related lead or contact.

Parent workflow:

[Screenshot: the parent workflow, which checks the Consented value on the web form submission]

Child workflow:

[Screenshot: the child workflow, which updates the related lead or contact]

Update step:

[Screenshot: the update step, which sets the Consented and Consented Date fields]

Note: If you are doing this on multiple web forms, make sure that the “consented” field on the web form is exactly the same across all web forms. If it is, then you only have to build one set of workflows, as listed above, since the value will be the same if someone consents, no matter what form it’s on.

As always, it’s important to test your entire process, so you know that it’s working correctly. But that’s it! You now have a process that asks specifically for consent and tracks it back to the leads/contacts in your system.

For more helpful tips and tricks – subscribe to our blog!

Happy D365’ing!


PowerObjects- Bringing Focus to Dynamics CRM

Lucidworks raises $50 million for companies to build smarter search into their applications

Lucidworks, a company that meshes big data with artificial intelligence (AI) to help companies build smart search-based applications, has raised $50 million in a round of funding led by Top Tier Capital Partners, with participation from Silver Lake’s growth capital fund, Silver Lake Waterman.

Founded out of California in 2007, Lucidworks enables companies to design and develop search technology for myriad applications, from financial services to ecommerce. Lucidworks’ platform basically allows companies to incorporate intelligent search features into their products without having to build their own systems from scratch and enables them to better compete with the likes of Amazon or Google.

Above: Lucidworks search example

Lucidworks had previously raised around $59 million in funding, and with another $50 million in the bank the company plans to expand its enterprise product and “help the world’s leading companies bring smart data experiences to the market,” according to a statement.

Lucidworks’ promise, ultimately, is to bring relevant search results based on user intent garnered from various online signals. Plus, the product makes it easier to introduce such search features as product recommendations, spell-check, auto-suggestions based on partial keywords, results tailored to a specific device, and synonym management.

Reddit and weep

One of Lucidworks’ most high-profile partnerships in recent times was with Reddit, which announced last year that it was building a new search infrastructure to better surface relevant content. As the “front page of the internet,” Reddit was a major score for Lucidworks in the consumer realm. Other clients include Uber, Staples, Qualcomm, Dell, AT&T, Red Hat, and the Financial Times.

Alongside this funding round, Lucidworks CEO Will Hayes also sought to clarify what his company is and isn’t and suggested that its technology has been a little misunderstood over the years.

“The industry hasn’t done a good job categorizing our technology. Lucidworks has the potential to impact virtually every industry by making complex data science available to end users directly,” he said. “The true power of that potential just isn’t captured by ‘enterprise search’ or ‘insight engine’ or any of the other industry language currently in use. We’re building around the idea of ‘smart data experiences’. As a company, this is the direction that guides us beyond artificial intelligence, machine learning, neural networks, and all the buzzwords. We want to solve the last mile problem in AI: how to make sure that users who can most benefit from insights discover them without having to be PhDs in data science.”



Big Data – VentureBeat