Category Archives: Business Intelligence

China's JD To Set Up Two JVs With Central Group In Thailand

JD Group, JD Finance, Thailand’s leading retailer Central Group, and Provident Capital together announced that they would set up two joint ventures in Thailand to provide e-commerce and fintech services, with total investments of USD500 million.

According to the agreement, Central Group will provide half of the funds, while JD Group, JD Finance, and Provident Capital, JD’s strategic partner in its Indonesian e-commerce business, will provide the other half.

JD will provide extensive professional skill support to the e-commerce joint venture, covering technology, e-commerce, and logistics. JD Finance will offer related fintech experience and technologies such as artificial intelligence, cloud computing, and other industry-leading technologies.

At the same time, Central Group will use its retail resource advantages, including a physical store network, brand relationships and vendor resources, and its popular membership program “The 1 Card”, to contribute to the development of the two joint ventures. In addition, Central Group will open several flagship stores on the platform of the e-commerce joint venture to introduce its department store, major retail chain stores, and its owned or operated brands.

Liu Qiangdong, chairman and chief executive officer of JD Group, said that Thailand has a large population and developed infrastructure, including a strong national logistics network, which represents huge potential for the development of e-commerce and fintech. By cooperating with a strong retail group like Central Group, JD will gain a huge competitive advantage in its expansion in Southeast Asia.

Tos Chirathivat, chief executive officer of Central Group, said that JD Group has achieved huge successes in the Chinese e-commerce market and is the best choice for an e-commerce partner. With the growing number of mobile Internet users and improving customer spending power, Thailand’s e-commerce market will soon see explosive growth. By cooperating with JD, Central Group will seize this opportunity to become a leading e-commerce provider in Thailand.

ChinaWirelessNews.com

Amazon, Google, Facebook, Microsoft: The scramble to beat Apple, dominate hardware, and own your future

Google’s $1.1 billion acquihire of HTC’s Pixel team is just the latest example of a mega trend that is turning the tech industry upside down and has enormous consequences for the role technology plays in all of our lives.

Put simply, the biggest of big tech companies have decided that to win, they must control every aspect of how we interact with technology. And that means building out Apple-like ecosystems of gadgets that capture as much of your data as possible and keep it within their very own walled gardens.

The promise to consumers is to create, like Apple, a system that offers ease and simplicity of use, and rewards users for investing more in a single brand. By the same measure, those systems become a trap that eliminates choice over time as they limit compatibility with competing systems.

In rocketing to becoming the world’s most valuable company, Apple has demonstrated the profound appeal of this integrated approach. With some exceptions now and then, Apple’s products are seductively simple to connect with each other, and it’s easy to manage a single account across those gadgets. At home, I have an Apple TV, iPad, 3 iPhones, and a MacBook Air. These all communicate seamlessly. I haven’t taken the Apple Watch plunge, though, and have no plans to get one.

Now, Apple is extending that ecosystem with the HomePod, a voice-activated speaker powered by Siri. But if you want to listen to music, you’ll need to have an Apple Music subscription. There is some speculation that at some point, Apple could open it up to third-party developers. For the moment, it will be a closed system when it goes on sale in December.

Of course, for Apple, this integration of hardware and software has been a consistent approach. And because the company doesn’t overtly make money from users’ data, it’s a bit less ominous, I suppose.

On the other hand, for rivals like Google, Amazon, and Facebook, that data is used now, and will increasingly be used in the future, for ad targeting, marketing campaigns, and efforts to get you to buy more and more.

Google-Alphabet has been going down this road for some time now, as it has tried with mixed success to become a hardware company. Its acquisition of Nest, its Chromecast stick for TV, Nexus phones, Chromebooks, and Google Home have helped ease the pain of failures like Google Glass. Still, this isn’t enough for Google, which 18 months ago hired Rick Osterloh to be its senior vice president of hardware.

It was Osterloh who apparently oversaw the deal with HTC and the decision to basically bring in-house the production of the Pixel phone, its attempt to create a flagship Android device.

“Our team’s goal is to offer the best Google experience — across hardware, software and services — to people around the world. We’re excited about the 2017 lineup, but even more inspired by what’s in store over the next five, 10, even 20 years,” he wrote in a blog post about the HTC deal. “Creating beautiful products that people rely on every single day is a journey, and we are investing for the long run.”

It is an innocent-sounding statement that is less than innocent.

Google is already under fire in Europe for abusing its search engine monopoly to beat back competitive shopping services. The EU is also investigating whether it used its Android OS to force handset manufacturers and consumers to use various Google services on their phones.

Google is planning to make more announcements about its hardware plans at an event on October 4. Even if Google products allow third-party developers access to the hardware, as with Android, there are major concerns that with the default being Google’s voice assistant or search, the company is hoping to grab data that might otherwise flow to, say, Apple or any other competitor.

Amazon’s push to lasso the physical and the virtual also has troubling overtones. From its start with the Kindle reader, and later the Amazon Fire TV stick, the company has a growing portfolio of products such as the Echo now driven by its Alexa voice assistant. The company is also eager for other hardware developers to build Alexa capability into third-party hardware. And it’s attracting a growing number of developers who are creating skills for Alexa.

The company has other niche hardware products, like the Dash buttons for ordering more goods, the Dash Wand for reading barcodes at home, and the Echo Look connected camera. This is creating a massive storehouse of information about user behavior and user data. How they mix and match that data is going to draw close scrutiny from regulators and privacy advocates, particularly as they invest more in physical stores like Whole Foods.

And then comes Facebook, which for the moment has made its biggest splash with its acquisition of Oculus VR. But the company increasingly realizes where the game is headed. A year ago, Facebook publicly announced it had built a 22,000-square-foot hardware lab. Some of that work focuses on its internal IT projects, like servers and switches. And some focuses on its various efforts to expand connectivity to remote areas.

But earlier this year, the company unveiled its Surround 360 cameras. Now there are reports it’s building a video chat device and its own smart speaker. Even possibly a modular smartphone.

Microsoft, in contrast, has tempered its hardware ambitions after a big push a few years ago. Though its phones have flopped and its Nokia deal was a bust, it has gotten traction with its Surface tablet, and of course the Xbox has had a long run as a top video gaming platform. Likely, Microsoft remains chastened by its various antitrust battles over the years, along with some of these costly flops. And like Google’s Android, it’s a bit restrained by its need to manage its long-standing OEM relationships.

But it’s also in some ways trying to zig away from these other tech giants by becoming easier to use on other platforms, particularly with some of its Office apps, in an effort to extend its appeal. And it’s focusing more on its cloud services competition against Google and Amazon.

Still, overall, the direction is clear. Consumers are going to be facing tough choices not just about a single product or service, but about the larger implications of what it means in terms of when and where that service or product might work. They are going to be forced to grapple with a larger number of interoperability issues as they try to figure out which service works with which hardware, and which hardware can be paired with which other hardware.

The path of least resistance, over time, will be to simply buy hardware and services from one company. And once you do, the barrier to switching to other products will become higher and higher. Companies will talk about being “open,” but they will only kinda sorta mean it.

While U.S. antitrust and consumer protection enforcement has become relatively toothless in recent years, these companies can also likely look forward to more intense scrutiny and legal battles with the European Union. The EU has been far more aggressive in ensuring that these companies don’t use their might to limit consumer choices. The expanding walled garden of hardware would seem to be painting a big red target on the backs of these tech companies.

In the meantime, consumers ought to remain vigilant. Each gadget unto itself can be fun, seductive, and even actually useful. But they should make sure that, as they step further into the embrace of one company, they aren’t entering into a relationship from which there is no exit.

Big Data – VentureBeat

Power BI Embedded capacity-based SKUs coming to Azure

Today, Microsoft announced new Power BI Embedded SKUs will be available in Azure beginning Oct. 2.

Power BI Embedded is intended to simplify how ISVs and developers use Power BI capabilities, helping them quickly add stunning visuals, reports and dashboards into their apps – in the same way apps built on Azure leverage services like Machine Learning and IoT. By enabling easy-to-navigate data exploration in their apps, ISVs allow their customers to make quick, informed decisions in context.

In May we announced the convergence of the Power BI and Power BI Embedded services. The convergence delivered one API surface, a consistent set of capabilities and access to the latest features across both services. Additionally, we introduced a capacity-based pricing model, simplifying how Power BI is consumed.

With Power BI Embedded, ISVs and developers will now also have added flexibility in how they embed intelligence in their apps using the Power BI APIs. ISVs and developers can take advantage of minimized development efforts to achieve faster time to market and differentiate themselves by infusing Microsoft’s world-class analytics engine in their app. Equally, developers can spend time focusing on their solution to meet customer demands, instead of developing visual analytics features. Additionally, Power BI Embedded enables you to work within the familiar development environments – Visual Studio and Azure – you already use.

Have an existing app with embedded Power BI content using Power BI Premium? If you are an ISV or developer delivering apps, or an organization using them, no action is needed – you (and your customers) can continue using these apps without interruption. If you have an existing app built on Power BI Workspace Collections and are interested in taking advantage of the converged API surface and the new, capacity-based Azure SKUs, visit the documentation for migration guidance.

New to embedding visuals, reports, and dashboards in your apps? ISVs and developers should look to Power BI Embedded – follow these steps to get started:

1. Set up the Power BI Embedded environment for development

Set up your environment for testing. Make sure you have an Azure Active Directory (Azure AD) tenant; you can use an existing tenant or create a new one. Then, create an Azure AD user and sign that user up for the Power BI service with a Power BI Pro license. Register your app in Azure AD, and then create an App Workspace in Power BI so you can publish reports.

Learn more

2. Embed content

Integrate your Power BI content into your application using the Power BI and JavaScript APIs. Authenticate with your Power BI account to embed your dashboards, reports and tiles. 

Get hands-on experience with the Power BI – Report Embed Sample
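
The embed step above boils down to building a configuration object and handing it to the powerbi-client JavaScript library. A minimal sketch of that configuration (all token, URL, and ID values are hypothetical placeholders, and the helper function name is my own):

```javascript
// Build the configuration object the powerbi-client API expects.
// In a real app the access token comes from your auth flow (or a
// generated embed token), and `powerbi`/`models` come from the
// powerbi-client script loaded in the page.
function buildEmbedConfig(accessToken, embedUrl, reportId) {
  return {
    type: 'report',
    tokenType: 1,             // models.TokenType.Embed in powerbi-client
    accessToken: accessToken, // token authorizing access to the report
    embedUrl: embedUrl,       // e.g. https://app.powerbi.com/reportEmbed
    id: reportId,             // the report's GUID
    settings: {
      filterPaneEnabled: true,
      navContentPaneEnabled: true
    }
  };
}

// In the browser you would then embed into a div container:
// const report = powerbi.embed(
//   document.getElementById('reportContainer'),
//   buildEmbedConfig(token, url, reportId));
```

The same shape is used for dashboards and tiles by changing `type` and the corresponding URL.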

3. Publish your solution to production

Once you’re ready, register your application in your production environment.

Learn more

Power BI Embedded uses an Azure consumption-based, hourly pricing model, with the ability to pause and resume the service and to scale up/down or out/in as necessary, without commitments. Watch the Azure website in the coming days for more information on SKU details, pricing, and availability.

Microsoft Power BI Blog | Microsoft Power BI

Embed your Power BI report with predefined filters

I have recently gotten this question a few times, so it is time to dust off my coding skills and get to work. The key question in both cases is: how can I make sure I filter my report before any queries are sent to the underlying data source?

When using Power BI Embedded you can use JavaScript to pass in a filter instead of using the Power BI UI. To test how this works, I installed the sample app from GitHub: https://github.com/Microsoft/PowerBI-Developer-Samples and then uploaded a report to my Power BI tenant that uses an on-premises AS model. I use that so I can easily profile the queries and show you how it works. The same applies to any data source.

Here is the embedded report that currently shows all colors:

The DAX query sent to my SSAS instance is the following:

EVALUATE
TOPN(
     1002,
     SUMMARIZECOLUMNS(
                      'DimProduct'[ColorName],
                       "Sum_of_SalesAmount", 'FactOnlineSales'[Sum of SalesAmount]
                    ),
    [Sum_of_SalesAmount],
    0,
    'DimProduct'[ColorName],
    1
)

ORDER BY
[Sum_of_SalesAmount] DESC, 'DimProduct'[ColorName]

Now let’s filter it on load. Let’s look at filter method 1 (the one you see most often): https://github.com/Microsoft/PowerBI-JavaScript/wiki/Filters.

This JavaScript method uses the report’s ‘loaded’ event to detect that the report has loaded and then pushes the filter into the report:

const filter = {
        $schema: "http://powerbi.com/product/schema#basic",
        target: {
            table: "DimProduct",
            column: "ColorName"
        },
        operator: "In",
        values: ["Silver"]
    };

var config = {
              type: 'report',
              tokenType: models.TokenType.Embed,
              accessToken: accessToken,
              embedUrl: embedUrl,
              id: embedReportId,
              permissions: models.Permissions.All,
              settings: {
                            filterPaneEnabled: true,
                            navContentPaneEnabled: true
              }
};

// Embed the report and display it within the div container.
var report = powerbi.embed(reportContainer, config);

report.on('loaded', event => {
        report.getFilters()
            .then(filters => {
                filters.push(filter);
                return report.setFilters(filters);
            });
});

When we run this we do see the report is getting filtered:

Then I see two queries being sent:

EVALUATE
TOPN(
     1002,
     SUMMARIZECOLUMNS(
                      'DimProduct'[ColorName],
                      "Sum_of_SalesAmount", 'FactOnlineSales'[Sum of SalesAmount]
                    ),
                    [Sum_of_SalesAmount],
                    0,
                    'DimProduct'[ColorName],
                    1
)

ORDER BY
[Sum_of_SalesAmount] DESC, 'DimProduct'[ColorName]

And

DEFINE VAR __DS0FilterTable =
FILTER(KEEPFILTERS(VALUES('DimProduct'[ColorName])), 'DimProduct'[ColorName] = "Silver")

EVALUATE
TOPN(
1002,
SUMMARIZECOLUMNS(
                 'DimProduct'[ColorName],
                  __DS0FilterTable,
                  "Sum_of_SalesAmount", 'FactOnlineSales'[Sum of SalesAmount]
               ),
               [Sum_of_SalesAmount],
               0,
               'DimProduct'[ColorName],
               1
)

ORDER BY
[Sum_of_SalesAmount] DESC, 'DimProduct'[ColorName]

That is not what we want; we want to send only the last query. But this is to be expected: looking at the JavaScript again, we only push the new filter AFTER the ‘loaded’ event on the report has been triggered. This is great when you want to write code that pushes in filters dynamically from another app, but not when you want the page to load already filtered.
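
For that dynamic scenario, one thing to watch out for is that repeatedly pushing filters stacks them up in the report's filter list. A sketch with plain objects of one way to avoid that, replacing any existing filter on the same target before calling setFilters (the `mergeFilter` helper is my own, not part of the API):

```javascript
// Merge a new basic filter into the list returned by report.getFilters(),
// replacing any existing filter on the same table/column so that pushing
// the same filter twice doesn't accumulate duplicates.
function mergeFilter(existingFilters, newFilter) {
  const sameTarget = f =>
    f.target.table === newFilter.target.table &&
    f.target.column === newFilter.target.column;
  return existingFilters.filter(f => !sameTarget(f)).concat([newFilter]);
}

// Inside the 'loaded' handler you would then do:
// report.getFilters()
//   .then(filters => report.setFilters(mergeFilter(filters, filter)));
```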

Now there is another option that allows you to set a particular configuration on report load (and not AFTER). So I change my embed config to include the filters, as described here: https://github.com/Microsoft/PowerBI-JavaScript/wiki/Embed-Configuration-Details

const filter = {
        $schema: "http://powerbi.com/product/schema#basic",
        target: {
            table: "DimProduct",
            column: "ColorName"
        },
        operator: "In",
        values: ["Silver"]
    };

var config = {
              type: 'report',
              tokenType: models.TokenType.Embed,
              accessToken: accessToken,
              embedUrl: embedUrl,
              id: embedReportId,
              permissions: models.Permissions.All,
              filters: [filter],
              settings: {
                            filterPaneEnabled: true,
                            navContentPaneEnabled: true
              }
};

// Embed the report and display it within the div container.
var report = powerbi.embed(reportContainer, config);

Observe that the filters parameter is an array, so you can pass in multiple filters if you want.
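
For example, a small factory function makes it easy to build that array (the `basicFilter` helper is my own, and the second filter target below is made up for illustration):

```javascript
// Build a basic filter object in the shape the embed configuration expects.
function basicFilter(table, column, values) {
  return {
    $schema: "http://powerbi.com/product/schema#basic",
    target: { table: table, column: column },
    operator: "In",
    values: values
  };
}

// Multiple filters, all applied before the first queries are sent:
const filters = [
  basicFilter("DimProduct", "ColorName", ["Silver"]),
  basicFilter("DimDate", "CalendarYear", [2017]) // hypothetical second target
];

// var config = { type: 'report', /* token, url, id ... */ filters: filters };
```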

This now shows us the right report:

And sends only the one query:

DEFINE VAR __DS0FilterTable =
FILTER(KEEPFILTERS(VALUES('DimProduct'[ColorName])), 'DimProduct'[ColorName] = "Silver")

EVALUATE
TOPN(
     1002,
     SUMMARIZECOLUMNS(
                      'DimProduct'[ColorName],
                      __DS0FilterTable,
                      "Sum_of_SalesAmount", 'FactOnlineSales'[Sum of SalesAmount]
     ),
     [Sum_of_SalesAmount],
     0,
     'DimProduct'[ColorName],
1
)

ORDER BY
[Sum_of_SalesAmount] DESC, 'DimProduct'[ColorName]

That shows the two ways of setting filters in Power BI Embedded, one for each scenario. The one thing I want to mention here is that the filters are set in JavaScript and thus CANNOT be used as a security feature, because anyone can change them.

Kasper On BI

Enhance Your Call Center Operations with RapidMiner Server

Many past webinars have focused on the technical details of RapidMiner products—but last week’s webinar put into context what it takes to put models into production to deliver business outcomes using RapidMiner Studio and RapidMiner Server.

In a recent webinar, Derek Wilson, president and CEO of CDO Advisors LLC, looks at three different use cases to help you use RapidMiner to solve business problems related to call center optimization, including agent churn, customer churn, cross-selling, and ultimately how to put predictive models into production to lower costs and increase customer satisfaction.

Agent & Customer Churn—RapidMiner Studio to RapidMiner Server

Building an accurate predictive model is important—but the real goal is making the model outcome actionable. The outcome of this model will help you discover which attributes in agents/customers lead to high attrition. That information can then be translated into agent training and hiring that weeds out candidates with attrition attributes and retains the best agents, or into changes to the processes that lead to customer attrition, switching customers to services that fit them better.

First, input agent performance and call center stats plus HR data, or customer CRM and operations data. Output the model to your call center management team as a decision tree so that the results are comprehensible to those who are not necessarily data scientists.

[Figures: basic and advanced agent churn workflows]

Next, you pull in RapidMiner Server so that you can operationalize and focus in on key attributes over periods of time without having to sit there and manually run your model in Studio over and over. Simply deploy your RapidMiner Studio model into RapidMiner Server, run a query that goes directly against the database and when you automate it you can choose to drop information out and build reports on top of it.

You can schedule the model, save the results and provide reports to your management team. Incorporating Server takes your models to the next level where your business questions get answered with action. Above, see the differences between the agent churn workflow basic versus the advanced workflow after Server comes into play.

Cross-Selling Opportunities

Can you predict which customers are most likely to respond to a marketing campaign? All you need is information on the responsiveness of past customers to apply to an active campaign list. The input would be the same as for your customer churn model. If the output holds true (a prediction of which customers are most likely to accept a new product), you will be able to target customers for products and focus your budget on the customers most likely to respond.

[Figure: home loan cross-sell model]

Let’s build a financial cross-sell and output our files to the marketing and sales teams. RapidMiner Server saves the day again when it lets you create automations and build association rules after publishing your RapidMiner Studio models to it. Above, we are looking at the lift association rule built in RapidMiner Server for a model that concludes that if a customer has an auto loan, they should have a home loan. Information like this can help you create campaigns that target people within different subsets or parameters, which can help the marketing team make their campaigns as specific and strategic as possible.

You can publish the output of your model in SQL, wrap it with any reporting tool you’d like and share your premise, conclusion, confidences and lift to various teams within your company by feeding your SQL Server table back into your CRM engine.

Using RapidMiner Server and RapidMiner Studio will help you get more targeted results that benefit your whole company and build a foundation of trust between all teams.

Watch this on-demand webinar and see the full demonstration, master these business strategies and find out how RapidMiner can help you achieve your goals.

RapidMiner

Using CUSTOMDATA and SSAS with Power BI Embedded

Another Power BI Embedded blog post while I have a playground up and running. Often when you want to use SSAS with Power BI Embedded, you have an “authentication issue”. Power BI and SSAS both leverage AD, which means that any user you pass from Power BI to SSAS needs to be known in AD. That is fine in enterprise scenarios, where this usually is the case (and is even required), but for ISVs and any custom application this is a bigger problem. When embedding Power BI in a custom app, you generally want to use the custom app’s authentication, not AD. Traditionally, SSAS lets you solve this using the CUSTOMDATA and ROLES properties on the connection string: https://docs.microsoft.com/en-us/sql/analysis-services/instances/connection-string-properties-analysis-services. With Power BI you cannot access the connection string, so it is a bit harder. But there is a way. Let’s solve this issue with Power BI.
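
For reference, on a traditional client connection (outside Power BI) those two properties sit directly on the connection string; something like this, where the server, database, and values are hypothetical:

```text
Provider=MSOLAP;Data Source=myserver;Initial Catalog=Sales;Roles=roleA;CustomData=username@domain.com
```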

First, of course, I have an SSAS model; for convenience’s sake I added a simple measure CustomData = CUSTOMDATA() that will allow us to see what has been passed in.

I start with a Power BI Desktop file that points to the SSAS model and create a very simple report. You can see the CustomData measure doesn’t return anything, as it hasn’t been set on the connection string:

Next I publish the report to Power BI and set up a data gateway. Here I am using an AD user that is an admin on SSAS as the username and password (usually a service account):

In the Power BI service the report looks identical:

Now, by default, Power BI sets the EffectiveUserName property on the connection string, as we can see in the SQL Server Profiler output below on my Discover end (pro tip: always have Profiler up and running when testing like this, it is a real treasure trove):

[Profiler Discover PropertyList (XML tags lost in extraction), showing Catalog=SalesModel, Application=PowerBI, and EffectiveUserName=DOMAIN\USERNAME]

Now here comes the interesting part. In the gateway configuration we can actually configure Power BI to also send the CustomData property. This is hidden away in the Map user names pane:

As soon as you check this, you will see two things. First, the report now shows my email address as the result of the CUSTOMDATA function:

Second, in SQL Profiler you will see CustomData showing up:

[Profiler PropertyList, now also showing CustomData=username@domain.com alongside EffectiveUserName=DOMAIN\USERNAME]

Part 1 completed. Now let’s start the Power BI embedding part.

Again I used the PowerBIEmbedded_AppOwnsData sample to get up and running. By default, when you run it, it will act the same way as Power BI, and the service account credentials will be the ones passed into SSAS. As explained in this Power BI help document, you can change the sample to pass in any username you want. Here is the key: because we configured the gateway to use CustomData instead of EffectiveUserName, it will set the CUSTOMDATA part to whatever you specify. So in my sample project (HomeController.cs) I replaced:

var generateTokenRequestParameters = new GenerateTokenRequest(accessLevel: "view");
var tokenResponse = await client.Reports.GenerateTokenInGroupAsync(GroupId, report.Id, generateTokenRequestParameters);

with

var generateTokenRequestParameters = new GenerateTokenRequest("View", null, identities: new List<EffectiveIdentity> { new EffectiveIdentity(username: "username", roles: new List<string> { "roleA" }, datasets: new List<string> { report.DatasetId.ToString() }) });
var tokenResponse = await client.Reports.GenerateTokenInGroupAsync(GroupId, report.Id, generateTokenRequestParameters);

Now things get interesting on the SSAS side. We see something we might not expect as we are connecting to SSAS as administrator:

Either the user, ‘DOMAIN\USERNAME’, does not have access to the 'Sales' database, or the database does not exist.

The reason we are getting this is because we also specified the role. Going back to the connection string document in MSDN we can read the following on Roles:

Specify a comma-delimited list of predefined roles to connect to a server or database using permissions conveyed by that role. If this property is omitted, all roles are used, and the effective permissions are the combination of all roles. Setting the property to an empty value (for example, Roles=’ ‘) means the client connection has no role membership. An administrator using this property connects using the permissions conveyed by the role. Some commands might fail if the role does not provide sufficient permission.

So this means that, on this connection, the service account is no longer an admin and needs a role to connect through. So let’s add one. In SSMS, I add a role named “roleA”, as defined in the code above, with Read permissions:

Next, you have to make sure you add the service account as a member, and lastly you can add an RLS expression.

In my case I added something hard coded:

=DimProduct[ColorName] = IF(CUSTOMDATA() = "username", "Silver", BLANK())

but of course you can follow the complete RLS pattern here and just replace USERNAME() with CUSTOMDATA().

Running the report shows what we wanted to see: it only shows Silver colors and returns “username” from the CUSTOMDATA function:

Looking at the profiler trace we can again see it passes in the right CUSTOMDATA field:

[Profiler trace for the embedded report’s query, showing Roles=roleA, Application=PowerBI, and CustomData=username]

That’s it, folks! This opens up a lot of interesting scenarios for connecting to AS from Power BI. Unfortunately, this doesn’t work for Azure AS, as you cannot use the gateway to connect to it.

Kasper On BI

New eBook Alert: Know What You Don’t Know About Your Customers

In the age of eCommerce, there is a great need for companies to verify the identity of a given customer in real time. From credit card fraud to missed opportunities, the issues of incomplete, false or unreliable customer data can have a big effect on the bottom line of your business.

Our latest eBook, “Know What You Don’t Know About Your Customers,” looks at how you can reduce your risk while capitalizing on data-driven opportunities in a three-step strategy to real-time customer data verification.

Download the eBook now!

Syncsort + Trillium Software Blog

Cumulative Update #5 for SQL Server 2016 SP1

The 5th cumulative update release for SQL Server 2016 SP1 is now available for download at the Microsoft Downloads site. Please note that registration is no longer required to download Cumulative updates.

To learn more about the release or servicing model, please visit:

SQL Server Release Services

Foxconn Wants To Build New Factory In Nanjing

Foxconn signed an agreement with the municipal government of Nanjing, Jiangsu Province, planning to invest over CNY37.5 billion to build a new smartphone plant in the city.

Under this basic agreement, Foxconn’s new plant will not only produce smartphones but will also include an LCD TV factory and R&D center, a facility for semiconductor manufacturing equipment, and a logistics center. The municipal government of Nanjing will probably provide capital and land support for infrastructure construction, but totals have not yet been made public.

Foxconn confirmed this investment plan, but did not reveal any details. The company already has a smartphone factory in Nanjing, but it is small. The construction of this new plant indicates the company’s intention to expand its production system and gain more orders from the growing Chinese smartphone brands.

Foxconn decided in February to build a new panel factory on the scale of JPY1 trillion in Guangzhou. In addition, the company also plans to build an LCD panel factory in Wisconsin, U.S., with an investment of USD10 billion.

ChinaWirelessNews.com

Equifax says server first compromised on March 10

(Reuters) — Equifax Inc said on Wednesday that investigators had determined that an online dispute website at the heart of the theft of some 143 million consumer records was initially compromised by hackers on March 10, four months before the company noticed any suspicious activity.

It disclosed the findings after details of a report by cyber-security firm FireEye Inc that was sent to some Equifax customers were reported by the Wall Street Journal earlier on Wednesday.

The report, which was obtained by Reuters, described the techniques that the unknown attackers used to compromise Equifax, including exploitation of a vulnerability in software known as Apache Struts that was used to build the online dispute website.

It is not clear whether the March hackers were the same ones who later stole the vast cache of personal information. Equifax also said a previously reported incident in which some W-2 forms were compromised, also in March, was entirely unrelated.

The FireEye report said the firm was unable to determine who was behind the attack, and that it had never seen a hacking group employ the same tools, techniques and procedures as those used against Equifax.

A FireEye spokesman declined to comment on the report.

Equifax said in a statement to Reuters that a hacker “interacted with” the server on March 10, but that there was no evidence that the incident was related to the theft of sensitive consumer data that began in May.

The Wall Street Journal report said that hackers had roamed undetected inside Equifax’s network for four months before the massive breach was detected in July by the company’s security team. Equifax disputed that claim.

“There is no evidence that this probing or any other probing was related to the access to sensitive personal information” in the massive breach disclosed on Sept. 7, the company said in its statement.

Equifax shares have shed almost a third of their value since the disclosure of the breach. Critics have questioned why Equifax took so long to discover and disclose the breach.

One security expert who reviewed the FireEye report said that it was too soon to say whether the March 10 incident was related to the massive hack.

“They’ve had so much overlapping activity that it’s difficult to pick a single thread out of the noise,” said the expert, who was not authorized to discuss details of the confidential report.

Big Data – VentureBeat