
Tag Archives: Move

Move Your High-performance Computing Environment to the Cloud with TIBCO and Microsoft

February 5, 2021   TIBCO Spotfire

Reading Time: 2 minutes

We’ve got a New Year’s resolution for you: Begin modernizing your grid applications. Running on Microsoft Azure, the TIBCO GridServer® solution provides performance, speed, and analytics benefits without the time, effort, and cost associated with maintenance, software upgrades, facility overhead, and personnel.

Key Benefits of TIBCO GridServer on Microsoft

TIBCO GridServer is a market-leading infrastructure platform for grid and elastic computing—and the backbone of businesses operating in the world’s most demanding markets. By combining Microsoft Azure and GridServer software, your organization can burst workloads to the cloud when the demand arises and reap the following benefits:

  • Accelerated computation-intensive applications: GridServer on Microsoft enables mission-critical risk management, complex pricing, and other computation-intensive applications to execute quickly. With this pairing, computation-intensive workloads are processed in minutes or seconds rather than hours.
  • Elastic autoscaling, monitoring, and security: On Microsoft Azure, TIBCO GridServer infrastructure provides powerful, elastic grid computing and scalability. It can run millions of tasks using tens of millions of data points in parallel, allowing for better performance, quicker results, and more accurate analyses.
  • Improved market risk management systems: GridServer infrastructure also provides the power and speed required for risk calculations and risk-related tasks. Potential use cases include asset pricing, high-frequency trading, trader limit management, and risk reporting.

TIBCO GridServer and Microsoft Azure in Action

One European financial institution with hundreds of software engineers uses GridServer infrastructure extensively when making risk and trading decisions. The engineering team deploys 200,000 virtual cores in multiple grids with many versions of Windows and Linux, providing massive compute power for real-time quantitative analysis and recommendations. Using TIBCO GridServer infrastructure on Microsoft Azure, the bank runs millions of calculations in half the time at 60 percent of the cost.


Register Now to Learn More

If modernizing your high-performance computing (HPC) environment on the cloud is one of your 2021 resolutions, then joining TIBCO and Microsoft for an upcoming session on March 4th is a great way to start the year. Technology experts will demonstrate how banks and insurance companies can quickly execute mission-critical services such as risk management, complex pricing, and other computation-intensive applications to decrease time-to-results and increase performance levels. Register now!


The TIBCO Blog


Using Power Automate to Automatically Move Your Notes Attachments to SharePoint

January 29, 2021   Microsoft Dynamics CRM

If you are like a lot of D365 users, you have probably run into an issue with storage space one time or another. This has been a tricky problem to solve in the past, but with Microsoft’s introduction of Power Automate, users now have the ability to not only move attachments over to SharePoint more easily, but also to automate and customize this process. In this post, we’ll not only show you how to…



PowerObjects- Bringing Focus to Dynamics CRM


Using Power Automate to Automatically Move Your Email Attachments to SharePoint

January 28, 2021   Microsoft Dynamics CRM

If you are like a lot of D365 users, you have probably run into an issue with storage space one time or another. This has been a tricky problem to solve in the past, but with Microsoft’s introduction of Power Automate, users now have the ability to not only move attachments over to SharePoint more easily, but also to automate and customize this process. In this post, we’ll not only show you how to…



PowerObjects- Bringing Focus to Dynamics CRM


Teradata Extends Hybrid Multi-Cloud Capabilities as Customers Move More Workloads to the Cloud

December 21, 2020   BI News and Info

Teradata QueryGrid Expands Access to Data with Broad Ecosystem Connectivity Enabling Customers to Leverage Data as their Greatest Asset

Teradata (NYSE: TDC), the cloud data analytics platform company, today announced it is extending the hybrid multi-cloud capability of Vantage, enabling customers to access data and analytics across heterogeneous technologies and public cloud providers with new cloud-native capabilities. With the latest release of Teradata QueryGrid, Teradata customers can connect to a vast array of new data sources regardless of where the data resides – in the cloud, on multiple clouds, on-premises, or any combination thereof – for timely and cost-effective analytics.

As organizations increasingly move all or part of their data infrastructure to the cloud or multiple clouds, the number of different data sources and processing engines continues to increase, as does the demand for timely access to data. Companies need to be able to connect to, access and combine information from all data environments at the same time, and at scale, to enable truly effective analytics. This scalable, parallel access to data sources helps companies leverage data as their greatest asset, providing them the information they need to make the best business decisions possible.

“Our customers are the largest and most complex companies in the world, and their data infrastructure and processing environments reflect this complexity. When it comes to these enterprises, it’s not a one-size-fits-all approach to data analytics. They require the ability to query multiple data sources across any environment – in the cloud, multi-cloud, hybrid or on-premises – at the same time, without sacrificing speed or accuracy, and at the lowest possible cost,” said Hillary Ashton, Chief Product Officer at Teradata. “With these cloud-native updates to Vantage’s high-speed data fabric, we continue to give our customers the utmost choice and flexibility in how and when they choose to access their data, while also eliminating data movement and increasing performance – optimizing cost and delivering analytics that matter.”

In addition to providing customers with parallel access to data at scale, Teradata QueryGrid also gives companies the ability to dynamically access data in hybrid and multi-cloud environments and push processing down to where the data resides, thereby reducing data movement, decreasing costs, and shortening query times. This is especially useful when accessing data in cloud environments that charge egress fees for exporting data.

Starburst Enterprise Presto Integration

To extend the company’s reach to the most requested technologies, Teradata has partnered with Starburst Data to integrate a new Presto connector that allows Teradata Vantage users to leverage Starburst Enterprise Presto as part of QueryGrid. This integration with Starburst Enterprise Presto allows Vantage customers to connect to a vast array of new data sources that enable connectivity in a modern cloud architecture, including:

  • MongoDB,
  • Amazon Redshift,
  • Databricks Delta Lake,
  • Apache Cassandra,
  • Google BigQuery,
  • and numerous other data sources and processing engines.

“Starburst is on a mission to enable better and faster decisions by providing a single point of access to all data, no matter where it lives,” said Justin Borgman, Chairman and CEO, Starburst. “By combining Starburst Enterprise Presto with QueryGrid, we are able to offer our joint customers unprecedented reach to high-performance analytics across heterogeneous technologies and deployment platforms.”

Teradata Vantage with QueryGrid is certified on all the major cloud platforms including Google Cloud, AWS and Azure, and now has native integration with key cloud-native technologies such as Google Cloud Dataproc, Azure HDInsight, and Amazon EMR. 

For Teradata customers maintaining an on-premises presence or working with hybrid cloud systems, Teradata has also announced an upgrade to its IntelliFlex platform featuring the latest 2nd Gen Intel® Xeon® Scalable processors with 33% more cores than the previous generation. The latest release of the IntelliFlex platform provides customers with up to 30% performance improvement on typical data warehouse workloads and up to 40% on CPU-intensive workloads, as well as a 50% increase in memory per node.

Availability

The latest release of Teradata QueryGrid is available globally, today.

The latest update to Teradata IntelliFlex is available globally, today.


Teradata United States


Why AI can’t move forward without diversity, equity, and inclusion

November 12, 2020   Big Data


The need to pursue racial justice is more urgent than ever, especially in the technology industry. The far-reaching scope and power of machine learning (ML) and artificial intelligence (AI) mean that any gender and racial bias at the source is multiplied to the nth power in businesses and out in the world. The impact those technology biases have on society as a whole can’t be overstated.

When decision-makers in tech companies simply don’t reflect the diversity of the general population, it profoundly affects how AI/ML products are conceived, developed, and implemented. Evolve, presented by VentureBeat on December 8th, is a 90-minute event exploring bias, racism, and the lack of diversity across AI product development and management, and why these issues can’t be ignored.

“A lot has been happening in 2020, from working remotely to the Black Lives Matter movement, and that has made everybody realize that diversity, equity, and inclusion is much more important than ever,” says Huma Abidi, senior director of AI software products and engineering at Intel – and one of the speakers at Evolve. “Organizations are engaging in discussions around flexible working, social justice, equity, privilege, and the importance of DEI.”

Abidi, in the workforce for over two decades, has long grappled with the issue of gender diversity, and was often the only woman in the room at meetings. Even though the lack of women in tech remains an issue, companies have made an effort to address gender parity and have made some progress there.

In 2015, Intel allocated $300 million toward an initiative to increase diversity and inclusion in its ranks, from hiring to onboarding to retention. The company’s current goals are to increase the number of women in technical roles to 40% by 2030 and to double the number of women and underrepresented minorities in senior leadership.

“Diversity is not only the right thing to do, but it’s also better for business,” Abidi says. “Studies from researchers, including McKinsey, have shown data that makes it increasingly clear that companies with more diverse workforces perform better financially.”

The proliferation of cases in which alarming bias is showing up in AI products and solutions has also made it clear that DEI is a broader and more immediate issue than had previously been assumed.

“AI is pervasive in our daily lives, being used for everything from recruiting decisions to credit decisions, health care risk predictions to policing, and even judicial sentencing,” says Abidi. “If the data or the algorithms used in these cases have underlying biases, then the results could be disastrous, especially for those who are at the receiving end of the decision.”

We’re hearing about cases more and more often, beyond the famous Apple credit check fiasco and the fact that facial recognition still struggles with dark skin. There’s Amazon’s secret recruiting tool that avoided hiring qualified women because of the data set used to train the model: it suggested men were more qualified because, historically, that had been the case at the company.

An algorithm used by hospitals was shown to prioritize the care of healthier white patients over sicker Black patients who needed more attention. In Oakland, an AI-powered software piloted to predict areas of high crime turned out to be actually tracking areas with high minority populations, regardless of the crime rate.

“Despite great intentions to build technology that works for all and serves all, if the group that’s responsible for creating the technology itself is homogenous, then it will likely only work for that particular group,” Abidi says. “Companies need to understand that if your AI solution is not implemented in a responsible, ethical manner, the results can cause embarrassment at best, and potentially legal consequences at worst.”

This can be addressed with regulation and with the inclusion of AI ethics principles in research and development: responsible AI, fairness, accountability, transparency, and explainability, she says.

“DEI is well established — it makes business sense and it’s the right thing to do,” she says. “But if you don’t have it as a core value in your organization, that’s a huge problem. That needs to be addressed.”

Then, especially when it comes to AI, companies have to think about who their target population is, and whether the data is representative of that population. The people who first notice biases are users from the specific minority community that the algorithm is ignoring or targeting — therefore, maintaining a diverse AI team can help mitigate unwanted AI biases.

Companies also need to ask, she says, whether they have the right interdisciplinary team, including personnel such as AI ethicists and experts in ethics and compliance, law, policy, and corporate responsibility. Finally, you have to have a measurable, actionable de-biasing strategy that contains a portfolio of technical, operational, and organizational actions to establish a workplace where these metrics and processes are transparent.

“Add DEI to your core mission statement, and make it measurable and actionable — is your solution in line with the mission of ethics and DEI?” she says. “Because AI has the power to change the world, the potential to bring enormous benefit, to uplift humanity if done correctly. Having DEI is one of the key components to make it happen.”


The 90-minute Evolve event is divided into two distinct sessions on December 8th:

  1. The Why, How & What of DE&I in AI
  2. From ‘Say’ to ‘Do’: Unpacking real world case studies & how to overcome real world issues of achieving DE&I in AI

Register for free right here.


Big Data – VentureBeat


Deadline to move to the Dynamics 365 Unified Interface set to December 1, 2020

August 25, 2020   Microsoft Dynamics CRM

For many organizations, the web client interface for Dynamics 365 (commonly referred to as Classic UI or legacy web client) has been the primary way users have accessed and utilized Dynamics 365 modules like Sales, Service, and Marketing. By December 1, 2020, organizations must transition from this classic UI to the new Unified Interface experience. If you haven’t made the switch – don’t panic. We’re here to explain what the Unified Interface is and what you need to do before December 1.

What is the Unified Interface?

Back in 2018, Microsoft introduced a new version of Dynamics 365 aimed at streamlining the interface and making it easier for users to find and utilize the entities they need. They call this the Dynamics 365 Unified Interface. To be clear, this is not a new Dynamics 365 – it’s actually a new layer on top of the classic web application that still uses the same forms and data you are currently using. Another big benefit? The Unified Interface is designed to give users the same experience no matter what device they are using. This means Dynamics 365 will adapt to phones, tablets, and many browsers while providing the same optimal viewing experience.

Microsoft’s big push for the Unified Interface is custom apps. The idea is to create apps for each business role that show only the Dynamics 365 entities that role needs. The goal is to make it easier for users to navigate and complete tasks in Dynamics 365 without the unnecessary clutter of other entities. Microsoft has out-of-the-box apps for Sales and Customer Service, which are good starting points for learning how Unified Interface apps work and for thinking about what kinds of apps would be helpful for your users. Get more in-depth information about UI apps.

What’s happening on December 1, 2020?

The December 1st, 2020 deadline is not a deprecation date. The classic UI web client has been deprecated since September 2019, meaning it is still technically supported but no additional functionality is being introduced. After December 1st, the legacy web app will no longer be available. Organizations still on the legacy web app on this date will automatically be transitioned to the Unified Interface experience. This may cause issues for tenants that have extensive customizations or integrations with third-party software.

Microsoft says it will be sending reminders and scheduling updates for organizations that have not yet made the move to the Unified Interface. These notifications are most likely being sent to CRM admins, so check with your admin to see if they have received anything. Microsoft has helpful content and documentation related to the transition to the Unified Interface, including this comprehensive whitepaper covering getting started and design best practices.

End of Support for Dynamics 365 Add-in

The Dynamics 365 add-in for Outlook will be deprecated on October 1st, 2020. There is no cause for alarm here because Microsoft has already released a replacement that is arguably better. The Dynamics 365 App for Outlook allows Dynamics 365 users to perform tasks in Dynamics 365 without leaving the Outlook interface. Common uses include tracking emails, appointments, or meetings to Dynamics 365 records like leads, opportunities, or cases. Learn more about the Dynamics 365 App for Outlook and how to deploy it.

Get assistance with transitioning to the Dynamics 365 Unified Interface

As a certified Dynamics 365 Partner, Syvantis can help your organization transition to the Unified Interface while ensuring business processes and integrations remain intact. Schedule a call with us to get your questions answered and receive recommendations for a transition plan or system customization.


CRM Software Blog | Dynamics 365


Implementing A New CRM? Make Sure Your Data is Ready for the Move

July 14, 2020   CRM News and Info

If you’ve decided to move your CRM (Customer Relationship Management) solution to the Cloud, you might be tempted by all the space the Cloud contains; it’s infinite. But even though you can move vast amounts of data, should you?

Remember that someone has to move that data. And entities and fields in your legacy CRM must be mapped to the new system. That costs time and money. The wise move is to migrate the data you will actually need going forward.

We suggest reviewing your current data and determining what you should move to your new CRM solution.  Then you can explore what new data, not available with your legacy CRM, you will be able to capture in your new system.

Reviewing current data

Determining which data to forward to your new CRM involves consulting those persons or teams that were responsible for collecting that data in the first place. You don’t want to leave out information necessary for them to continue their work.

Now ask:

  • What kind of data have we been capturing? Identify any information that is routinely collected and saved but is no longer useful. If none of your teams knows why this information was collected or what value it provides, you might consider not moving it to your new CRM solution.
  • Are there data fields that will no longer be used?  For instance, if your business no longer uses fax transmissions for communications, you probably won’t need a data field for fax correspondence. If a data field hasn’t been used or updated in years, that’s a sign that your team has already moved on to more efficient methods.
  • Are we saving data in fields not designed for it? Has your team been repurposing legacy fields to store information that was never intended to reside there? If you find that a single field is storing multiple types of data used for various purposes, you can split those items into separate fields in your new CRM. Clean, concise data will be critical for making the most of your new solution’s data analytics and automation capabilities.

Identifying new data

After cleaning your existing data, ask yourself what business-critical data you haven’t been capturing in your legacy CRM. This process can improve your ability to create more targeted communications and provide more predictive analytics about your customers’ behavior, allowing you to serve them better.

To consider:

  • Customer communication mediums. Facebook and LinkedIn may not have existed when you implemented your legacy CRM, but now customers use them all the time. Use the information they share to identify their needs and interests.
  • Customer classification. You may want to capture new information about your customers in order to segment and target your communications. You may need specific fields to save, and be able to update, new types of data.
  • Industry changes. Industry regulations change, and you may need to know how they affect your customers and their likelihood of needing your products or services.
  • Analytics. If manual interaction is needed to produce the reports you need, perhaps the underlying data does not support the reporting requirement.


Planning for your migration

Now that you’ve identified the data you want to migrate and the fields you need, it’s time to plan your migration.

Here’s how to get started:

  • Identify which entities and tables should be migrated and document their properties such as data type, size, and any validation or special formatting.
  • Determine any data gaps and double-check that all entities and fields in your legacy system exist in your new system.
  • Determine whether you will migrate a complete set of historical data or a subset created or updated within the past few years.
  • Map legacy users by identifying all existing record owners or assigning a default user. If you have records owned by a former employee, you should decide to whom those records will be assigned going forward.
  • Consider deactivating workflows to avoid sending unintended notifications on record creation, overwriting data, or extending the time it takes to process the data.
  • Disable audit logs and autonumbering; once the migration is complete, these features can be re-enabled.

By taking a comprehensive approach to examining your data needs, you will lay the foundation to benefit from your new CRM while preventing costly and time-consuming data issues. Integration tools such as TIBCO Scribe and eOne SmartConnect provide adapters for many systems, including Microsoft Dynamics 365, that can aid in your data migration efforts. Finally, understand data dependencies to determine the order in which data should be migrated. For instance, contact records are typically linked to account records, so it is essential to load account records first; then, when migrating contacts, you can specify and populate the parent account field.
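The load-ordering rule above (accounts before contacts, parents before children) is just a topological sort over entity dependencies. A minimal sketch using Python's standard library follows; the entity names and dependency map are hypothetical illustrations, not part of any specific CRM API:

```python
from graphlib import TopologicalSorter  # Python 3.9+

# Hypothetical dependency map: each entity lists the entities whose
# records must already exist in the new CRM before it can be loaded.
dependencies = {
    "account": set(),                       # no parents; load first
    "contact": {"account"},                 # contacts reference a parent account
    "opportunity": {"account", "contact"},
    "case": {"account", "contact"},
}

# static_order() yields a load sequence that respects every dependency,
# so parent records always exist before the records that link to them.
migration_order = list(TopologicalSorter(dependencies).static_order())
print(migration_order)  # "account" appears before "contact", and so on
```

For a real migration the dependency map would be derived from the lookup fields documented during the entity-mapping step, but the ordering logic stays the same.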

Microsoft Gold-certified partner BroadPoint has more than 20 years of experience working with companies to implement Microsoft CRM solutions such as Dynamics 365. Our team of project management professionals, technology consultants, and business partners understands that effective data management and migration is integral to your organization’s new Cloud CRM implementation success. Contact us today to learn more.

By BroadPoint, www.broadpoint.net


CRM Software Blog | Dynamics 365


Google’s AI teaches robots how to move by watching dogs

April 3, 2020   Big Data

Google researchers developed an AI system that learns from the motions of animals to give robots greater agility, reveals a preprint paper and blog post published this week. The coauthors believe their approach could bolster the development of robots that can complete tasks in the real world, for instance transporting materials between multilevel warehouses and fulfillment centers.

The team’s framework takes a motion capture clip of an animal — a dog, in this case — and uses reinforcement learning, a training technique that spurs software agents to complete goals via rewards, to train a control policy. Providing the system with different reference motions enabled the researchers to “teach” a four-legged Unitree Laikago robot to perform a range of behaviors, they say, from fast walking (at a speed of up to 2.6 miles per hour) to hops and turns.

To validate their approach, the researchers first compiled a data set of real dogs performing various skills. (Training largely took place in a physics simulation so that the pose of the reference motions could be closely tracked.) Then, by using the different motions in the reward function (which describes how agents ought to behave), the researchers used about 200 million samples to train a simulated robot to imitate the motion skills.
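The reward function described above scores the agent by how closely its pose tracks the reference motion. A rough sketch of that idea follows; the exponential-of-negative-error form is a common choice in motion-imitation work, and the function name and scale value here are illustrative assumptions, not Google's actual implementation:

```python
import math

def imitation_reward(robot_pose, reference_pose, scale=2.0):
    """Score how closely the robot's joint angles track the reference clip.

    Returns 1.0 for a perfect match and decays toward 0 as the squared
    pose error grows, so the policy is rewarded for faithful imitation.
    """
    error = sum((r - ref) ** 2 for r, ref in zip(robot_pose, reference_pose))
    return math.exp(-scale * error)

# A perfect match yields the maximum reward of 1.0; tracking errors
# decay the reward smoothly rather than cutting it off.
print(imitation_reward([0.1, 0.2, 0.3], [0.1, 0.2, 0.3]))  # 1.0
print(imitation_reward([0.5, 0.2, 0.3], [0.1, 0.2, 0.3]))  # less than 1.0
```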


But simulators generally provide only a coarse approximation of the real world. To address this, the researchers employed an adaptation technique that randomized the dynamics in the simulation, for example varying physical quantities like the robot’s mass and friction. These values were mapped using an encoder to a numerical representation — i.e., an encoding — which was passed as an input to the robot control policy. When deploying the policy to a real robot, the researchers removed the encoder and searched directly for a set of variables that allowed the robot to successfully execute skills.
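The randomization step described above can be sketched as sampling a fresh set of physical parameters each training episode. The parameter names and ranges below are invented for illustration; the paper's actual randomized quantities and bounds may differ:

```python
import random

def randomize_dynamics(rng):
    """Sample new simulator dynamics parameters for one training episode.

    Varying quantities such as mass and friction forces the policy to
    cope with a range of physics, so it transfers better to a real robot
    whose true parameters the simulator only coarsely approximates.
    """
    return {
        "mass_kg": rng.uniform(20.0, 26.0),       # robot body mass
        "friction": rng.uniform(0.4, 1.25),       # ground contact friction
        "motor_strength": rng.uniform(0.8, 1.2),  # actuator torque scale
        "latency_s": rng.uniform(0.0, 0.04),      # control-loop latency
    }

rng = random.Random(0)  # seeded for reproducibility
params = randomize_dynamics(rng)
print(params)
```

These sampled values are what the encoder mentioned above would compress into the numerical representation fed to the control policy.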


The team says that they were able to adapt a policy to the real world using less than 8 minutes of real-world data across approximately 50 trials. Moreover, they demonstrated that the real-world robot learned to imitate various motions from a dog, including pacing and trotting, as well as artist-animated keyframe motions like a dynamic hop-turn.

“We show that by leveraging reference motion data, a single learning-based approach is able to automatically synthesize controllers for a diverse repertoire [of] behaviors for legged robots,” wrote the coauthors in the paper. “By incorporating sample efficient domain adaptation techniques into the training process, our system is able to learn adaptive policies in simulation that can then be quickly adapted for real-world deployment.”


The control policy wasn’t perfect — owing to algorithmic and hardware limitations, it couldn’t learn highly dynamic behaviors like large jumps and runs, and it wasn’t as stable as the best manually designed controllers. (In 5 episodes for a total of 15 trials per method, the real-world robot fell on average after 6 seconds while pacing; after 5 seconds while backward trotting; after 9 seconds while spinning; and after 10 seconds while hop-turning.) The researchers leave to future work improving the robustness of the controller and developing frameworks that can learn from other sources of motion data, such as video clips.


Big Data – VentureBeat


AI Weekly: Autonomous cars need better safety metrics to move the industry forward

January 10, 2020   Big Data

On Monday, Waymo — the subsidiary of Google parent company Alphabet that’s developing a full-stack driverless vehicle platform — announced that its cars have driven a combined 20 million autonomous miles to date, up from 10 million miles in October 2018. The metric signifies Waymo’s logistical and technological superiority, implied CEO John Krafcik, who equated the miles driven to 1,400 years of driving experience for an average American.

But some experts assert that measuring driverless systems’ progress by miles is a flawed approach.

This week in a conversation with VentureBeat at the 2020 Consumer Electronics Show (CES), Dmitry Polishchuk, the head of Russian tech giant Yandex’s autonomous car project, said that miles aren’t very meaningful without context to accompany them. “It’s tough to directly compare miles driven,” he said. “Obviously, the more miles [you] have, the better, but we believe that the environments that you’re in have a huge impact.”

Yandex isn’t without a horse in the race — its more than 100 autonomous cars in Innopolis and Skolkovo, Russia; Las Vegas; and Tel Aviv have driven 1.75 million miles as of January, up from 1.5 million and 1 million miles last December and October, respectively. But policymakers as well as competitors in the nearly $41.25 billion global autonomous car segment have expressed similar sentiments.

Noah Zych, head of system safety at Uber’s Advanced Technologies Group, told Wired in an interview that mileage critically omits details like situations encountered, obstacles, and accidents. “You need to know … ‘What was the objective of the testing in [any given area]?’” he said. “Was it to collect data? Was it to prove that the system was able to handle those scenarios? Or was it to just run a number up?”

And at a conference organized by Nvidia in Washington two years ago, Derek Kan, U.S. secretary for policy at the U.S. Department of Transportation, stressed the need for objective, agreed-upon measures of driverless system performance. Separately, David Friedman, former acting administrator of the National Highway Traffic Safety Administration (NHTSA) and vice president at Consumer Reports, recently urged Congress to direct the NHTSA to implement privacy protections, minimum performance standards, and accessibility rules for self-driving cars, trucks, SUVs, and crossovers.

Disengagements — or deactivations of cars’ autonomous modes when failures occur or when drivers are forced to take over — have been adopted by agencies including California’s Department of Motor Vehicles as an alternative to miles driven. (By law, companies actively testing self-driving cars on public roads in the state are required to publish disengagement reports.) But Polishchuk argues that this, too, is an imperfect metric.

“We have kind of been waiting for some sort of industry standard,” he said, noting that Yandex hasn’t yet released a disengagement report. “Self-driving companies aren’t following the exact same protocols for things. [For example, there might be a] disengagement because there’s something blocking the right lane or a car in the right lane, and [the safety driver realizes] as a human that [this object or car] isn’t going to move.”

For its part, whenever Yandex deploys new code into production, the company conducts real-world tests to ensure that system performance (and by extension, safety) isn’t degraded. It takes 10 cars — five equipped with the codebase from half a year ago and five with the latest code — and runs them for a day on the same route so that they encounter identical obstacles and weather conditions. It even switches up the safety drivers behind the wheel to prevent bias from influencing the results.

“We look back at the numbers and check the correlation … using hundreds of different parameters,” said Polishchuk. “The absolute number of disengagements doesn’t matter.”
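Yandex’s paired-fleet check can be pictured with a small sketch. Everything here is invented for illustration (the parameter names, the 10% threshold, the counts); the point is only that the candidate build is judged by relative change against the baseline fleet, not by absolute disengagement counts:

```python
# Hypothetical sketch of a paired-fleet comparison: two fleets run the
# same route on the same day, and the new build is judged on relative
# regression across many parameters rather than raw totals.

def compare_fleets(baseline_runs, candidate_runs, threshold=0.10):
    """Flag any parameter whose fleet average regressed by more than
    `threshold` (here 10%) relative to the baseline build."""
    regressions = {}
    for p in baseline_runs[0]:
        base = sum(run[p] for run in baseline_runs) / len(baseline_runs)
        cand = sum(run[p] for run in candidate_runs) / len(candidate_runs)
        if base > 0 and (cand - base) / base > threshold:
            regressions[p] = (base, cand)
    return regressions

# Five cars per build on an identical route; higher numbers are worse.
baseline = [{"disengagements": 2, "hard_brakes": 5} for _ in range(5)]
candidate = [{"disengagements": 2, "hard_brakes": 7} for _ in range(5)]

print(compare_fleets(baseline, candidate))  # {'hard_brakes': (5.0, 7.0)}
```

In this toy run, disengagements are flat while hard braking regressed 40%, so the candidate build would be held back — even though its absolute disengagement count is identical.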

Unfortunately for companies like Yandex, less regulatory guidance — not more — seems the likelier near-future path, at least in the U.S. At CES on Wednesday, Transportation Secretary Elaine Chao announced Automated Vehicles 4.0 (AV 4.0), new guidelines for self-driving cars that seek to promote “voluntary consensus standards” among autonomous vehicle developers. AV 4.0 requests but doesn’t mandate regular assessments of self-driving vehicle safety, and it permits those assessments to be completed by automakers themselves rather than by a standards body.

Advocacy groups including the Advocates for Highway and Auto Safety criticized the policy for its vagueness. “Without strong leadership and regulations … [autonomous vehicle] manufacturers can and will continue to introduce extremely complex supercomputers-on-wheels onto public roads … with meager government oversight,” said president Cathy Chase in a statement. “Voluntary guidelines are completely unenforceable, will not result in adequate performance standards, and fall well short of the safeguards that are necessary to protect the public.”

Indeed, regulation could go a long way to convincing a skeptical public.

Two studies — one published by the Brookings Institution and another by the Advocates for Highway and Auto Safety (AHAS) — found that a majority of Americans aren’t convinced of driverless cars’ safety. More than 60% of respondents to the Brookings poll said that they weren’t inclined to ride in self-driving cars, and almost 70% of those surveyed by the AHAS expressed concerns about sharing the road with them. Elsewhere, a study conducted by think tank HNTB found that 59% of people expect self-driving cars will be “no safer” than cars driven by humans.

In the U.S., federal legislation remains stalled. More than a year ago, the House unanimously passed the SELF DRIVE Act, which would create a regulatory framework for autonomous vehicles. But it has yet to be taken up by the Senate, which in 2018 tabled a separate bill, the AV START Act, that had made its way through committee in November 2017.

Polishchuk predicts that legislation will only emerge when some “reasonable amount” of self-driving cars hit public roads. Optimistic projections peg the number at 10 million by 2030. “When this happens, we would have statistics, and basically, statistics will push regulators,” he said.

For AI coverage, send news tips to Khari Johnson and Kyle Wiggers and AI editor Seth Colaner — and be sure to subscribe to the AI Weekly newsletter and bookmark our AI Channel.

Thanks for reading,

Kyle Wiggers

AI Staff Writer


Big Data – VentureBeat


To More Efficiently Process IoT Data, Move Your Wheelbarrow Closer to Where You Dig

November 8, 2019   TIBCO Spotfire

Some of the easiest, most powerful lessons we learn when we are young seem obvious—usually after we learn them.  My dad taught me one the first year he let me help him plant his garden when I was a kid. Early one spring, he gave me a shovel and a wheelbarrow and said, “Go dig a row of holes and dump all the dirt in the wheelbarrow.”

So, naturally, I placed the wheelbarrow at the far edge of the garden, put my shovel in the ground at the opposite end, walked the scooped dirt all the way over to the wheelbarrow, and then walked all the way back again to my row of holes.  

My dad watched me do this a few times (laughing under his breath I’m sure) until he finally said, “Hey Kevin, don’t you think you’d get the job done faster if you moved your wheelbarrow closer to your holes?”  Brilliant.

Lost opportunities due to latency in IoT processing 

Believe it or not, this simple lesson can also be applied to the processing of data from your IoT devices. TIBCO—an innovator in microservices tooling—had the opportunity to present this lesson at this year’s Microservices World event as part of a breakout session.

The modern version of the lesson goes like this: Typically, IoT devices are connected directly to an edge device within a local network. The IoT devices send data they collect through the edge devices and up to an IoT platform hosted within a private or public cloud, such as TIBCO Cloud.  This platform processes the data—collecting, filtering, and analyzing so that appropriate action is taken. So in network terms, the IoT devices are all close to each other. But, like my wheelbarrow and the garden holes I had dug, this topology puts quite a bit of distance between IoT devices and the IoT platform.  

The effect is a high amount of latency between the time data is collected by IoT devices and the time it is acted upon—which can translate into lost opportunities if data isn’t acted upon when it is of highest value.  
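The cost of that distance can be made concrete with a toy model. Nothing here is TIBCO-specific — the latencies and the value-decay half-life are assumed numbers — but it shows why data acted on late is worth less:

```python
# Toy illustration: the value of a sensor reading often decays with time,
# so every network hop taken before a decision erodes that value.

EDGE_LATENCY_MS = 5      # assumed: rule evaluated on the edge device
CLOUD_LATENCY_MS = 250   # assumed: round trip to a cloud IoT platform

def value_after(latency_ms, peak_value=100.0, half_life_ms=100.0):
    """Value of acting on a reading, halving every `half_life_ms`."""
    return peak_value * 0.5 ** (latency_ms / half_life_ms)

print(f"edge:  {value_after(EDGE_LATENCY_MS):.1f}")   # ~96.6
print(f"cloud: {value_after(CLOUD_LATENCY_MS):.1f}")  # ~17.7
```

With these assumed numbers, acting at the edge preserves almost all of the reading’s value, while the cloud round trip forfeits most of it — the “lost opportunity” described above.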

How to decrease latency in IoT processing

It would seem to make sense then, as my dad would suggest, to move the processing of your IoT data closer to your IoT devices—such as onto your edge devices. However, edge devices have limited compute resources (e.g., memory), which in turn limits the size (and functionality) of applications that can be deployed to them when written in traditional languages such as Java. So instead of a wheelbarrow, imagine my dad had given me a teacup.

That’s where the open source Project Flogo can help. It enables developers to visually create event-driven microservices and functions, and, most importantly, it builds application executables up to 50x smaller than those written in Java—perfectly scaled down for (teacup-sized) edge devices and new types of serverless environments such as Functions-as-a-Service (FaaS). Although small, these applications can embed automated decision making—with contextual rules and/or machine learning—right on the edge device. With Project Flogo, you can process your IoT data more quickly and cost-effectively on edge devices (much closer to the source) than in the cloud. You can see how through a series of short videos.
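Flogo apps are typically built visually or in Go, so the following is only a language-neutral sketch of the pattern such an edge application embeds: evaluate a contextual rule next to the device, act immediately, and forward only what the cloud actually needs. All names and thresholds here are hypothetical:

```python
# Sketch of an edge rule: decide locally, act at once, and send only
# anomalous readings upstream. Device names and thresholds are invented.

def edge_handler(reading, threshold=80.0):
    """Evaluate one sensor reading; return (action, forward_to_cloud)."""
    if reading["temp_c"] > threshold:
        return "shutdown_valve", True   # act now, then tell the cloud
    return "none", False                # routine data stays local

routine = {"device": "pump-7", "temp_c": 54.2}
anomaly = {"device": "pump-7", "temp_c": 91.8}

print(edge_handler(routine))  # ('none', False)
print(edge_handler(anomaly))  # ('shutdown_valve', True)
```

The design point is the same as the wheelbarrow lesson: the decision happens where the data is produced, and the cloud sees only the traffic that justifies the trip.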

TIBCO has commercialized Project Flogo within TIBCO Cloud Integration, our market-leading enterprise Integration Platform-as-a-Service (iPaaS).  TIBCO Cloud Integration not only includes commercial technical support and a wide variety of enterprise connectors, it also includes many productivity features that simplify and accelerate event-driven application development.  You can try TIBCO Cloud Integration for a limited time to create applications in a development environment within TIBCO Cloud at no charge.

When you have the right tools, and you move them close to where work is done, you can respond with speed to valuable digital moments.  And you can plant gardens with your dad too!


The TIBCO Blog
