Tag Archive | "Data"

Michael Dell Predicts in 10 Years More Computed Data on the Edge Than Cloud

“The surprise outcome ten years from now is there’ll be something much bigger than the private cloud and the public cloud,” says Dell Technologies CEO Michael Dell. “It’s the edge. I actually think there will be way more computed data on the edge in ten years than any of the derivatives of cloud that we want to talk about. That’s the ten-year prediction.”

Michael Dell, Chairman and CEO of Dell Technologies, discusses how the company has become a critical technology platform for its customers in an interview with theCUBE at Dell Technologies World 2019 in Las Vegas:

Data Has Always Been at the Center of How the Technology Industry Works

We feel great. Our business has really grown tremendously. All the things we’ve been doing have been resonating with customers. We’ve been able to restore the origins of the entrepreneurial dream and success of the company and reintroduce innovation and risk-taking into a now $91 billion company that grew at double digits last year. Certainly, with the set of capabilities that we’ve been able to build organically and inorganically, the set of alliances we have, and the trust that customers have given us, we are super happy about the position that we’re in and the opportunities going forward. I think all this is really just a pregame show to what’s ahead for our industry and for the role that technology is going to play in the world.

Data has always been at the center of how the technology industry works. Now we just have a tsunami, an explosion of data. Of course, now we have this new computer science that allows us to reason over the data in real time and create much better results and outcomes. That, combined with the computing power, means all organizations have to reimagine themselves given these technologies. Certainly, with the infrastructure requirements in terms of the network, the storage, the compute, the build-out on the edge, and tons of new requirements, we’re super well-positioned to go address all of that.

Predicts in 10 Years More Computed Data on the Edge Than Cloud

The surprise outcome ten years from now is there’ll be something much bigger than the private cloud and the public cloud. It’s the edge. I actually think there will be way more computed data on the edge in ten years than any of the derivatives of cloud that we want to talk about. That’s the ten-year prediction. That’s what I see. Maybe nobody’s predicting that just yet, but let’s come back in ten years and see what it looks like.

Really what we’re doing is we’re bringing to customers all the resources they need to operate in the hybrid multi-cloud world. First, you have to recognize that the workloads want to move around. To say that they’re all going to be here or there is in some sense missing the point because they’re going to move back and forth. You’ve got regulation, cost, security, performance, latency, all sorts of new requirements that are coming at you and they’re not going to just sit in one place.

This is All Super Important As We Enter This AI Enabled Age

Now with VMware Cloud Foundation, we have the ability to move these workloads seamlessly across essentially all the public clouds. We have 4,200 partners out there, with on-premise infrastructure built and tuned specifically for the VMware platform and extended also to the edge. All of this together is the Dell Technologies Cloud. We have obviously great capabilities from our Dell EMC infrastructure solutions and all the great innovations at VMware coming together.

Inside the business, the first priority was to get each of the individual pieces working well. But then we saw that the real opportunity was in the seams and how we could more deeply integrate all the aspects of what we’re doing together. You saw that on stage in vivid form yesterday with Pat and Jeff and Satya, and even more today. Of course, there’s more to do. There’s always more to do. We’re working on how we build a data platform bringing together all of our capabilities with Boomi and Data Protection and VMware. This is all going to be super important as we enter this AI-enabled age of the future.

We’ve Created an Incredible Business

I think investors are increasingly understanding that we’ve created an incredible business here. Certainly, as we look at the additional coverage that we have and as analysts are understanding the business, some of them are starting to say, “Hey, this doesn’t really feel like a conglomerate.” That’s a direct quote. If you think about what we demonstrated today and yesterday and will demonstrate in the future, we’re not like Berkshire Hathaway. This is not a railroad that owns a chain of restaurants. This is one integrated business that fits together incredibly well, and it’s generating substantial cash flows.

I think investors over time are figuring out the value that’s intrinsic to the overall Dell Technologies family. We’ve got lots of ways to invest: VMware, SecureWorks, Pivotal, and of course the overall Dell Technologies.


The post Michael Dell Predicts in 10 Years More Computed Data on the Edge Than Cloud appeared first on WebProNews.



Google Ads store visits, store sales reporting data partially corrected

Google says it is making progress, but there are still days for which reporting is inaccurate.



Please visit Search Engine Land for the full article.




Huge Volume of IoT Data Managed via AI Creates Real Value, Says Oracle VP

“What’s interesting is that IoT has been around for a long time but as companies start to enable it and start to leverage it more and more there’s just huge volumes of data that have to be managed and be able to analyze and be able to execute from,” says John Barcus, Vice President Manufacturing Industries at Oracle. “One of the technologies that is really exciting is this whole concept of AI. It really allows you to use that information and correlate it with a lot of different pieces of information.”

John Barcus, Vice President Manufacturing Industries at Oracle, discusses how technologies such as AI and blockchain are now helping companies manage huge volumes of IoT data in an interview with technology influencer Ronald van Loon:

Companies Are Moving Toward Selling Products as a Service

I think that (manufacturers connecting all the processes digitally) is the way that will differentiate them. It’s really the only way the companies will be able to survive into the future. There are all these business models and it has become significantly more competitive than it has been in the past. Companies have to work faster and they have to be more responsive to what their customer needs are. The only way really of doing that is to connect the various aspects of the business. They can’t work in silos anymore. That really will give you the whole value of the business.

One area that companies are moving away from is selling products. They’re going into selling more services, which we’ve actually seen for some time. But what they’re now getting into is these new models where they might be selling products as a service. If you think about how you sell a product as a service and how you support that, it’s a lot different than it was before. Connecting to that product and being able to anticipate activities, needs, and failures, to monitor how it’s performing and how customers use it, and to expand on that to provide a better outcome for the customer are all important components.

Huge Volume of IoT Data Managed via AI Creates Real Value

What’s interesting is that IoT has been around for a long time, but as companies start to enable it and leverage it more and more, there are just huge volumes of data that have to be managed, analyzed, and executed on. One of the technologies that is really exciting is this whole concept of AI. It really allows you to use that information and correlate it with a lot of different pieces of information. You can correlate it with the data that might be in your ERP, your MES, and other sources of information and actually provide some real value and real outcomes. It can now do predictions where it would be physically impossible for people to do the same types of calculations they’ve been doing in the past with this huge volume of data.

The second area, where there seems to be a little hesitation at the moment, is blockchain. But the technology is there and people have been trying to identify how best to use it. Some of the use cases that are coming out now are going to be quite impressive. I think the little bit of a lull was deserved. People who looked at it anticipated a little bit more than what was possible, and now they’re really starting to develop some good use cases. I think there are a lot of opportunities in that area.


The post Huge Volume of IoT Data Managed via AI Creates Real Value, Says Oracle VP appeared first on WebProNews.



All Google Ads attribution reports will soon include cross-device conversion data

Make note, the change takes effect May 1.



Please visit Search Engine Land for the full article.





LinkedIn taps Bing search data for interest targeting

Advertisers can reach audiences based on business-oriented content they’ve engaged with on Bing.



Please visit Search Engine Land for the full article.




How Palo Alto Networks Blocks 30,000 New Pieces of Malware Daily Via AI, Machine Learning, and Big Data

“The platform we have uses big data analytics and machine learning in the cloud to process and find all of the unknown malware, make it known and be able to block it,” says Scott Stevens, SVP, Global Systems Engineering at Palo Alto Networks. “We find 20-30 thousand brand new pieces of malware every day. We’re analyzing millions and millions of files every day to figure out which ones are malicious. Once we know, within five minutes we’re updating the security posture for all of our connected security devices globally.”

Scott Stevens, SVP, Global Systems Engineering at Palo Alto Networks, discusses how the company uses AI, machine learning, and big data to find and block malware for its customers in an interview with Jeff Frick of theCUBE, which is covering RSA Conference 2019 in San Francisco:

We Find 20-30 Thousand New Pieces of Malware Every Day

There are two ways to think about artificial intelligence, machine learning, and big data analytics. The first is if we’re looking at how are we dealing with malware and finding unknown malware and blocking it, we’ve been doing that for years. The platform we have uses big data analytics and machine learning in the cloud to process and find all of the unknown malware, make it known and be able to block it.

We find 20-30 thousand brand new pieces of malware every day. We’re analyzing millions and millions of files every day to figure out which ones are malicious. Once we know, within five minutes we’re updating the security posture for all of our connected security devices globally.

Whether it’s endpoint software or our inline next-gen firewalls, we’re updating all of our signatures so that the unknown is now known and the known can be blocked. That applies whether we’re watching to block the malware coming in or the command-and-control it’s using via DNS and URL to communicate and start whatever it’s going to do. You mentioned crypto lockers, and there are all kinds of things that can happen. That’s one vector of using AI and ML to prevent these attacks from succeeding.

Machine Learning Uses Data Lake to Discover Malware

The other side of it is how we then take some of the knowledge and the lessons we’ve learned from what we’ve been doing now for many years in discovering malware and apply that same AI and ML locally for that customer, so that they can detect very creative and evasive attacks, or that insider threat, the employee who’s behaving inappropriately but quietly.

We’ve announced over the last week what we call the Cortex XDR set of offerings. That involves allowing the customer to build an aggregated data lake, which uses the Zero Trust framework that tells us how to segment and also puts sensors in all the places of the network. This includes both network and endpoint sensors, as we look at securing the endpoint as well as the network links. Using those together, we’re able to stitch those logs together in a data lake that machine learning can now be applied to on a customer-by-customer basis.

Maybe somebody was able to evade because they’re very creative, or again it’s that insider threat who isn’t breaking security rules but is being evasive. We can now find them through machine learning. The cool thing about Zero Trust is that the prevention architecture we needed for Zero Trust becomes the sensor architecture for this machine learning engine. You get dual-purpose use out of the architecture of Zero Trust to solve both the inline prevention and the response architecture that you need.


>> Read a companion piece to this article here:

Zero Trust Focuses On the Data That’s Key to Your Business

The post How Palo Alto Networks Blocks 30,000 New Pieces of Malware Daily Via AI, Machine Learning, and Big Data appeared first on WebProNews.



Zero Trust Focuses On the Data That’s Key to Your Business

“The fundamental way you look at Zero Trust is it’s an architectural approach to how do you secure your network focused on what’s most important,” says Scott Stevens, SVP, Global Systems Engineering at Palo Alto Networks. “You focus on the data that’s key to your business. You build your security framework from the data out.”

Scott Stevens, SVP, Global Systems Engineering at Palo Alto Networks, discusses Zero Trust in an interview with Jeff Frick of theCUBE, which is covering RSA Conference 2019 in San Francisco:

Zero Trust Focuses On the Data That’s Key to Your Business

We’ve been working with Forrester for about six years now looking at Zero Trust architecture. The fundamental way you look at Zero Trust is that it’s an architectural approach to how you secure your network, focused on what’s most important. You focus on the data that’s key to your business. You build your security framework from the data out. There are all kinds of buzzword bingo we can play about what Zero Trust means, but what it allows us to do is create the right segmentation strategy, starting in the data center or the cloud and moving back towards those accessing the data, and decide how you segment and control that traffic.

Fundamentally, what we’re dealing with in security are two big problems. First are credential-based attacks. Do we have somebody with stolen credentials in the network stealing our data? Or do we have an insider who has credentials but is malicious and is actually stealing content from the company? The second big problem is software-based attacks: malware, exploits, scripts. How do we segment the network so we can enforce user behavior and watch for malicious software, so we can prevent both of those occurrences through one architectural framework? I think Zero Trust gives us that template, that building block, for how we build out those networks, because everybody’s enterprise network is a little bit different.

You Need To Start With What’s Most Important

We have to build those things together. On the Palo Alto Networks side what we do is Layer 7 enforcement based on identity. Based on who the user is and what their rights are we are able to control what they are allowed access to or what they’re not allowed access to. Of course, if you’ve got a malicious insider or somebody that’s logged in with stolen credentials we can prevent them from doing what they’re not allowed to do. Working here with Forescout, we’ve done a lot of really good integration with them on that identity mapping construct. They help us understand all the identities and all the devices in the network so we can then map that to that user posture and control at Layer 7 what they’re allowed to do or not allowed to do.

You need to start with what’s most important. Clouds and data centers as a starting point are generally the same. How we segment is actually the same. Sometimes we think that clouds are more difficult to secure than data centers, but they are basically the same. We’ve got north-south traffic, we have east-west traffic. How do we inspect and how do we segment that? How do we focus on what’s the most important, critical data to their business? If we stratify their data sets and the applications that access that data and then move down, we may have 50 percent of the applications in their cloud or data center that we don’t micro-segment at all because they’re not critical to the business. They’re useful to the employees, but if something goes wrong, it’s no big deal and there’s no impact to the business.

Micro-segmentation isn’t just a conversation about where we have to do things; it’s a conversation, contextually, about what’s relevant, where it’s important to do that, and where you can do a much less robust job. You always have to have inspection and visibility, but there are parts of your network where you’re going to be somewhat passive about it and parts of your network where you are going to be very aggressive. That includes multi-factor authentication, tight user identity mapping, how we watch for malware, how we watch for exploits, all of the different aspects.


>> Read a companion piece to this article here:

How Palo Alto Networks Blocks 30,000 New Pieces of Malware Daily Via AI, Machine Learning, and Big Data

The post Zero Trust Focuses On the Data That’s Key to Your Business appeared first on WebProNews.



Make sense of your data with these essential keyword segments

Posted by TheMozTeam

This blog post was originally published on the STAT blog.


The first step to getting the most out of your SERP data is smart keyword segmentation — it surfaces targeted insights that will help you make data-driven decisions.

But knowing what to segment can feel daunting, especially when you’re working with thousands of keywords. That’s why we’re arming you with a handful of must-have tags.

Follow along as we walk through the different kinds of segments in STAT, how to create them, and which tags you’ll want to get started with. You’ll be a fanciful segment connoisseur by the time we’re through!

Segmentation in STAT

In STAT, keyword segments are called “tags” and come as two different types: standard or dynamic.

Standard tags are best used when you want to keep specific keywords grouped together because of shared characteristics — like term (brand, product type, etc.), location, or device. Standard tags are static, so the keywords that populate those segments won’t change unless you manually add or remove them.

Dynamic tags, on the other hand, are a fancier kind of tag based on filter criteria. Just like a smart playlist, dynamic tags automatically populate with all of the keywords that meet said criteria, such as keywords with a search volume over 500 that rank on page one. This means that the keywords in a dynamic tag aren’t forever — they’ll filter in and out depending on the criteria you’ve set.
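If it helps to picture the difference, here is a minimal TypeScript sketch of the idea (not STAT’s actual data model or API; the field names are illustrative): a standard tag is a fixed list you curate by hand, while a dynamic tag is a saved predicate that gets re-evaluated whenever your data refreshes.

```typescript
// Illustrative sketch only, not STAT's internal model or API.
interface Keyword {
  term: string;
  searchVolume: number;
  rank: number; // current rank; 1-10 is page one
}

// A standard tag is just a fixed list of keywords you curate by hand.
const standardTag: Keyword[] = [];

// A dynamic tag is a saved filter that keywords move in and out of.
type DynamicTag = (kw: Keyword) => boolean;

// e.g. "search volume over 500 that rank on page one"
const highVolumePageOne: DynamicTag = (kw) =>
  kw.searchVolume > 500 && kw.rank <= 10;

// Re-run whenever rankings refresh, so membership stays current.
const membersOf = (tag: DynamicTag, keywords: Keyword[]) => keywords.filter(tag);
```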

How to create a keyword segment

Tags are created in a few easy steps. At the Site level, pop over to the Keywords tab, click the down arrow on any table column header, and then select Filter keywords. From there, you can select the pre-populated options or enter your own metrics for a choose-your-own-filter adventure.

Once your filters are in place, simply click Tag All Filtered Keywords, enter a new tag name, and then pick the tag type best suited to your needs — standard or dynamic — and voila! You’ve created your very own segment.

Segments to get you started

Now that you know how to set up a tag, it’s time to explore some of the different segments you can implement and the filter criteria you’ll need to apply.

Rank and rank movement

Tracking your rank and ranking movements with dynamic tags will give you eyeballs on your keyword performance, making it easy to monitor and report on current and historical trends.

There’s a boatload of rank segments you can set up, but here’s just a sampling to get you started:

  • Keywords ranking in position 1–3; this will identify your top performing keywords.
  • Keywords ranking in position 11–15; this will suss out the low-hanging, top of page two fruit in need of a little nudge.
  • Keywords with a rank change of 10 or more (in either direction); this will show you keywords that are slipping off or shooting up the SERP.
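Expressed as filters, those three segments might look something like the sketch below (again, purely illustrative field names rather than STAT’s filter syntax).

```typescript
// The three rank segments above as simple predicates; `rank` and
// `previousRank` are illustrative fields, not STAT's filter syntax.
interface RankedKeyword {
  term: string;
  rank: number;         // current rank
  previousRank: number; // rank at the last refresh
}

const topPerformers = (kw: RankedKeyword) => kw.rank >= 1 && kw.rank <= 3;

const pageTwoOpportunities = (kw: RankedKeyword) => kw.rank >= 11 && kw.rank <= 15;

const bigMovers = (kw: RankedKeyword) =>
  Math.abs(kw.rank - kw.previousRank) >= 10; // slipping off or shooting up
```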

Appearance and ownership of SERP features

Whether they’re images, carousels, or news results, SERP features have significantly altered the search landscape. Sometimes they push you down the page and other times, like when you manage to snag one, they can give you a serious leg up on the competition and drive loads more traffic to your site.

Whatever industry-related SERP features that you want to keep apprised of, you can create dynamic tags that show you the prevalence and movement of them within your keyword set. Segment even further for tags that show which keywords own those features and which have fallen short.

Below are a few segments you can set up for featured snippets and local packs.

Featured snippets

Everyone’s favourite SERP feature isn’t going anywhere anytime soon, so it wouldn’t be a bad idea to outfit yourself with a snippet tracking strategy. You can create as many tags as there are snippet options to choose from:

  • Keywords with a featured snippet.
  • Keywords with a paragraph, list, table, and/or carousel snippet.
  • Keywords with an owned paragraph, list, table, and/or carousel snippet.
  • Keywords with an unowned paragraph, list, table, and/or carousel snippet.

The first two will allow you to see over-arching snippet trends, while the last two will chart your ownership progress.

If you want to know the URL that’s won you a snippet, just take a peek at the URL column.

Local packs

If you’re a brick-and-mortar business, we highly advise creating tags for local packs since they provide a huge opportunity for exposure. These two tags will show you which local packs you have a presence in and which you need to work on:

  • Keywords with an owned local pack.
  • Keywords with an unowned local pack.

Want all the juicy data squeezed into a local pack, like who’s showing up and with what URL? We created the Local pack report just for that.

Landing pages, subdomains, and other important URLs

Whether you’re adding new content or implementing link-building strategies around subdomains and landing pages, dynamic tags allow you to track and measure page performance, see whether your searchers are ending up on the pages you want, and match increases in page traffic with specific keywords.

For example, are your informational intent keywords driving traffic to your product pages instead of your blog? To check, a tag that includes your blog URL will pull in each post that ranks for one of your keywords.

Try these three dynamic tags for starters:

  • Keywords ranking for a landing page URL.
  • Keywords ranking for a subdomain URL.
  • Keywords ranking for a blog URL.

Is a page not indexed yet? That’s okay. You can still create a dynamic tag for its URL and keywords will start appearing in that segment when Google finally gets to it.

Location, location, location

Google cares a lot about location and so should you, which is why keyword segments centred around location are essential. You can tag in two ways: by geo-modifier and by geo-location.

For these, it’s better to go with the standard tag as the search term and location are fixed to the keyword.

Geo-modifier

A geo-modifier is the geographical qualifier that searchers manually include in their query — like in [sushi near me]. We advocate for adding various geo-modifiers to your keywords and then incorporating them into your tagging strategy. For instance, you can segment by:

  • Keywords with “in [city]” in them.
  • Keywords with “near me” in them.

The former will show you how you fare for city-wide searches, while the latter will let you see if you’re meeting the needs of searchers looking for nearby options.
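As a rough sketch of how you might bucket keywords by geo-modifier before creating the tags (the city list is hypothetical; substitute the markets you actually serve):

```typescript
// Hypothetical helper for bucketing keywords by geo-modifier.
const cities = ["edmonton", "calgary", "vancouver"]; // your markets here

function geoModifierTag(term: string): "near me" | "in [city]" | null {
  const t = term.toLowerCase();
  if (t.includes("near me")) return "near me";
  if (cities.some((city) => t.includes(`in ${city}`))) return "in [city]";
  return null; // no geo-modifier, so leave it untagged
}

geoModifierTag("sushi near me");     // "near me"
geoModifierTag("sushi in edmonton"); // "in [city]"
```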


Geo-location

Geo-location is where the keyword is being tracked. More tracked locations mean more searchers’ SERPs to sample. And the closer you can get to searchers standing on a street corner, the more accurate those SERPs will be. This is why we strongly recommend you track in multiple pin-point locations in every market you serve.

Once you’ve got your tracking strategy in place, get your segmentation on. You can filter and tag by:

  • Keywords tracked in specific locations; this will let you keep tabs on geographical trends.
  • Keywords tracked in each market; this will allow for market-level research.

Search volume & cost-per-click

Search volume might be a contentious metric thanks to Google’s close variants, but having a decent idea of what it’s up to is better than a complete shot in the dark. We suggest at least two dynamic segments around search volume:

  • Keywords with high search volume; this will show which queries are popular in your industry and have the potential to drive the most traffic.
  • Keywords with low search volume; this can actually help reveal conversion opportunities — remember, long-tail keywords typically have lower search volumes but higher conversion rates.

Tracking the cost-per-click of your keywords will also bring you and your PPC team tonnes of valuable insights — you’ll know if you’re holding the top organic spot for an outrageously high CPC keyword.

As with search volume, tags for high and low CPC should do you just fine. High CPC keywords will show you where the competition is the fiercest, while low CPC keywords will surface your easiest point of entry into the paid game — queries you can optimize for with less of a fight.

Device type

From screen size to indexing, desktop and smartphones produce substantially different SERPs from one another, making it essential to track them separately. So, filter and tag for:

  • Keywords tracked on a desktop.
  • Keywords tracked on a smartphone.

Similar to your location segments, it’s best to use the standard tag here.

Go crazy with multiple filters

We’ve shown you some really high-level segments, but you can actually filter down your keywords even further. In other words, you can get extra fancy and add multiple filters to a single tag. Go as far as high search volume, branded keywords triggering paragraph featured snippets that you own for smartphone searchers in the downtown core. Phew!
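Stacked up as a single predicate, that extra-fancy tag might look roughly like this (field names and the brand term are placeholders, not STAT’s filter syntax):

```typescript
// Illustrative only: several filter criteria combined into one dynamic tag.
interface SerpKeyword {
  term: string;
  searchVolume: number;
  device: "desktop" | "smartphone";
  location: string;              // tracked pin-point location
  ownsParagraphSnippet: boolean; // do we own the paragraph featured snippet?
}

const brandTerms = ["acme"]; // hypothetical brand

const fancyTag = (kw: SerpKeyword) =>
  kw.searchVolume >= 1000 &&
  brandTerms.some((b) => kw.term.toLowerCase().includes(b)) &&
  kw.ownsParagraphSnippet &&
  kw.device === "smartphone" &&
  kw.location === "downtown core"; // placeholder for one of your tracked locations
```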

Want to talk shop about segmentation or see dynamic tags in action? Say hello (don’t be shy) and request a demo.





4 Ways to Improve Your Data Hygiene – Whiteboard Friday

Posted by DiTomaso

We base so much of our livelihood on good data, but managing that data properly is a task in and of itself. In this week’s Whiteboard Friday, Dana DiTomaso shares why you need to keep your data clean and some of the top things to watch out for.

Click on the whiteboard image above to open a high resolution version in a new tab!

Video Transcription

Hi. My name is Dana DiTomaso. I am President and partner at Kick Point. We’re a digital marketing agency, based in the frozen north of Edmonton, Alberta. So today I’m going to be talking to you about data hygiene.

What I mean by that is the stuff that we see every single time we start working with a new client: this stuff is always messed up. Sometimes it’s one of these four things. Sometimes it’s all four, or sometimes there are extra things. So I’m going to cover this stuff today in the hopes that perhaps the next time we get a profile from someone it is not quite as bad, or, if you look at these things and see how bad it is, you’ll definitely start sitting down and cleaning this stuff up.

1. Filters

So what we’re going to start with first are filters. By filters, I’m talking about analytics here, specifically Google Analytics. When you go into the admin of Google Analytics, there’s a section called Filters. There’s a section on the left, which is all the filters for everything in that account, and then there’s a section for filters in each view. Filters help you exclude or include specific traffic based on a set of parameters.

Filter out office, home office, and agency traffic

So usually what we’ll find is one Analytics property for your website, and it has one view, which is all website data, the default that Analytics gives you, but then there are no filters, which means that you’re not excluding things like office traffic (your internal people visiting the website) or home-office traffic. If you have a bunch of people who work from home, get their IP addresses and exclude them from this view, because you don’t necessarily want your internal traffic mucking up things like conversions, especially if you’re doing stuff like checking your own forms.

You haven’t had a lead in a while and maybe you fill out the form to make sure it’s working. You don’t want that coming in as a conversion and then screwing up your data, especially if you’re a low-volume website. If you have a million hits a day, then maybe this isn’t a problem for you. But if you’re like the rest of us and don’t necessarily have that much traffic, something like this can be a big problem in terms of the volume of traffic you see. Then agency traffic as well.

So agencies, please make sure that you’re filtering out your own traffic. Again things like your web developer, some contractor you worked with briefly, really make sure you’re filtering out all that stuff because you don’t want that polluting your main profile.
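The filters themselves are configured in the Analytics admin UI, but the underlying idea is simple enough to sketch: drop any hit whose source IP is on your internal list. The addresses and field names below are placeholders, purely for illustration.

```typescript
// Conceptual sketch only; real GA filters are set up in the admin UI.
// The addresses below are documentation-range placeholders.
const internalIps = new Set([
  "203.0.113.10", // office
  "203.0.113.25", // home-office employee
  "198.51.100.7", // agency
]);

interface Hit {
  ip: string;
  page: string;
}

const isExternal = (hit: Hit) => !internalIps.has(hit.ip);

// e.g. applied in a reporting pipeline: keep only real visitor traffic
const cleanHits = (hits: Hit[]) => hits.filter(isExternal);
```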

Create a test and staging view

The other thing that I recommend is creating what we call a test and staging view. Usually in our Analytics profiles, we’ll have three different views. One we call master, and that’s the view that has all these filters applied to it.

So you’re only seeing the traffic that isn’t you. It’s the customers, people visiting your website, the real people, not your office people. Then the second view we call test and staging. So this is just your staging server, which is really nice. For example, if you have a different URL for your staging server, which you should, then you can just include that traffic. Then if you’re making enhancements to the site or you upgraded your WordPress instance and you want to make sure that your goals are still firing correctly, you can do all that and see that it’s working in the test and staging view without polluting your main view.

Test on a second property

That’s really helpful. Then the third thing is to make sure to test on a second property. This is easy to do with Google Tag Manager. What we’ll have set up in most of our Google Tag Manager accounts is our usual analytics, and most of the stuff goes there. But then if we’re testing something new, say the content consumption metric we started putting out this summer, we want to make sure we set up a second Analytics property and put the test, the new stuff that we’re trying, over to that second Analytics property, not a view.

So you have two different Analytics properties. One is your main property. This is where all the regular stuff goes. Then you have a second property, which is where you test things out, and this is really helpful to make sure that you’re not going to screw something up accidentally when you’re trying out some crazy new thing like content consumption, which can totally happen and has definitely happened as we were testing the product. You don’t want to pollute your main data with something different that you’re trying out.

So send something to a second property. You do this for websites. You always have a staging and a live. So why wouldn’t you do this for your analytics, where you have a staging and a live? So definitely consider setting up a second property.
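The mechanics differ in Tag Manager, but if you were wiring this up directly with gtag.js, the main-plus-test split would look roughly like the sketch below; the measurement IDs are placeholders, and content_consumption stands in for whatever you happen to be testing.

```typescript
// Rough gtag.js sketch of the "main property plus test property" idea.
// In Tag Manager you'd do this with separate tags instead.
declare function gtag(...args: unknown[]): void;

gtag("config", "G-MAIN0000"); // everyday traffic goes to the main property
gtag("config", "G-TEST0000"); // the property you experiment against

// Route only the experimental hit to the test property so the main
// data set stays clean.
gtag("event", "content_consumption", { send_to: "G-TEST0000" });
```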

2. Time zones

The next thing that we have a lot of problems with are time zones. Here’s what happens.

Let’s say your website is a basic install of WordPress and you didn’t change the time zone in WordPress, so it’s set to UTC. That’s the default in WordPress unless you change it. So now you’ve got your data for your website saying it’s UTC. Then let’s say your marketing team is on the East Coast, so they’ve got all of their tools set to Eastern time. Then your sales team is on the West Coast, so all of their tools are set to Pacific time.

So you can end up with a situation where, let’s say, for example, you’ve got a website where you’re using a form plugin for WordPress. Then when someone submits a form, it’s recorded on your website, but that data also gets pushed over to your sales CRM. So now your website is saying that this number of leads came in on this day, because it’s in UTC. Well, the day ended, or it hasn’t started yet, and now you’ve got Eastern time, which is when your analytics tools are recording the number of leads.

But then the third wrinkle is then you have Salesforce or HubSpot or whatever your CRM is now recording Pacific time. So that means that you’ve got this huge gap of who knows when this stuff happened, and your data will never line up. This is incredibly frustrating, especially if you’re trying to diagnose why, for example, I’m submitting a form, but I’m not seeing the lead, or if you’ve got other data hygiene issues, you can’t match up the data and that’s because you have different time zones.

So definitely check the time zones of every product you use: website, CRM, analytics, ads, all of it. If it has a time zone, pick one and stick with it. That’s your canonical time zone. It will save you so many headaches down the road, trust me.
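To make the mismatch concrete, here is a small TypeScript sketch (the timestamp and zones are illustrative): one form submission stored by the website in UTC lands on May 1, while the marketing and sales tools both file it under April 30.

```typescript
// One lead, three tool time zones: which day did it happen on?
const leadSubmittedAt = new Date("2019-05-01T01:30:00Z"); // website stores UTC

const inZone = (d: Date, timeZone: string): string =>
  new Intl.DateTimeFormat("en-CA", {
    timeZone,
    year: "numeric",
    month: "2-digit",
    day: "2-digit",
    hour: "2-digit",
    minute: "2-digit",
    hour12: false,
  }).format(d);

console.log(inZone(leadSubmittedAt, "UTC"));                 // 2019-05-01, 01:30 (website's day)
console.log(inZone(leadSubmittedAt, "America/New_York"));    // 2019-04-30, 21:30 (marketing's day)
console.log(inZone(leadSubmittedAt, "America/Los_Angeles")); // 2019-04-30, 18:30 (sales CRM's day)
```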

3. Attribution

The next thing is attribution. Attribution is a whole other lecture in and of itself, beyond what I’m talking about here today.

Different tools have different ways of showing attribution

But what I find frustrating about attribution is that every tool has its own little special way of doing it. Analytics is like the last non-direct click. That’s great. Ads says, well, maybe we’ll attribute it, maybe we won’t. If you went to the site a week ago, maybe we’ll call it a view-through conversion. Who knows what they’re going to call it? Then Facebook has a completely different attribution window.

You can use a tool, such as Supermetrics, to change the attribution window. But if you don’t understand what the default attribution window is in the first place, you’re just going to make things harder for yourself. Then there’s HubSpot, which says the very first touch is what matters, and so, of course, HubSpot will never agree with Analytics and so on. Every tool has its own little special sauce and how they do attribution. So pick a source of truth.
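A tiny sketch of why the numbers diverge: given the same visitor journey, a first-touch model and a last-non-direct-click model credit different channels. The touchpoint list is an illustration; each tool assembles this differently behind the scenes.

```typescript
// Same journey, two attribution models, two different answers.
interface Touch {
  channel: string;
  direct: boolean;
}

const journey: Touch[] = [
  { channel: "facebook-ad", direct: false },
  { channel: "organic-search", direct: false },
  { channel: "direct", direct: true }, // final visit that converted
];

// First touch (the HubSpot-style view): credit the very first interaction.
const firstTouch = (touches: Touch[]) => touches[0].channel;

// Last non-direct click (the Analytics-style view): credit the most recent
// non-direct interaction before the conversion.
const lastNonDirectClick = (touches: Touch[]) =>
  [...touches].reverse().find((t) => !t.direct)?.channel ?? "direct";

firstTouch(journey);         // "facebook-ad"
lastNonDirectClick(journey); // "organic-search"
```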

Pick your source of truth

The best thing to do is just say, “You know what? I trust this tool the most.” Then that is your source of truth. Do not try to get this source of truth to match up with that source of truth. You will go insane. You do have to make sure that at least things like your time zones are consistent, so that’s all set.

Be honest about limitations

But then after that, really it’s just making sure that you’re being honest about your limitations.

Know where things are necessarily going to fall down, and that’s okay, but at least you’ve got this source of truth that you can trust. That’s the most important thing with attribution. Make sure to spend the time and read how each tool handles attribution, so when someone comes to you and says, “Well, I see that we got 300 visits from this ad campaign, but in Facebook it says we got 6,000. Why is that?” you have an answer. That might be a little bit of an extreme example, but I mean I’ve seen weirder things with Facebook attribution versus Analytics attribution. I haven’t even talked about stuff like Mixpanel and Kissmetrics. Every tool has its own little special way of recording attributions. It’s never the same as anyone else’s. We don’t have a standard in the industry of how this stuff works, so make sure you understand these pieces.

4. Interactions

Then the last thing is what I call interactions. The biggest thing that I find people do wrong here is in Google Tag Manager: it gives you a lot of rope, which you can hang yourself with if you’re not careful.

GTM interactive hits

One of the biggest things is what we call an interactive hit versus a non-interactive hit. So let’s say in Google Tag Manager you have a scroll depth trigger set up.

You want to see how far down the page people scroll. At 25%, 50%, 75%, and 100%, it will send off an alert and say this is how far down they scrolled on the page. Well, the thing is that you can also make that interactive. So if somebody scrolls down the page 25%, you can say, well, that’s an interactive hit, which means that person is no longer bounced, because it’s counting an interaction, which for your setup might be great.

Gaming bounce rate

But what I’ve seen are unscrupulous agencies who come in and say if the person scrolls 2% of the way down the page, now that’s an interactive hit. Suddenly the client’s bounce rate goes down from say 80% to 3%, and they think, “Wow, this agency is amazing.” They’re not amazing. They’re lying. This is where Google Tag Manager can really manipulate your bounce rate. So be careful when you’re using interactive hits.

Absolutely, maybe it’s totally fair that if someone is reading your content, they might just read that one page and then hit the back button and go back out. It’s totally fair to use something like scroll depth, or a certain piece of the content entering the user’s viewport, as an interactive hit. But that doesn’t mean that everything should be interactive. So just dial it back on the interactions that you’re using, or at least make smart decisions about the interactions that you choose to use. Yes, you can game your bounce rate with that.
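For what it’s worth, here is roughly what the safe version looks like if you fire the scroll event through gtag.js; in a Tag Manager Universal Analytics tag, the equivalent is the Non-Interaction Hit setting. The event names and labels are illustrative.

```typescript
// Send scroll depth as a NON-interactive hit so it doesn't quietly rewrite
// your bounce rate. Flip the flag deliberately, not by default.
declare function gtag(...args: unknown[]): void;

function reportScrollDepth(percent: 25 | 50 | 75 | 100): void {
  gtag("event", "scroll_depth", {
    event_category: "engagement",
    event_label: `${percent}%`,
    non_interaction: true, // set to false only if you truly count this as engagement
  });
}
```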

Goal setup

Then goal setup as well, that’s a big problem. A lot of people, by default, have destination goals set up in Analytics because they don’t know how to set up event-based goals. By destination goal, I mean you filled out the form, you got to a thank-you page, and you’re recording views of that thank-you page as goals, which, yes, is one way to do it.

But the problem is that a lot of people, who aren’t super great at interneting, will bookmark that page or they’ll keep coming back to it again and again because maybe you put some really useful information on your thank you page, which is what you should do, except that means that people keep visiting it again and again without actually filling out the form. So now your conversion rate is all messed up because you’re basing it on destination, not on the actual action of the form being submitted.

So be careful on how you set up goals, because that can also really game the way you’re looking at your data.
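A minimal sketch of the event-based alternative, assuming a gtag.js setup (the form selector and event name are illustrative): fire the conversion on the actual submit action, then base the Analytics goal on that event rather than on thank-you-page views.

```typescript
// Record the conversion when the form is actually submitted, not when
// someone merely lands on (or bookmarks) the thank-you page.
declare function gtag(...args: unknown[]): void;

const form = document.querySelector<HTMLFormElement>("#contact-form"); // illustrative selector

form?.addEventListener("submit", () => {
  gtag("event", "generate_lead", {
    event_category: "form",
    event_label: "contact",
  });
});
```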

Ad blockers

Ad blockers could be anywhere from 2% to 10% of your audience, depending upon how technically sophisticated your visitors are. So you’ll end up in situations where you have a form fill but no corresponding visit to match with that form fill.

It just goes into an attribution black hole. But they did fill out the form, so at least you got their data, but you have no idea where they came from. Again, that’s going to be okay. So definitely think about the percentage of your visitors, based on you and your audience, who probably have an ad blocker installed and make sure you’re comfortable with that level of error in your data. That’s just the internet, and ad blockers are getting more and more popular.

Stuff like Apple is changing the way that they do tracking. So definitely make sure that you understand these pieces and you’re really thinking about that when you’re looking at your data. Again, these numbers may never 100% match up. That’s okay. You can’t measure everything. Sorry.

Bonus: Audit!

Then the last thing I really want you to think about — this is the bonus tip — audit regularly.

So at least once a year, go through all the different stuff that I’ve covered in this video and make sure that nothing has changed or been updated and that you don’t have some secret, exciting new tracking code that somebody added and then forgot about because you were trying out a trial of a product, tossed it on, and it’s been running for a year even though the trial expired nine months ago. So definitely make sure that you’re running the stuff that you should be running and doing an audit at least on a yearly basis.

If you’re busy and you have a lot of different visitors to your website, it’s a pretty high-volume property, maybe monthly or quarterly would be a better interval, but at least once a year go through and make sure that everything that’s there is supposed to be there, because that will save you headaches when you look at trying to compare year-over-year and realize that something horrible has been going on for the last nine months and all of your data is trash. We really don’t want to have that happen.

So I hope these tips are helpful. Get to know your data a little bit better. It will like you for it. Thanks.

Video transcription by Speechpad.com





How to Stop Drowning in Data and Begin Using Your Metrics Wisely

Digital marketers have a problem: We’ve got too much data. It sounds like a ridiculous complaint coming from a data…

The post How to Stop Drowning in Data and Begin Using Your Metrics Wisely appeared first on Copyblogger.



