Episode 8: Gather Insight From Your Metrics (w/ Jon Crowley)
Corey and Deane talk about the first time they tracked analytics on their blogs in the early 2000s.
Then, Jon Crowley, Senior Vice President of Strategy at Diamond Marketing Group, talks to us about the balance between data and insights — how to focus on questions rather than raw numbers, how to look for answers rather than “trying to be correct,” and when we can take data at face value. (He also gives us a tour of his shoe collection.)
The Web Project Guide podcast is sponsored by Blend Interactive, a web strategy, design, and development firm dedicated to guiding teams through complicated web and content problems, from content strategy and design to CMS implementation and support.
Show Notes and Further Discussion:
- Jon Crowley (@joncrowley)
- Diamond Marketing
- “Your Funnel Isn’t a Journey: Data vs. Insights” — Design/Content 2016
- A Benign Conspiracy
Transcript
Corey: (00:11)
Hello, this is The Web Project Guide podcast, and this is episode eight, Gather Insight From Your Metrics. I'm Corey Vilhauer, director of strategy at Blend Interactive and co-author of The Web Project Guide. Later on, we'll talk to our friend, Jon Crowley, vice president of strategy at Diamond Marketing Group in Toronto. But first I'm joined by the other Web Project Guide co-author, or, I was going to do a different one, co-conspirator: Deane Barker, senior director of content management research at Optimizely. Hi, Deane.
Deane: (00:38)
Hi. Like you said last time, you have to be careful what you say, because since you have the prefix co, you're indicting yourself in whatever you accuse me of.
Corey: (00:45)
Yeah. It really threw me off.
Deane: (00:46)
I know, right? You won't ever say like co-felon or something.
Corey: (00:51)
Deane, I have a question for you.
Deane: (00:52)
Hit me.
Corey: (00:53)
Do you remember what it was like when you first put any kind of thing to measure traffic on your blog?
Deane: (01:02)
Totally. Oh my God. The first website I ever published in public was a James Bond FAQ, a frequently asked questions list, which were very different than they are now. Frequently asked questions now are things that we make up, but back in the day, and this would've been like '97, our frequently asked questions list was literally frequently asked questions from Usenet. I was a member of the alt.fan.James_Bond Usenet group, and we were putting together a frequently asked questions list. And so the first thing I ever put online was that frequently asked questions list, and it wasn't the text version, it was the HTML version, and I put a counter at the bottom of it. For people who are new to the web, here's how counters used to work. You used to have a CGI script, a little computing script, that would embed an image on the page.
Deane: (01:51)
And so every time that was requested, the CGI script would return a different image for a different count. And you used to see ... This is hilarious, because websites were so slow and dialup was so slow, you used to see notes at the top of a webpage saying, "Please wait until the image counter loads," because if the image counter didn't load, if you bailed out before the minute it took to load a webpage, they wouldn't get their hit. And I remember putting that on there. And then I was at college the next day, and I kept sneaking into the computer lab, because none of us had laptops, you had to go to a computer lab, and bringing up my website to look at the counter. And I remember when that thing first turned over, I was super excited until I realized that was me checking it. And so I realized there was no way I could check this. It was like the observer effect, right? There was no way to check it without triggering the counter and so-
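[For anyone curious about the mechanics Deane is describing, here's a minimal sketch of that kind of CGI hit counter, written as modern Python purely for illustration. The file path and rendering details are assumptions, and the originals were usually Perl or C scripts that also had to worry about file locking and cache-busting headers.]

```python
#!/usr/bin/env python3
# Hypothetical CGI hit counter: every request bumps a number stored on disk
# and returns a freshly rendered image of that number.
import sys
from pathlib import Path
from PIL import Image, ImageDraw  # Pillow, assumed available

COUNT_FILE = Path("/tmp/hits.txt")  # assumed location for the running total

def bump() -> int:
    hits = int(COUNT_FILE.read_text()) if COUNT_FILE.exists() else 0
    hits += 1
    COUNT_FILE.write_text(str(hits))
    return hits

def render(hits: int) -> Image.Image:
    img = Image.new("RGB", (90, 20), "black")
    # The optimistic six digits Corey mentions below.
    ImageDraw.Draw(img).text((5, 5), f"{hits:06d}", fill="lime")
    return img

if __name__ == "__main__":
    image = render(bump())
    sys.stdout.write("Content-Type: image/png\r\n\r\n")
    sys.stdout.flush()
    image.save(sys.stdout.buffer, "PNG")
```

[The page then embeds something like `<img src="/cgi-bin/counter.py">`, which is why a hit only counted if the visitor waited for the counter image to load.]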
Corey: (02:47)
Those counters were amazing because of the level of optimism it took for somebody like me, who started a little blog on Tripod back in the day, to have a six-digit counter. No.
Deane: (03:00)
It was crazy-
Corey: (03:01)
I mean it never got over three.
Deane: (03:02)
Today, anybody can get traffic to anything. There's so many people on the internet, you can publish [inaudible 00:03:08] but back then, the internet was sparse. And the fact is, it was like ... I'm reading a book called The Covenant by James Michener, which is the history of South Africa, and back in the day, South African villages, there were hundreds of miles between them. And if someone happened upon your village, that was a moment of great rejoicing, because you could go anywhere in South Africa and, look, you managed to hit a village. And that's how it was on the internet back in the day. The fact that somebody actually came to your little part of the world, we were so enamored with the idea of, I affected somebody with this, somebody I don't know and have never talked to saw my stuff. And it was relatively mind-blowing. And of course it was all server-side tracking back then, because when I got on the internet, we didn't even have JavaScript.
Corey: (03:50)
I remember the first time that I installed whatever Google Analytics was before it was really what it is now.
Deane: (03:58)
Urchin.
Corey: (03:58)
Urchin. Yeah. And I started getting a ton of traffic, and I was always fascinated by seeing what people were searching to get to my little blog back in 2005, 2006. And I had this giant uptick once because I had written a blog post about how I had a really bad experience at Radio Shack. And so the title of the post was Radio Shack Sucks. And I tell you what, that was a very common search term back in around 2005, 2006. And I mean, I had more traffic from that one post than ever. And it was fascinating to me to see how language led to all of this data coming in, and how the psychology of somebody who's angry about something showed in what they did when they landed on my site. They were suddenly posting all this stuff and you could see it. It was weird to see it almost in real time. Because again, I don't know that Urchin was live stats, but I mean, you'd refresh a few times and they'd all show up.
Deane: (04:51)
We only had one metric. It wasn't even called a pageview. It was called a hit.
Corey: (04:54)
Yeah.
Deane: (04:55)
Remember, you had a hit, and we only had the one metric. You weren't using ... So JavaScript existed, but what came to be called Ajax, which we have all sorts of other names for now, didn't exist. Making HTTP requests out of the browser, that didn't exist. So we couldn't track any other behavior. And the idea of tracking what someone did on your page, I mean, we weren't even bubbling up JavaScript events from the DOM at that point. So how many hits does your website have? That's kind of all you could ask. It was just very, very crude in terms of what you could do. You could track traffic. I just remember a movie. It had Amy Adams, and it was about Julia Child: Julie and Julia.
Corey: (05:35)
Yeah. Julie and Julia.
Deane: (05:36)
Yeah. So it's about this lady named Julie who puts up a blog where she's doing all of Julia Child's recipes from Mastering the Art of French Cooking. And in the voiceover, she was really excited to get her first comment, and then the first comment was from her mom saying, "Julie, why are you doing this?" And that's basically ... You would get some hits, and you had to get over a certain threshold before you were sure it wasn't just you and your buddies who were checking it out.
Corey: (06:01)
Yeah. And then we're getting dangerously into the old practice of having a guest book on your site.
Deane: (06:06)
Yes. The guest book. Yeah. Sign the guest book. That was a big deal. And the under construction images. You would have the under construction signs from road work and you would put them up there, because the concept of having a page just not be published because it wasn't ready didn't exist. Every page was in some state of construction. And so we had under construction signs, because for sure someone was going to come to our website and click on a resource and get an under construction sign, and then they were going to write down in their scheduler to come back later, because it was clearly under construction. And this leads into our conversation about analytics.
Corey: (06:43)
We are going to speak later to Jon Crowley, who is vice president of strategy at Diamond Marketing Group in Toronto. He's also the author of A Benign Conspiracy, an occasional newsletter on marketing and brand strategy. But first, this episode of The Web Project Guide podcast is brought to you by Blend Interactive, a web strategy, design, and development firm dedicated to building great websites. Blend's been guiding teams through complicated web and content projects for nearly 17 years, and we're always looking for our next big project. Visit us at blendinteractive.com.
Corey: (07:19)
Jon, welcome to the podcast. Appreciate you coming on. First off, in the email when we were setting this up, you were a little worried when I asked you to be on this episode, because it's not necessarily the world you're in anymore. You're not strictly a web practitioner, especially now at Diamond. I think that's exactly why you're the perfect guest for this episode, because most of the people who are planning or working on a web project as part of an internal team probably don't consider themselves a web practitioner either. And the reason that I think we got connected is you did a talk that really resonated with me a few years back called "Your Funnel Isn't a Journey: Data vs. Insights." On the web, we focus a lot on metrics and numbers and data, where you've argued that the focus really should be on insights. So if you could talk a little bit about the difference between data and insights, and kind of where you draw the line.
Jon: (08:19)
So this is actually probably the most common conversation I ever have. Not just because I work in a strategic planning role at an agency, and our favorite hobby in my specific segment of the industry is fighting over the definition of an insight, but also because my wife actually runs a strategic insights group at a competing agency, and her team leans really heavily on data as a means of understanding human behavior. So I stick my foot in my mouth a lot talking about this specific subject, but I can break down my point of view in a really straightforward way. I think data is fantastic at telling you what's actually happening in the world, whereas insight should be an attempt to explain why that thing is happening. And I think very often people collapse these two distinct camps of thinking into a singular step.
Jon: (09:09)
And, like everyone, I like to blame large tech companies for all the world's problems. So think about when you open a Google Analytics dashboard and it tells you everything is an insight. And the insight they'll give you is that 37% of the visitors who click through on this page happen to be men between the ages of 18 and 34, or someone will highlight that the bounce rate on a specific page is incredibly high, and that's an insight we need to find a way to action.
Jon: (09:32)
And one of my big concerns is that's not necessarily an insight. What you have is a fact. You have no idea why that fact is the case, and you don't even necessarily have other facts to compare it to. So the question that I find really interesting is: why are we seeing this? And what other things are we seeing that can dimensionalize it or contradict it or help us understand it? That's what helps me get to an insight. When we define an insight at work with my team, what I'd say is it's a new or unique understanding that presents an opportunity. That's kind of the in-house definition we use of an insight.
Deane: (10:06)
Well, data science is the big thing right now, right? My alma mater in Sioux Falls, Augustana University, now has a data science degree, and big data is the big thing. And everybody's becoming data scientists, and I maintain that we're swimming in data. Nobody knows what questions to ask. Nobody knows how to actually make sense of any data.
Jon: (10:26)
That's a fantastic point. I completely agree. I had a really fantastic conversation a couple of months ago with someone, and one of the ideas that came up was that data might serve better as a source of questions than as a source of answers. And I think that's really uncomfortable for people, because they generally come to data with a bunch of questions and are hoping the data scientist or the practitioner they're engaging with will be able to turn it into answers for them. The promise of big data as it's sold, especially to marketing professionals, but to anyone who's running a brand or running a company, is that big data will tell you what to do. It will solve the complexity of the world and it will take away the ambiguity that might scare you a little bit. And I think very often it'll have the opposite effect, but that's not a bad thing.
Deane: (11:07)
Yeah. I mean, too much data, I've found, tends to overwhelm and confuse people. This is my problem. Whenever I get into Google Analytics, I'm not a big analytics data guy, but I get in there and I just have all of this data and have no idea what question to ask first. What is a problem that requires an answer? Like that insight you talked about: most of the people coming here are males from 18 to 34. Well, what do I do with that information? How does that help me? So I just think that making the leap from data or fact to actionable thing is trickier than people realize.
Jon: (11:48)
It can be the biggest possible challenge. And I think the problem is humans are pattern recognition machines. We're constantly looking for some kind of story in whatever we look at. And very often the story we see first is the one that's most interesting. It's the same way that anything that looks like it can be confused with a human face is super exciting. And you end up seeing 400 pictures of it on Reddit, but similarly, anything that can look like a fun, actionable pattern ends up distracting from what might actually impact someone's business metrics.
Deane: (12:18)
Yeah. So there's a couple things there. This reminds me of the problem I have in science in general, the p-hacking problem, which is when scientists, as they call it, torture the data until it gives them something. They want to write some paper on something and be known for something, so they just take a data set and query the hell out of it until they find some weird correlation. And in fact, there was a blog years ago that used to post these weird correlations. It was a data scientist who used to take data sets and look for the weirdest correlations. The number of inches of rain in Portland in a summer seems to correlate with the number of doughnuts consumed by Jewish men in south Florida. It doesn't mean anything.
Jon: (12:57)
Unless ... No, it probably doesn't mean anything.
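[As a toy illustration of the "torture the data" problem Deane describes, and not anything from the episode: a few lines of Python showing that if you test enough unrelated random series against each other, some pairs will look "statistically significant" purely by chance.]

```python
# Spurious correlations from pure noise: a minimal, hypothetical sketch.
import numpy as np
from scipy import stats  # assumed available

rng = np.random.default_rng(0)
n_vars, n_obs = 40, 30
data = rng.normal(size=(n_vars, n_obs))   # 40 unrelated random series

spurious = []
for i in range(n_vars):
    for j in range(i + 1, n_vars):
        r, p = stats.pearsonr(data[i], data[j])
        if p < 0.05:                       # the usual "significance" bar
            spurious.append((i, j, round(r, 2)))

# With 780 pairwise tests at alpha = 0.05, roughly 39 spurious "findings"
# are expected even though every series is random by construction.
print(f"{len(spurious)} 'significant' correlations found in pure noise")
```

[Query a data set hard enough and it will always hand you rain-and-doughnuts patterns like the one Deane jokes about.]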
Deane: (13:02)
And so, I mean, that's what I think we do with analytics. We want the analytics to reveal something to us rather than answer an existing question that we have. I remember talking to a sales prospect when I was selling for Blend, and we were talking to them about a feature of a piece of software that enables you to track customer behavior, blah, blah, blah, and to segment customers, like, "Take these customers and find all the customers that have done this, this, and this." And what they wanted the software to do was suggest segments to them. They wanted the software to analyze it for them and tell them, "Well, this is a statistically significant group of people." And then my question to them was, "Well, what are you going to do about that? If 70-year-old Jewish men from south Florida are a statistically significant segment, how does that help you in any way?"
Jon: (13:47)
So this is actually fascinating. This directly references some of the work that we do with some of our current clients, where they'll have a huge amount of owned data and they'll want to understand the right way to reach out to those people. And what we've found is the only way to get to those answers is to use that owned data, and the data clustering and the behavioral tracking and grouping people based on similar behaviors. We actually end up using that to recruit for traditional qualitative research, so we can get a bunch of people talking in a room, so we can understand why they have that similar behavior set and what's driving that duplicate behavior. And that's really counterintuitive for a lot of people. There's a belief that you have data and you have analytics so you can avoid talking to human beings.
Jon: (14:31)
It's like your secret hack to avoid needing to listen to people explain their problems and needing to actually engage with humans to understand things. But what we've found is the biggest path to actually being able to do something actionable is basically to treat what data tells you as a starting point for research, so you can start figuring out why it's showing you those things, and what human thoughts and behaviors and emotional or rational drives are leading to that pattern or that behavior set. And don't get me wrong, you could guess and you could A/B test it: people are doing this for one of these three reasons, we think, so let's see what they're going to respond to. There are some things where the best approach is to try a bunch of stuff and see what works, and there are other things where maybe getting it right the first time is important enough that it's worth the extra time and investment to kind of unlock the different drivers, the whys behind the different things you're seeing.
Deane: (15:23)
How do you know when you've found an actionable question? A question that a CEO might ask is, why isn't our website driving more sales? Is that an actionable, is that a researchable question?
Jon: (15:32)
Well, I think the fun thing about questions and problem statements is they usually lead to more questions and more problems before they lead to answers. So if I was meeting with a client and they said, why isn't our website converting more? We might eventually get to, "People are abandoning carts," or, "There's a conversion problem three pages into your onsite funnel." And then that question leads us to, why is this happening, and how do we get there?
Jon: (15:58)
What's interesting about these projects is very often, you need to get real buy-in on the fact that the question you end up answering at the end might not be the question you started with at the beginning. It'll be related and it'll help you get there. But generally there's a linearity that people want from human behavior and for stories about data that doesn't actually exist. And sometimes it feels like a lot of our job in these industries is helping to apply a narrative and a linearity to the irrationality of human behavior so that we can explain it to each other and train new people on how to address it.
Deane: (16:34)
How often do people come to you and they just want you to prove a point that they already believe in? They just have confirmation bias and they don't really want to know the truth. They want you to prove their point.
Jon: (16:46)
So I would say that is probably the starting point of the vast majority of engagements that I've had with anyone in my career. And I think largely the difference between confirmation bias and a hypothesis is thinner than most people think. I think the average person has a hypothesis verging on a solution that they've decided is correct. And you learn pretty quickly which people are very open to evidence that contradicts what they believe, and which people are going to hold on until they have ironclad proof that what they want may not work. And occasionally you meet someone who'll hold on beyond that point. But I think, in my experience, it's always been beneficial to treat everyone as though whatever assumptions they're making are based on the best information they have at the time, and then just give them the best information you have as you move forward.
Jon: (17:39)
There's no easy solution to that one, though. Very often, and candidly, it's a thing we need to tackle internally as well. People will come to a problem with a belief of what's going on, and it'll influence how they look at the data, it'll influence their supplementary research. This is why we often refer to strategy as the art of being wrong as much as possible. I kind of encourage that mindset of trying to prove yourself wrong rather than right, because it's more likely. Like you said earlier, you can torture data to mean whatever you need it to mean. Anyone with enough experience in statistics and enough experience in research is generally capable of arguing, with data supporting it, for anything that they might happen to find valuable. So I find very often what you need to do is find someone who cares more about having the right answer than about being correct. And I think that is a difficult thing for people to disconnect in their minds sometimes.
Deane: (18:32)
So Corey, this is episode number-
Corey: (18:34)
Six. No, eight.
Deane: (18:36)
Eight.
Corey: (18:36)
Eight.
Deane: (18:37)
Of the eight episodes. How many of those have touched on the concept of psychological or social behaviors of the actors in these projects?
Corey: (18:46)
I mean, all of them, and all of them into the future, I'm sure. That's kind of how it works. It's all ... it's a lot of people.
Deane: (18:54)
We're counselors. We're like psychologists. It seems like every conversation we have comes back to how you manage people and reverse engineer what they're thinking. And Jon, you're supposed to be the data guy. This is supposed to be math, man. We're not supposed to be reverse engineering human behavior here.
Jon: (19:12)
The funny thing for me is, as much as I enjoy data and information as an input to something, math has quite literally never been my strong suit. What I'm good at is problem solving. So I'm good at taking math and numbers and applying them to make sense of something. Give me pure math, I'm lost at sea.
Corey: (19:33)
I'm wondering, along that idea of seeing data and analytics as sort of these complex math equations, which thankfully we have things like Google Analytics and other tools to do the math for us: what has been the change in how you interpret data and analytics, and understand the insights within them, due to increasing anonymity in that data? You don't have as much insight a lot of times into any of the information behind it unless you specifically ask for it, and the people who are willing to give that information are already sort of pre-selecting into a specific group. So how has that changed over the past however long you've been doing this?
Jon: (20:15)
I've definitely become more cynical over the last decade or so, and I can't tell if that's just an aging thing or an experience thing, or if it's the world slowly slipping towards dystopia. But it's been interesting going from earlier points in my career, where I would look at the information shared by an analytics or ad platform about what was performing and what wasn't as a source of objective truth, versus now looking at it with the utmost skepticism. And I'm not sure if that's just too many experiences of Google, or, well, Facebook more often than Google, saying whoopsie and admitting that they've been misreporting numbers for the last four quarters. But some of it also is just a recognition that, I suppose what I've realized is, numbers that aren't about something are generally not that important.
Jon: (21:02)
And there's a tendency to get really concerned with numbers that aren't actually about very much. Like in my current world, you'll see a lot of conversations about objectives that have to do with number of views, ignoring the fact that very often that's a thing that you're literally just paying for. So why is that an objective for a specific piece of work or a specific insight, if it's a thing that you should be able to know in advance that you're going to achieve? Or, from a vanity metrics perspective, there's a huge amount of focus on, "Well, how many people saw this post, or how many people liked this, or how many people hit this specific page? How many times was this link shared?"
Jon: (21:38)
That doesn't always directly track to some kind of deeper metric that's going to drive a specific result. The best way I can put it is, you've heard countless times the phrase "you get what you measure." I feel as though our entire digital marketing ecosystem has leaned really heavily into measuring everything, because it can and because it's a point of differentiation, and not necessarily considering why those things are being measured or what that measurement is supposed to help us do.
Deane: (22:05)
Is that the human gap? I mean, machines measure things. I mean, it's up to humans to put value on things, right?
Jon: (22:11)
Well, I think what's interesting is that human judgment and empathy are baked in at the very core in terms of the purpose of all of these things. The cynical part of my brain says part of the reason there's so much analytics and measurement in anything happening on the web, in anything happening in entertainment now, is because it was an easy way to differentiate creation of content, and creation of ways of connecting with people, from more traditional media. The argument early on, at the dawn of time, for display advertising and for building a webpage was: well, if you just hand people a pamphlet, you don't know what they actually looked at, and you don't know how many people looked at the pamphlet. And having worked in the past with direct mail, where people are actually sending out flyers, I've seen the tortured math they use to try to provide deep, deep analytics in terms of ... I actually think that three and a half people read a copy of this magazine we give out for free on the subway.
Jon: (23:01)
Sorry, that went super off topic. What I mean to say is, I think a lot of this is baked into the core differentiation of the web as a medium of communication from more traditional mediums. I think there was a real tendency to kind of break that apart and make it something special, especially when it came to the selling of advertising and the siphoning of that advertising dollar away from, first, traditional media, and then from display to, really, Facebook, Google, and Amazon, who are now the three biggest players in advertising and the three biggest players in what everyone consumes on the web. So I think the human intent of separating people from their money shapes a lot of this, and analytics, whether or not it is intended to be such currently, was kind of originally invented as a justification for that separation of people and money.
Corey: (23:45)
I hear somebody listening to this saying, boy, Jon, is there anything that we can do that's easy? Because it doesn't sound like any of this stuff's easy. Are there certain times when data can just be taken at face value? Are there examples of, here's something I'm getting and I can actually just use this data the way it is?
Jon: (24:06)
A hundred percent. I mean, a thing that's super valuable to everyone and often gets overlooked is, if you have a page that's supposed to accomplish a specific job, asking yourself whether or not it does that job is always valuable. If you have a page that's supposed to educate someone on a product and then move them to a page where you're selling to them, information about what happens on that page is going to be valuable. You may not necessarily know exactly why it's happening, but you know something needs to change, and that can be the only thing you really need to know sometimes. If you're running an eCommerce site and people are coming to the site and not buying anything, that tells you there's a couple of things you need to tweak. Either your experience isn't clear enough and direct enough, so people can't get where they need to go, or your product is super off-putting, either in reality or in the way it's presented.
Jon: (24:50)
And therefore people aren't diving in. Or sometimes it can be as simple as it feels like a bait and switch to the people who might be navigating to a site or clicking an ad, because what they're seeing isn't what they expect. I've actually taken that really simple piece of data, people are clicking through to this thing but they're not engaging with it or buying anything, and used that as the starting point of revamping an entire campaign, to ensure every landing page was actually aligned with the creative of the individual piece of advertising that someone saw when they clicked to get there. So you wouldn't see a contextual ad showing something that felt like your situation and your family and then land on a generic page where you didn't see the association. You'd land on something with similar messaging, similar framing, similar language. And that all comes from the really simple insight of: people are coming here, but they're not doing the thing we want them to do.
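[A concrete, entirely hypothetical version of that face-value read, as a few lines of Python: the page names and counts are invented, but a sharp drop at one step is exactly the kind of number you can act on before you know the why.]

```python
# Step-to-step drop-off in a simple funnel, with made-up counts.
funnel = {
    "product page": 10_000,
    "add to cart": 1_200,
    "checkout": 300,
    "purchase": 90,
}

steps = list(funnel.items())
for (prev_name, prev_n), (name, n) in zip(steps, steps[1:]):
    rate = n / prev_n
    print(f"{prev_name} -> {name}: {rate:.1%} continue, {1 - rate:.1%} drop off")
```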
Deane: (25:38)
When you asked that question, Corey, I felt like you were saying, how can we make this less David Lynch and more Michael Bay, right? Because say what you want about Michael Bay-
Corey: (25:48)
There are no explosions in this yet.
Deane: (25:50)
Yeah. You know what's coming with Michael Bay, right? That's pretty superficial. With David Lynch, you're always wondering what the hell's going on here. Jon, let me ask you, are there any industries or scenarios that are data-proof? And I'm going to give you an example. This is an example I use often, because I don't know that I have an answer for it. Let's say you're a company that manufactures and builds interstate highways. So you're a big construction company that builds interstate highways. I'm basing this on an actual situation. They didn't build interstate highways, but they built large-scale industrial projects like that.
Deane: (26:23)
Now, for this organization, nobody Googles for interstate highway. Nobody puts a hundred miles of interstate highway in a shopping cart and checks out. Really, they don't care how many people come to their website. They just need certain people to come to their website. And the way their projects start is highly relational and highly offline, and the way that they end is highly offline as well. So I mean, fundamentally, they want somebody on a purchasing committee to maybe come look at their website and view some of their case studies, and six months later, when they're voting on a proposal, think, "Yeah, I think that company could do a decent job in that situation." A whole lot of analytics go out the window. What do you do for that company, analytics-wise? I mean, can data, can analytics help them in any way?
Jon: (27:04)
Well, I think there's always a role in understanding the people you're engaging with. One of the big questions I'd ask, if someone came to me and said we don't have a traditional shopping cycle, people don't come and look for a specific product or engage with us that way online: most of those places, when you ask them about recruiting, have a second thought, because I imagine if you're doing that kind of complex industrial engineering, you're going to have people coming out of school, and you're going to be looking for people that might be a great fit for the kind of work you do. So you can consider how you may want to appeal to that audience. You may want to consider what engagement looks like in terms of people learning more about your company from that perspective.
Jon: (27:41)
But I will say traditional metrics aren't going to be super interesting to you at that point. Volume isn't going to be super important. You're not going to want to track down and retarget people. If you have really long sales cycles, you'll probably be super interested in anything you can do to get a sense of which specific people are coming and engaging with your content at which specific times. You can get really into the nitty-gritty in terms of where people are visiting from and what they're looking at. I can imagine someone working at a firm like that in business development getting super excited when some specific IP addresses show up while they're trying to close a massive, massive deal.
Jon: (28:12)
But very often, if what you're looking for is insight into who you're engaging with, or the role of that information, I feel like more people than we admit get limited value from that content. I think analytics can be an incredibly valuable tool in terms of understanding how a thing is performing, if it's important to you how that thing is performing. But I also think metrics and data in general can be a big distraction from what's more important to your business, because, candidly, a very large portion of the stuff on the internet is about establishing baseline credibility in 2022.
Deane: (28:46)
You have a website because you have to have one, right?
Jon: (28:48)
Completely. It's kind of sketchy if you don't. And if that's the reason you have a website, if that's the reason you're putting content on social media, you don't want to seem like a fly-by-night business that doesn't take things seriously, I don't think you need to be too, too stressed about analytics. But I would say you'd still want to pop the hood every once in a while and make sure things are actually functioning the way they should, because once you have the thing, the thing needs to work. Otherwise you look just as sketchy as if you didn't have it.
Deane: (29:13)
It's funny, you talked about that situation with IP addresses and long sales cycles. I read an article a couple years ago about a lobbying organization that was trying to get the House of Representatives to vote on something. And they started a Facebook campaign and they geotargeted a single office building in Washington, DC, and they validated something like 73 different accesses from that office building, in which this committee resided, viewing that ad. So they literally geotargeted down to an office building and claimed some statistically significant movement based on that.
Jon: (29:48)
Did you see the episode of John Oliver that happened recently on this subject? I'm sorry, Last Week Tonight is the show.
Deane: (29:56)
Yes.
Jon: (29:56)
But he did a bit about data privacy, about issues with data privacy. And then what he did was run really odd, really specific ads aimed at Congress, and then basically threatened to release information about who clicked the ad about Ted Cruz's sex tape, for example. He'd leak that information if they weren't interested in passing laws to improve data privacy. And I think there's an interesting ... I don't know, some of this is probably just that I'm getting ...
Jon: (30:26)
I am not as young as I was when I started doing this stuff. My life and my priorities are very different than when I was 25 and I was diving into doing marketing and digital strategy and kind of playing in that specific space. But things that I used to find really interesting, things that used to seem like superpowers to me, that concept of really focused targeting, I'm starting to question whether the ... not even whether the utility is worth a potential invasion of privacy, but whether the utility is actually there.
Deane: (30:53)
Where I get really frustrated is when people try to use super secret advanced analytics and personalization as an excuse to not create good content that people want. I mean, you see this all the time. Your problem is your content sucks. It's not that you don't have enough marketing technology. Invest some money in a good content marketing platform or a good content marketing team, rather than trying to hack your way around deficiencies in the source material.
Jon: (31:24)
I think that is a fantastic point. I think sometimes the availability of intelligence doesn't mean that you need all of it to solve a problem. I'm sure you've seen this as well, but I've definitely seen people in the past develop incredibly complex models to solve a problem that is as simple as people don't think this is interesting.
Deane: (31:43)
Jon, you were supposed to defend analytics.
Jon: (31:47)
The thing is, I see deep, deep value in analytics if it's used properly for what it's for. To me, being able to dig into the data and find that only this tiny, tiny group of people seems to be engaging with this content, there's two ways to look at it. There's, what can we do to make this content engaging to a larger group of people? Because the data has now given us a great question to ask. It's told us that there's a gap between who we want to appeal to and who we're appealing to. What can we do to learn complementary information that allows us to create more engaging content? And then there's the other approach to it, which is, I don't want to do the hard thing. What can I do to pay to put more of the people who might like this in front of the thing that I made and don't want to redo? To me, the problem is rarely analytics and more often people.
Corey: (32:31)
The story of the web.
Deane: (32:32)
Right back to the human condition.
Jon: (32:34)
If it wasn't for all the people, the internet would be perfect.
Corey: (32:38)
Jon, I got one last question for you. Of all of those shoes behind you, which is the best? What's your favorite?
Jon: (32:44)
This is a tough one. So I'm going to answer both of your questions. I would take these because I actually wore them in my wedding.
Corey: (32:50)
Okay. And what are they? They're a black-
Jon: (32:52)
This is a pair of-
Corey: (32:54)
This is podcast friendly.
Jon: (32:54)
... Triple black Air Jordan ones.
Corey: (32:56)
Okay.
Jon: (32:56)
Yeah. Thank you for reminding me that this is a podcast. No one can see me. Triple black Air Jordan 1s that I wore mostly because they're a nice matte black leather and they didn't look too ridiculous with the suit. If I had to pick something that was just kind of my favorite, one that I show off, I have a pair of the 2016 rerelease of the "Banned" Air Jordan 1s, which is arguably the sneaker that started sneakerhead culture. It's the Michael Jordan sneaker that, what they say is, he was banned from wearing in the NBA. That's technically not exactly the story, but that's like [inaudible 00:33:31] myth. And so I kind of love those for what they represent. It's actually very, very apropos for this specific thing. It's a situation where the actual data of what happened may not support the story, but it's enough of a connection that the broader story is deeply culturally relevant and motivates a ton of behavior.
Corey: (33:50)
Jon, that's all I got for you. Is there anything you want to promote? I forgot to ask you if there's anything you want to promote.
Jon: (33:56)
Not particularly. I mean, I have my newsletter that I don't write nearly often enough. I'm very proud of the work we do at Diamond Marketing Group. I think we have a really fantastic team and we make a significant impact for our clients, which is the thing I really respect. We do kind of focus on doing great work in all the different channels where great work is delivered. Some of that's really digitally focused, insights-driven work. Some of it's deep research work. Some of it's just really fun, entertaining advertising work. So I'm proud of that team and proud to be a part of it. And I'm also a big fan of Raptors Rookie of the Year Scottie Barnes, so I'd like to promote him in any situation I can.
Corey: (34:34)
Yeah. Awesome. All right. That's all we got. Thanks, Jon.
Jon: (34:39)
Thank you very much.
Corey: (34:47)
Okay. We're back. Thanks to Jon Crowley. He is great. We say it every single week, but Jon and I met at the Now What conference that we did at Blend for a few years. I don't know, it must be five years ago. Time's weird. But he is the nicest. He's Canadian, so of course the jokes about Canadian residents are all absolutely true. He is one of the nicest people I've ever met.
Deane: (35:10)
He is a very, very nice guy. And I wish that the people listening to this podcast could have seen him because behind him is just shelves and shelves of Air Jordans.
Corey: (35:22)
He said something that I thought was fascinating. And that was humans are pattern recognition machines.
Deane: (35:28)
Even when no such pattern exists. You invent patterns to recognize.
Corey: (35:33)
I mean, this probably explains a lot about the types of games we play and the types of things that we look for in life. But I never really thought of us as humans that way. We always think that we are these inherently intelligent beings, that we have enough rational thought to not be swayed by shiny objects. And in reality, we are always proving to ourselves and everyone that we're definitely distracted by shiny objects, and when we see patterns, we run after them.
Deane: (36:02)
I've said this in the context of politics all the time. Nobody wants to be right. They just want to feel right. I think we look at analytics the same way. We talked to Jon about this. I mean, I think people look at analytics as, this is a chance for me to prove my confirmation bias, rather than ... They say the first rule of being a trial lawyer is you never ask a question you don't already know the answer to. That's the thing here. And then, what I think is, we harp on this so much, and it came up again with Jon, but human psychology. I think we should all go out and get psychology degrees or counseling degrees. That would be the most valuable thing to have in this business, I feel, is to understand how human beings work. Human manipulation, is that terrible? Learning how to manipulate humans would be the most productive thing we could do.
Corey: (36:47)
Yeah. I mean, it is terrible, but all right, that's our show. We'll end on that one, right there. Thanks to our guest again, Jon Crowley, vice president of strategy at Diamond Marketing. The Web Project Guide is a product of Blend Interactive, a web strategy, design, and development shop that builds websites and guides teams through complicated web and content problems, from content strategy to design to CMS implementation and support. We are dedicated to making great things for the web, and this is one of those things. If you are a physical book person, we've got great news for you. This podcast was born of the spirit of our book, The Web Project Guide. I tell you with all honesty that this is a great book to have a physical copy of, and you can purchase it directly from us at order.webproject.guide. There's also a link to Amazon for any non-US friends; you can pick one up with much more reasonable shipping through there.
Deane: (37:41)
I just want to stress too, that this is a fine physical specimen of a book.
Corey: (37:45)
It really is.
Deane: (37:46)
Anyway, lovely book. Everybody go buy it.
Corey: (37:48)
This is episode eight of The Web Project Guide, which also corresponds with chapter eight of that book, Gather Insight From Your Metrics. You can read the full text of this chapter at webproject.guide/metrics, where you'll find the resources we used to write the chapter. Also, check out our other chapters, or stay tuned and subscribed for next month, when we talk about content strategy and what you need to know to plan for your web project. Deane loves attention, clearly, so you can give him attention by giving us a five-star review.
Deane: (38:17)
I will base my self esteem on it.
Corey: (38:18)
Perfect. We would really appreciate that. It helps out because algorithms are weird and we-
Deane: (38:24)
Think of your five star review as an analytic.
Corey: (38:27)
I like to think of a five-star review as a friendly welcome message, but also definitely a point of data that the algorithm's going to use to deliver its own special insights. And with that, we bid you adieu. Subscribe and check out next month when we talk content strategy. If you loved this episode, actually go back to the Eric Hall episode, where we talk about the stuff that happens before this, where we talk a little bit about research and developing questions. A lot of the same stuff that Jon talked about. But until then, go do amazing things.
Deane: (38:59)
Good luck.