23.05.2017

SMX London: What’s New and Cool at Google – Juan Felipe Rincon

This article was updated on: 07.02.2022

SMX London is now in its 10th year, welcoming SEO and PPC professionals from across the world to learn about what’s new in our industry.

Juan Felipe Rincon joined us for the first session, in an interview with Chris Sherman.

What’s exciting you at Google right now?

  • Platform improvements that we’ve announced to AMP, particularly AMP Bind
  • The amount of progress made in enabling really rich experiences through progressive web apps
  • A new series of tutorials launched last week to help people understand the basics of web security and learn how to build safe, secure websites. It will be called Learn Web Security, with more details announced in June

Google’s talked a lot about AMP recently. Can you tell us about that?

AMP is a way of defining web pages so that they load very quickly. It addresses a concern people across the ecosystem have raised: pages have got heavier over time as more features and media are added to them, which slows them down and damages the user experience.

AMP uses a subset of HTML, some additional optimisations through caching and cached delivery, and a smaller set of interactivity tools that a web developer can use.

It means web experiences that load almost instantly on a mobile device. The team has worked hard to create an open platform that anyone can participate in; AMP is just a few additional tags and some updated functionality, so broadly speaking it should be easy to implement.
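To make that concrete, here's a minimal sketch of what an AMP page can look like (the URLs and image are illustrative, and the required amp-boilerplate style block is abbreviated to a comment for space):

```html
<!doctype html>
<html amp>
<head>
  <meta charset="utf-8">
  <!-- The AMP runtime, loaded asynchronously so it never blocks rendering -->
  <script async src="https://cdn.ampproject.org/v0.js"></script>
  <!-- Points back to the regular version of the page -->
  <link rel="canonical" href="https://example.com/article.html">
  <meta name="viewport" content="width=device-width,minimum-scale=1">
  <style amp-boilerplate>/* required AMP boilerplate CSS omitted for brevity */</style>
</head>
<body>
  <h1>Article headline</h1>
  <!-- Images use amp-img instead of img, with explicit dimensions so the
       layout can be calculated before the image has loaded -->
  <amp-img src="hero.jpg" width="600" height="400" layout="responsive"></amp-img>
</body>
</html>
```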

Juan says people using AMP have seen significant increases in user engagement.

Tell us about AMP for advertisers

The actual implementation of the code itself for advertisers is the same as any other application. It’s the same skillset for SEO or PPC.

It speeds up the experience for everybody; it's not super prescriptive about how pages need to look or interact, but it addresses some of the really heavy elements that can make a page really slow to load.

Google launched 1,563 search changes last year based on 9,800 live traffic experiments and 136,000 search quality tests. How do we keep up?

You shouldn’t have to keep up with that. By and large, the main objectives we have are around providing better search results for users and better platforms for content producers.

Our goal is to know that each change improves things. There will be changes, but the main consideration for the industry is the users and how best you serve their needs. Google's job is to bring that great content in front of the right people.

That means you simply shouldn’t have to keep on top of changes. Google tells us about them for transparency, but the focus of our industry needs to be the user.

15% of all daily searches are brand new; how does Google cope with long tail or new queries?

Juan isn’t sure what the 15% covers, but for Google, that’s where a lot of the work goes into understanding the searcher and their needs.

We’re seeing changing patterns in how people interact with search, with a lot more natural language queries and questions being asked. We know users are asking more complex, detailed, thorough questions; they’re not just typing one or two keywords, they want more extensive information.

That’s information we’re feeding into our ranking teams and quality assessment teams, as well as the teams working on natural language query processing.

You mentioned our focus should be on the best user experience. Is there something we should be thinking about with these changing behaviours in terms of questions and answers?

People are already adjusting to this space. Juan spoke at a local search event recently and found that a lot of people in our industry recognise that searchers don’t think in keywords, so a lot of current practice is about how a search engine interprets statements and questions, and how people actually search.

This does impact how we answer those questions, how we structure our content, how we meet those needs. Think about addressing the user and how they engage with content; the same is now true of how they interact with search.

If you read the guides on assistant actions, there is an API to help you build engaging experiences through assistants.

People search differently through voice; they engage in full sentences and speak differently than they type. Ultimately, voice search is transcribing a query and issuing it, but if that query now carries more of the nuances of language, it can be easier to understand the user’s intent and give better responses.

The fact that devices can now process some queries on the device itself shows that the state of the art has really changed and we are much more voice-enabled. The actual technology that goes into natural language processing is crazy. Google will be demoing how these things work, and people will see how these interactions happen and how the system interprets and really engages with that information.

There’s a great deal of AI technology; it’s astonishing that we’ve got to that point and what’s possible now opens up whole new opportunities for the future.

The mobile first index has been a hot topic lately. How is it different to what’s been done in the past and what should SEOs be doing going forward?

It’s still something Google is testing and we need to ensure we have the best quality before we can launch it.

The advice for content creators is the same as always; don’t think about what search engines want, think about what the user wants.

People use their mobiles far more than their desktops, so if you don’t have a mobile strategy, you’re way behind. Google will be testing mobile results in coming weeks, but ultimately, they expect the same thing as usual; presenting the best quality, most relevant search result to the user.

It’s more a question of when the mobile index will kick in and how it will affect you if you’re not quite there yet. It’s coming soon, and if you’re still struggling with your mobile strategy, you’re already behind.
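If you’re starting from scratch, the most basic building block of a mobile-friendly page is a responsive viewport declaration; a minimal sketch:

```html
<!-- In the <head>: tells mobile browsers to render at the device's width
     instead of a zoomed-out desktop layout -->
<meta name="viewport" content="width=device-width, initial-scale=1">
```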

You talked about security as a key issue, with estimates that 50% of websites now use HTTPS. How important is it?

Very. The first issue is protection of user data. If someone is engaging with your website in any way and you’re not encrypting that traffic, that whole engagement is open. You’re having a conversation with the user through their path through your website, and that should be private – especially when they start sharing data with you.

Beyond that, enabling that layer of security is one of the easiest things to do to protect your website against attack. It’s like putting your communications in an envelope rather than on a postcard; it’s a basic security hygiene factor.

Chrome has also identified HTTPS as so important to users that pages which are not secure will be flagged as such. At some point later this year, pages that aren’t using HTTPS for basic data transfer will be flagged as not secure. There are no good technical or practical reasons not to do it (15 years ago there might have been performance concerns, or ad networks that didn’t support it, but those issues are gone today).
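For sites partway through a migration, one useful stopgap is the upgrade-insecure-requests Content Security Policy directive, which asks the browser to fetch a page’s http:// subresources over https:// instead; a minimal sketch:

```html
<!-- In the <head>: upgrades http:// resource requests on this page to https://,
     avoiding mixed-content warnings during an HTTPS migration -->
<meta http-equiv="Content-Security-Policy" content="upgrade-insecure-requests">
```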

Will HTTPS affect rankings?

Juan doesn’t know. Google announced a small ranking impact for HTTPS a couple of years ago, but he can’t say what weight it carries now.

Tell us about Search Console; what are your favourite tools?

Juan has talked with a lot of webmasters over time, and there is a lot to be said for making full use of Search Console.

One thing that Google has done well is to make reporting very granular within Search Console. The data highlighter tool has made structured data markup easier and you can see the impact clearly.

Juan likes that direct correlation between the practice Google is recommending (structured data) and the effects of that work. This also informs Google’s product strategy.

Another feature we don’t discuss much is the structured data markup helper. It’s there to let someone who doesn’t have that level of access to the HTML teach Google what their content is.
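If you do have access to the HTML, the most common way to add structured data by hand is a JSON-LD block using the schema.org vocabulary; a minimal sketch (all the values are illustrative):

```html
<script type="application/ld+json">
{
  "@context": "http://schema.org",
  "@type": "Article",
  "headline": "What's New and Cool at Google",
  "datePublished": "2017-05-23",
  "author": {
    "@type": "Person",
    "name": "Jane Doe"
  }
}
</script>
```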

Google sent over 9 million messages relating to web spam last year. Tell us more.

The most common web spam issue is websites that have been hacked.

If your website isn’t protected, people who are intent on generating spam can take advantage of good sites. The effect on a good website owner is detrimental and particularly harsh. Google has a responsibility to protect the user, which is why it tells users when a website might be compromised, and why it tells webmasters that their website might be compromised.

This is another reason website managers need to be on top of security. You don’t have to be an expert in SQL injection attacks to protect your website. Google sees it as their responsibility to help educate users.

The other issue we see is the creation of pages that are only there for spammy reasons – garbage, nonsense pages you can immediately recognise as junk. We’re seeing much less of this, but it still puzzles Google; when they teach people how to evaluate spam, it’s difficult to work out what the person who created such a page was even trying to achieve.

How much impact do the quality raters have on forming policies etc?

The human quality assessors have no impact on forming policies. Those guidelines are produced by a different team, who are there to ensure that search results in general are free of manipulation, free of spam and not harming users. They engage closely with the people improving spam filters and similar systems, to ensure what they learn is put to good use.

What’s the process when Google spots spam on a site?

Google will immediately flag the website as hacked and try to inform the website owner through whatever means possible, the fastest one being Search Console. That’s why people need to be active on Search Console.

Google informs users that there is a risk. They try to limit the impact on the website owner where possible, so they don’t necessarily remove the site from the SERPs and will retain any pages that aren’t affected, but their responsibility is still to protect the user.

If the website owner isn’t on Search Console, they will reach them through Analytics. If they’re not there, it’s more difficult.

Once the website owner is notified, Google recommends the following steps:

  1. Quarantine the site; prevent any further intrusion by either taking it offline or stopping traffic into it. One of the things that happens when a site is compromised is that the person who hacked it will work against you, undoing whatever you fix; the best thing to do is quarantine first.
  2. Roll back to a clean version of the code and fix the vulnerability (which could be an unpatched server, a weak password, an out-of-date plugin, etc.).
  3. Submit a reconsideration request through Search Console (Google will do this automatically if possible, but it’s good practice to let them know via Search Console anyway) – at this point, Google can let you know if there’s anything else to be fixed, and will try to be as helpful as possible.

What’s the difference between an algorithmic penalty and a manual action?

Penalty isn’t a word Juan likes to use. What they talk about are adjustments to the rankings in how they assess quality or adherence to content guidelines. When things move in the SERPs, that’s just the algorithm working, so it’s not a penalty.

When we find that something is getting through that algorithm and isn’t consistent with our guidelines, we take a manual action on it to do what the algorithm would have done to the page had it caught it.

Google always lets the webmaster know about this. Because a manual action is taken by a human rather than by rules, the webmaster is notified and given the chance to make the recommended correction.

We’re not in the business of telling people off, we are working for better search results.

Let’s look forward now; it seems Google is moving toward an AI first approach. What’s the implication of that for users and what should we do?

AI’s application to search comes back to how important it is for Google to fulfil its mission of organising the world’s information and making it accessible and useful: doing that means being able to understand information in the best, most relevant, most context-driven way.

This means Google has to focus on this learning problem. If you wanted to write a piece of code to help a machine understand what’s written on a page the way a human would, that would be a huge task – and Google is doing it across billions of pages.

As marketers, continue to focus on the marketing side of things. Marketing is about making a connection with a person, so you need to drive awareness of your offering, or a feeling for the experience people will have if they take it up.

That means marketers need to think about the user and what they’re trying to achieve. It’s very similar to what Google is doing, where they are trying to understand users through AI. Marketers should focus on the user, then the technical elements of how to help a search engine understand what you’re doing is much less daunting, because you’re simply doing a few things to make that technically accessible to a search engine that is also investing in its own understanding.

Do you see there being a time when it’s all AI-based, with no human-driven algorithms?

Not for a long time. The data we need to teach our AIs is heavily reliant still on human input. It’s difficult to say when that will change.

How much, if at all, does Google use social signals and social data?

Google’s answer hasn’t changed on that. Social is “very noisy” and it’s not particularly useful because of that.

Does AMP implementation affect mobile rankings?

AMP creates a better experience for users, and Google prefers better experiences. AMP is mobile friendly and gets the benefit of that in the mobile SERPs.
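For context, the AMP and canonical versions of a page point at each other through link tags, which is how the AMP version gets discovered and served in mobile results; a minimal sketch with illustrative URLs:

```html
<!-- On the regular page: advertise the AMP version -->
<link rel="amphtml" href="https://example.com/article.amp.html">

<!-- On the AMP page: point back to the regular version -->
<link rel="canonical" href="https://example.com/article.html">
```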

How important are links to Google these days?

Links are still super important to Google. Think about a user: if they see that a website they trust is linking to another site, they can assume they should trust that one too.

That connection, where a person makes a link to another website, is really valuable. That model has worked well in academia for a long time, and it has worked well for rankings for many years. It will continue to do so.

It’s not about the link, it’s about what the link means. It’s that the link says the resource it links to is a relevant, useful resource. You need to make that distinction; once you know that links are worth having, you then need to layer in the information about which links are most relevant.

Google still considers links as very important.

Organically generated links (those put in by people, by someone making a direct endorsement of a brand or source) mean a lot more. Building links for links’ sake is missing the point.

Say we come back here in 5 years; what will we be talking about then?

Juan has been in the tech industry for 20 years. Back in 2002 or 2003, he was talking about how to build applications for flip phones that ran quickly and made the most of the processing capability of the device.

Yes, it was different screens with buttons rather than swipes, but we were talking about how to create great experiences for users within that technology.

That was 15 or 16 years ago, and those conversations haven’t changed, and won’t. We’re still talking about how to create the best experiences, and the technology keeps progressing.

We are still going to be figuring out how to communicate our messages, and our focus will be on how to use the day’s technologies to achieve that goal. We’ll still be talking about stories, users, differentiation, etc. The technology will change but the topics will be the same.