Category: Blog


  • User Research Is Storytelling

    User Research Is Storytelling

    Ever since I was a boy, I’ve been fascinated with movies. I loved the characters and the excitement—but most of all the stories. I wanted to be an actor. And I believed that I’d get to do the things that Indiana Jones did and go on exciting adventures. I even dreamed up ideas for movies that my friends and I could make and star in. But they never went any further. I did, however, end up working in user experience (UX). Now, I realize that there’s an element of theater to UX—I hadn’t really considered it before, but user research is storytelling. And to get the most out of user research, you need to tell a good story where you bring stakeholders—the product team and decision makers—along and get them interested in learning more.

    Think of your favorite movie. More than likely it follows a three-act structure that’s commonly seen in storytelling: the setup, the conflict, and the resolution. The first act shows what exists today, and it helps you get to know the characters and the challenges and problems that they face. Act two introduces the conflict, where the action is. Here, problems grow or get worse. And the third and final act is the resolution. This is where the issues are resolved and the characters learn and change. I believe that this structure is also a great way to think about user research, and I think that it can be especially helpful in explaining user research to others.

    Use storytelling as a structure to do research

    It’s sad to say, but many have come to see research as being expendable. If budgets or timelines are tight, research tends to be one of the first things to go. Instead of investing in research, some product managers rely on designers or—worse—their own opinion to make the “right” choices for users based on their experience or accepted best practices. That may get teams some of the way, but that approach can so easily miss out on solving users’ real problems. To remain user-centered, this is something we should avoid. User research elevates design. It keeps it on track, pointing to problems and opportunities. Being aware of the issues with your product and reacting to them can help you stay ahead of your competitors.

    In the three-act structure, each act corresponds to a part of the process, and each part is critical to telling the whole story. Let’s look at the different acts and how they align with user research.

    Act one: setup

    The setup is all about understanding the background, and that’s where foundational research comes in. Foundational research (also called generative, discovery, or initial research) helps you understand users and identify their problems. You’re learning about what exists today, the challenges users have, and how the challenges affect them—just like in the movies. To do foundational research, you can conduct contextual inquiries or diary studies (or both!), which can help you start to identify problems as well as opportunities. It doesn’t need to be a huge investment in time or money.

    Erika Hall writes about minimum viable ethnography, which can be as simple as spending 15 minutes with a user and asking them one thing: “‘Walk me through your day yesterday.’ That’s it. Present that one request. Shut up and listen to them for 15 minutes. Do your damndest to keep yourself and your interests out of it. Bam, you’re doing ethnography.” According to Hall, “[This] will probably prove quite illuminating. In the highly unlikely case that you didn’t learn anything new or useful, carry on with enhanced confidence in your direction.”

    This makes total sense to me. And I love that this makes user research so accessible. You don’t need to prepare a lot of documentation; you can just recruit participants and do it! This can yield a wealth of information about your users, and it’ll help you better understand them and what’s going on in their lives. That’s really what act one is all about: understanding where users are coming from. 

    Jared Spool talks about the importance of foundational research and how it should form the bulk of your research. If you can draw from any additional user data that you can get your hands on, such as surveys or analytics, that can supplement what you’ve heard in the foundational studies or even point to areas that need further investigation. Together, all this data paints a clearer picture of the state of things and all its shortcomings. And that’s the beginning of a compelling story. It’s the point in the plot where you realize that the main characters—or the users in this case—are facing challenges that they need to overcome. Like in the movies, this is where you start to build empathy for the characters and root for them to succeed. And hopefully stakeholders are now doing the same. Their sympathy may be with their business, which could be losing money because users can’t complete certain tasks. Or maybe they do empathize with users’ struggles. Either way, act one is your initial hook to get the stakeholders interested and invested.

    Once stakeholders begin to understand the value of foundational research, that can open doors to more opportunities that involve users in the decision-making process. And that can guide product teams toward being more user-centered. This benefits everyone—users, the product, and stakeholders. It’s like winning an Oscar in movie terms—it often leads to your product being well received and successful. And this can be an incentive for stakeholders to repeat this process with other products. Storytelling is the key to this process, and knowing how to tell a good story is the only way to get stakeholders to really care about doing more research. 

    This brings us to act two, where you iteratively evaluate a design or concept to see whether it addresses the issues.

    Act two: conflict

    Act two is all about digging deeper into the problems that you identified in act one. This usually involves directional research, such as usability tests, where you assess a potential solution (such as a design) to see whether it addresses the issues that you found. The issues could include unmet needs or problems with a flow or process that’s tripping users up. Like act two in a movie, more issues will crop up along the way. It’s here that you learn more about the characters as they grow and develop through this act. 

    Usability tests should typically include around five participants according to Jakob Nielsen, who found that that number of users can usually identify most of the problems: “As you add more and more users, you learn less and less because you will keep seeing the same things again and again… After the fifth user, you are wasting your time by observing the same findings repeatedly but not learning much new.” 

    There are parallels with storytelling here too; if you try to tell a story with too many characters, the plot may get lost. Having fewer participants means that each user’s struggles will be more memorable and easier to relay to other stakeholders when talking about the research. This can help convey the issues that need to be addressed while also highlighting the value of doing the research in the first place.

    Researchers have run usability tests in person for decades, but you can also conduct usability tests remotely using tools like Microsoft Teams, Zoom, or other teleconferencing software. This approach has become increasingly popular since the beginning of the pandemic, and it works well. You can think of in-person usability tests like going to a play and remote sessions as more like watching a movie. There are advantages and disadvantages to each. In-person usability research is a much richer experience. Stakeholders can experience the sessions with other stakeholders. You also get real-time reactions—including surprise, agreement, disagreement, and discussions about what they’re seeing. Much like going to a play, where audiences get to take in the stage, the costumes, the lighting, and the actors’ interactions, in-person research lets you see users up close, including their body language, how they interact with the moderator, and how the scene is set up.

    If in-person usability testing is like watching a play—staged and controlled—then conducting usability testing in the field is like immersive theater, where any two sessions might be very different from one another. You can take usability testing into the field by creating a replica of the space where users interact with the product and conducting your research there. Or you can go out to meet users at their location. With either option, you get to see how things work in context: things come up that wouldn’t have in a lab environment, and the conversation can shift in entirely different directions. As researchers, you have less control over how these sessions go, but that can sometimes help you understand users even better. Meeting users where they are can provide clues to the external forces that could be affecting how they use your product. In-person usability tests provide another level of detail that’s often missing from remote usability tests.

    That’s not to say that the “movies”—remote sessions—aren’t a good option. Remote sessions can reach a wider audience. They allow a lot more stakeholders to be involved in the research and to see what’s going on. And they open the doors to a much wider geographical pool of users. But with any remote session there is the potential of time wasted if participants can’t log in or get their microphone working. 

    The benefit of usability testing, whether remote or in person, is that you get to see real users interact with the designs in real time, and you can ask them questions to understand their thought processes and grasp of the solution. This can help you not only identify problems but also glean why they’re problems in the first place. Furthermore, you can test hypotheses and gauge whether your thinking is correct. By the end of the sessions, you’ll have a much clearer picture of how usable the designs are and whether they work for their intended purposes. Act two is the heart of the story—where the excitement is—but there can be surprises too. This is equally true of usability tests. Often, participants will say unexpected things, which change the way that you look at things—and these twists in the story can move things in new directions. 

    Unfortunately, user research is sometimes seen as expendable. And too often usability testing is the only research process that some stakeholders think that they ever need. In fact, if the designs that you’re evaluating in the usability test aren’t grounded in a solid understanding of your users (foundational research), there’s not much to be gained by doing usability testing in the first place. That’s because you’re narrowing the focus of what you’re getting feedback on, without understanding the users’ needs. As a result, there’s no way of knowing whether the designs might solve a problem that users have. It’s only feedback on a particular design in the context of a usability test.  

    On the other hand, if you only do foundational research, while you might have set out to solve the right problem, you won’t know whether the thing that you’re building will actually solve that. This illustrates the importance of doing both foundational and directional research. 

    In act two, stakeholders will—hopefully—get to watch the story unfold in the user sessions, which creates the conflict and tension in the current design by surfacing their highs and lows. And in turn, this can help motivate stakeholders to address the issues that come up.

    Act three: resolution

    While the first two acts are about understanding the background and the tensions that can propel stakeholders into action, the third part is about resolving the problems from the first two acts. While it’s important to have an audience for the first two acts, it’s crucial that they stick around for the final act. That means the whole product team, including developers, UX practitioners, business analysts, delivery managers, product managers, and any other stakeholders that have a say in the next steps. It allows the whole team to hear users’ feedback together, ask questions, and discuss what’s possible within the project’s constraints. And it lets the UX research and design teams clarify, suggest alternatives, or give more context behind their decisions. So you can get everyone on the same page and get agreement on the way forward.

    This act is mostly told in voiceover with some audience participation. The researcher is the narrator, who paints a picture of the issues and what the future of the product could look like given the things that the team has learned. They give the stakeholders their recommendations and their guidance on creating this vision.

    Nancy Duarte in the Harvard Business Review offers an approach to structuring presentations that follow a persuasive story. “The most effective presenters use the same techniques as great storytellers: By reminding people of the status quo and then revealing the path to a better way, they set up a conflict that needs to be resolved,” writes Duarte. “That tension helps them persuade the audience to adopt a new mindset or behave differently.”

    This type of structure aligns well with research results, particularly results from usability tests. It provides evidence for “what is”—the problems that you’ve identified—and “what could be”—your recommendations on how to address them.

    You can reinforce your recommendations with examples of things that competitors are doing that could address these issues or with examples where competitors are gaining an edge. Or they can be visual, like quick mockups of how a new design could look that solves a problem. These can help generate conversation and momentum. And this continues until the end of the session when you’ve wrapped everything up in the conclusion by summarizing the main issues and suggesting a way forward. This is the part where you reiterate the main themes or problems and what they mean for the product—the denouement of the story. This stage gives stakeholders the next steps and hopefully the momentum to take those steps!

    While we are nearly at the end of this story, let’s reflect on the idea that user research is storytelling. All the elements of a good story are there in the three-act structure of user research: 

    • Act one: You meet the protagonists (the users) and the antagonists (the problems affecting users). This is the beginning of the plot. In act one, researchers might use methods including contextual inquiry, ethnography, diary studies, surveys, and analytics. The output of these methods can include personas, empathy maps, user journeys, and analytics dashboards.
    • Act two: Next, there’s character development. There’s conflict and tension as the protagonists encounter problems and challenges, which they must overcome. In act two, researchers might use methods including usability testing, competitive benchmarking, and heuristics evaluation. The output of these can include usability findings reports, UX strategy documents, usability guidelines, and best practices.
    • Act three: The protagonists triumph and you see what a better future looks like. In act three, researchers may use methods including presentation decks, storytelling, and digital media. The output of these can be: presentation decks, video clips, audio clips, and pictures. 

    The researcher has multiple roles: they’re the storyteller, the director, and the producer. The participants have a small role, but they are significant characters (in the research). And the stakeholders are the audience. But the most important thing is to get the story right and to use storytelling to tell users’ stories through research. By the end, the stakeholders should walk away with a purpose and an eagerness to resolve the product’s ills. 

    So the next time that you’re planning research with clients or you’re speaking to stakeholders about research that you’ve done, think about how you can weave in some storytelling. Ultimately, user research is a win-win for everyone, and you just need to get stakeholders interested in how the story ends.

  • To Ignite a Personalization Practice, Run this Prepersonalization Workshop

    To Ignite a Personalization Practice, Run this Prepersonalization Workshop

    Picture this: you’ve joined a team at your business that’s designing new product features with a focus on automation or AI. Or perhaps your business has just started using a personalization engine. Either way, you’re designing with data. Now what? When it comes to designing for personalization, there are many cautionary tales, no overnight successes, and few guides for the perplexed.

    The personalization space sits between the dream of getting it right and the worry of it going wrong (think of the “persofails” we’ve all encountered, like a retailer’s repeated pleas to buy more toilet seats after you’ve already bought one). It’s an especially confusing place to be a practitioner without a compass, a map, or a strategy.

    There are no Lonely Planet guides or tour guides for those of you who want to personalize, because effective personalization depends so heavily on each organization’s talent, technology, and market position.

    But you can make sure that your team has packed its bags sensibly.

    There’s a DIY way to improve your chances of success (and, at the very least, to temper your boss’s irrational exuberance): before the trip, your team needs to plan properly.

    We refer to it as prepersonalization.

    Behind the music

    Consider Spotify’s DJ feature, which debuted this year.

    We’re used to seeing the polished final result of a personalization feature. But before the year-end award, the making-of backstory, or the behind-the-scenes success story, that feature had to be conceived, budgeted, and prioritized. Before any personalization feature goes live in your product or service, it sits in a backlog of worthy ideas for expressing customer experiences more dynamically.

    How do you decide where to place your personalization bets? How do you design dynamic interactions that don’t trip up users or, worse, breed mistrust? We’ve found that many budgeted programs need one or more workshops to bring together key stakeholders and internal customers of the technology in order to justify their continuing investment. Make it count.

    We’ve witnessed this evolution up close with our clients, from big tech to burgeoning companies. In our experience working on personalization efforts small and large, a program’s track record—and its capacity to weather tough questions, work steadily toward shared answers, and manage its design and engineering efforts—turns on how well these prepersonalization activities play out.

    Effective workshops consistently separate successful programs from unsuccessful ones, saving many hours of time, resources, and goodwill in the process.

    A personalization practice involves a sustained effort of experimentation and feature development. It isn’t a switch you flip; your technical backlog is best managed as a queue that typically evolves through three stages:

    1. customer experience optimization (CXO, also known as A/B testing or experimentation)
    2. always-on automations (whether rules-based or machine-generated)
    3. mature features or standalone product development (such as Spotify’s DJ experience)

    This is why we created our personalization framework and why we’re field-testing an accompanying deck of cards: we believe that there’s a base grammar, a set of “nouns and verbs,” that your organization can use to design experiences that are customized, personalized, or automated. You don’t have to use our cards, but we strongly recommend creating something similar, whether digital or physical.

    Set your kitchen timer

    How long does it take to cook up a prepersonalization workshop? The assessment activities that we suggest can stretch over a number of weeks (and frequently do). For the core workshop itself, we recommend aiming for two to three days. What follows is a summary of our broad approach, along with details on the essential first-day activities.

    The full arc of the wider workshop is threefold:

    1. Kickstart: This sets the terms of engagement as you focus on the opportunity, the readiness and drive of your team, and your leadership.
    2. Plan your work: This is the heart of the card-based workshop activities, where you specify a plan of attack and the scope of work.
    3. Work your plan: This stage consists of enabling team members to individually pitch their own pilots, each including a proof-of-concept project, a business case, and an operating model.

    Give yourself at least a day, split into two large time blocks, to power through a concentrated version of those first two phases.

    Kickstart: Whet your appetite

    We call the first activity the “landscape of connected experiences.” It surveys the possibilities for personalization in your organization. A connected experience, in our parlance, is any UX requiring the orchestration of multiple systems of record on the backend. That could be a marketing-automation platform combined with a content-management system, or a digital-asset manager combined with a customer-data platform.

    Spark a conversation by sharing consumer and business-to-business examples of connected experiences that you admire, find familiar, or even dislike. These should cover a representative range of personalization patterns, including automated app-based interactions (such as onboarding sequences or wizards), notifications, and recommenders. We’ve cataloged these patterns in our cards; here’s a list of 142 different interactions to jog your thinking.

    It’s all about setting the tone. What are the possible paths for the practice in your organization? For a broader perspective, here’s a long-form primer and a strategic framework.

    Assess each example that you discuss for its complexity and the level of effort that you estimate it would take your team to deliver that feature (or something similar). In our cards, we break down connected experiences into five levels: functions, features, experiences, complete products, and portfolios. Size your own builds accordingly. This will help draw attention to the benefits of ongoing investment, as well as the gap between what you currently offer and what you intend to offer in the future.

    Next, have your team plot each idea on the following 2×2 grid, which lays out the four enduring arguments for a personalized experience. This is crucial because it highlights how personalization can affect your own ways of working as well as your external customers. It’s also a reminder (which is why we used the word arguments earlier) of the broader effort beyond these tactical interventions.

    Each team member should decide where they would place your company’s emphasis for your product or service. Naturally, you can’t prioritize them all. The goal here is to show how the perceived benefits of the effort can vary from one department to the next. Documenting your desired outcomes reveals how well the team aligns internally across representatives from different departments or functional areas.

    The third and final Kickstart activity is about sizing up the personalization gap. Is your customer journey well documented? Is securing data and privacy a major challenge? Do you have content-metadata needs to address? (We’re fairly certain that you do; it’s just a matter of acknowledging the magnitude of the need and planning for it.) In our cards, we’ve noted a number of program risks, including common team dispositions. For instance, our Detractor card lists six intractable behaviors that block progress.

    Collaborating effectively and managing expectations are critical to your success. Consider the obstacles that could stall your progress, and press participants to name specific steps to overcome or mitigate those barriers in your organization. Research shows that personalization initiatives face a number of common obstacles.

    At this point, you’ve discussed sample interactions, emphasized a key area of benefit, and flagged key gaps. Good: you’re all set to go on.

    Hit that test kitchen

    What will you need next to bring your personalized recipes to life? Personalization engines (robust software suites for automating and expressing dynamic content) can intimidate newcomers. Their broad and potent capabilities give you a variety of options for how your organization can operate. This raises the question: where do you begin when you’re configuring a connected experience?

    The key here is to avoid treating the installed software like some dream kitchen from a fantasy remodeling project (as one of our client executives humorously put it). These software engines are more like test kitchens, where your team can begin devising, tasting, and refining the snacks and meals that will become part of your personalization program’s ever-evolving menu.

    Over the course of the workshop, you’ll create the final menu: a prioritized backlog. And creating “dishes” is how individual team stakeholders will construct personalized interactions that serve their needs or the needs of others.

    Dishes are made from recipes, and recipes are made from ingredients.

    Verify your ingredients

    Like a good product manager, you’ll make sure that you have everything ready to cook up your desired interaction (or figure out what needs to be added to your pantry), and that you validate it with the right stakeholders present. These ingredients include the audience that you’re targeting, the content and design elements, the context for the interaction, and your measure of how it’ll come together.

    This is not just about identifying needs. Documenting your personalizations as a series of if-then statements lets the team:

    1. compare notes using a common method for developing features, similar to how artists paint from the same color palette,
    2. specify a consistent set of interactions that users find uniform and familiar,
    3. and tie each interaction to key performance indicators and metrics.

    This helps you streamline your design and engineering efforts while delivering a shared palette of core motifs for your personalized or automated experiences.
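    As an illustration of that if-then framing, here’s a minimal sketch in Python. The `Recipe` class, its field names, and the segment and event values are our own hypothetical examples, not part of any particular personalization engine’s API:

```python
from dataclasses import dataclass

@dataclass
class Recipe:
    """One personalization, documented as a who-what-when-why record."""
    who: str   # the audience segment being targeted
    when: str  # the triggering context or event
    what: str  # the content or design element to deliver
    why: str   # the business or user benefit being pursued

def applies(recipe: Recipe, segment: str, event: str) -> bool:
    # IF the visitor matches the audience AND the triggering context...
    return recipe.who == segment and recipe.when == event

# A hypothetical recipe for a subscription reading app:
nurture = Recipe(
    who="unknown_visitor",
    when="viewed_title",
    what="related_titles_banner",
    why="faster discovery of a next read",
)

print(applies(nurture, "unknown_visitor", "viewed_title"))  # True
print(applies(nurture, "subscriber", "viewed_title"))       # False
```

    The point isn’t the code itself but the discipline: every personalization is reduced to the same four fields, which makes recipes easy to compare and to review with stakeholders.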

    Create your recipe

    What ingredients are important to you? Consider a who-what-when-why construct:

    • Who are your key audience segments or groups?
    • What content and design elements will you give them, and under what circumstances?
    • And for which business and user benefits?

    We created these cards and card categories five years ago, and we regularly play-test their fit with conference audiences and clients. Fresh possibilities still emerge, but they all follow the same underlying who-what-when-why logic.

    In the cards in the accompanying photo, you can follow along, right to left, with three examples drawn from subscription-based reading apps.

    1. Nurture personalization: When a guest or an unknown visitor interacts with a product title, a banner or alert bar appears that makes it easier for them to discover a related title they may want to read, saving them time.
    2. Welcome automation: When a newly registered user becomes a subscriber, an email is sent that highlights the breadth of the content catalog.
    3. Winback automation: Before their subscription lapses, or after a recent failed renewal, a user is sent an email with a promotional offer to remind them to renew or persuade them to reconsider.
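    As a sketch, the three examples above could be expressed as trigger-to-action rules in a rules-based engine. The segment names, event names, and action labels below are illustrative assumptions, not drawn from any real product:

```python
from typing import Optional

# Hypothetical trigger -> action rules for the three reading-app examples.
# Segments, events, and actions are illustrative only.
RULES = {
    ("unknown_visitor", "viewed_title"): "show_related_titles_banner",  # nurture
    ("new_subscriber", "registered"): "send_catalog_welcome_email",     # welcome
    ("subscriber", "renewal_failed"): "send_promo_winback_email",       # winback
}

def action_for(segment: str, event: str) -> Optional[str]:
    """Return the automation to trigger for this user event, if any."""
    return RULES.get((segment, event))

print(action_for("new_subscriber", "registered"))  # send_catalog_welcome_email
print(action_for("subscriber", "viewed_title"))    # None
```

    Keeping the rules in one table like this makes it easy for a team to review the whole menu of automations at a glance and spot gaps or conflicts.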

    We’ve also found that this process sometimes comes together more effectively by cocreating the recipes themselves, so a good preworkshop activity might be to think about what these cards could be for your organization. Start with a set of blank cards, label and group them throughout the design process, and eventually distill them into a refined subset of highly useful candidate cards.

    The workshop’s later stages, which shift the focus from cookbooks to customers, might seem more nuanced. Individual “cooks” will pitch their recipes to the team using a common jobs-to-be-done format so that measurability and results are baked in; from there, the resulting collection will be prioritized for finished design and delivery to production.

    Better kitchens require better architecture

    Simplifying a customer experience is a complicated effort for those on the inside delivering it. Beware anyone who suggests otherwise. That said, “complicated problems can be hard to solve, but they are addressable with rules and recipes.”

    What turns personalization into a punch line is overfitting: a team that isn’t designing with its best data. Like a sparse pantry, every organization has metadata debt to go along with its technical debt, and this creates a drag on personalization effectiveness. Your AI’s output quality is, in fact, shaped by your IA. Spotify’s poster-child prowess today was unfathomable before it acquired a seemingly modest metadata startup that now powers its underlying information architecture.

    If you can’t stand the heat…

    Personalization technology opens a doorway into a confounding sea of possible designs, and only a disciplined, highly collaborative approach will produce the focus and intention necessary for success. So banish the dream kitchen. Instead, head to the test kitchen to burn off the fantastical ideas before they consume the time that the doers in your organization actually have, to preserve job satisfaction and sanity, and to avoid unnecessary distractions. There are meals to serve and mouths to feed.

    This workshop framework gives you sound beginnings and a better chance of lasting success. Wiring up your information layer isn’t an overnight affair, but if the whole team cooks from the same cookbook and the same set of recipes, you’ll have solid ground to build on. We designed these activities to make your organization’s needs concrete and clear, long before the hazards pile up.

    There are real costs to buying this kind of technology and designing the product, but time spent sizing up and confronting your unique situation and digital skills is time well spent. Don’t squander it. The proof, as they say, is in the pudding.

  • The Wax and the Wane of the Web

    The Wax and the Wane of the Web

    Just when you think you have everything figured out, everything changes. Just as you start to get the hang of feedings, diapers, and regular naps, it’s time for solid foods, potty training, and sleeping through the night. Once those are sorted out, school and the end of naps are next. The cycle goes on and on.

    The same holds true for those of us working in design and development. Having worked on the web for about three decades at this point, I’ve seen the familiar wax and wane of ideas, techniques, and technologies. Just as we developers and designers settle into a routine, a brand-new concept or technology emerges to shake things up and completely alter our world.

    How we got here

    I built my first website in the mid-’90s. Design and development on the web back then was a free-for-all, with few established norms. For any layout aside from a single column, we used table elements, often with empty cells containing a single pixel spacer GIF to add empty space. We styled text with numerous font tags, nesting the tags every time we wanted to vary the font style. And we had only three or four typefaces to choose from: Arial, Courier, or Times New Roman. When Verdana and Georgia came out in 1996, we rejoiced because our options had nearly doubled. The only safe colors to choose from were the 216 “web safe” colors known to work across platforms. The few interactive elements (like contact forms, guest books, and counters) were mostly powered by CGI scripts (predominantly written in Perl at the time). Achieving any kind of unique look involved a pile of hacks all the way down. Interaction was often limited to specific pages in a site.
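For the curious, here’s a sketch of what that era’s markup looked like (the filenames and text are illustrative): a table for layout, a stretched one-pixel spacer GIF to hold a column open, and nested font tags for styling.

```html
<table width="600" cellpadding="0" cellspacing="0" border="0">
  <tr>
    <!-- a 1x1 transparent GIF stretched to create empty space -->
    <td width="150"><img src="spacer.gif" width="150" height="1"></td>
    <td>
      <font face="Arial" size="4" color="#336699">Welcome!
        <font size="2">You are visitor number 00042.</font>
      </font>
    </td>
  </tr>
</table>
```

Every visual decision lived in the markup itself, which is exactly why the shift to CSS described below was such a relief.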

The evolution of web standards

At the turn of the century, a new cycle started. Crufty code littered with table layouts and font tags waned, and a push for web standards waxed. Newer technologies like CSS got more widespread adoption by browser makers, developers, and designers. This shift toward standards didn’t happen accidentally or overnight. It took active engagement between the W3C and browser vendors and heavy evangelism from folks like the Web Standards Project to build standards. A List Apart and books like Designing with Web Standards by Jeffrey Zeldman played key roles in teaching developers and designers why standards are important, how to implement them, and how to sell them to their organizations. And approaches like progressive enhancement introduced the idea that content should be available for all browsers—with additional enhancements available for more advanced browsers. Meanwhile, sites like the CSS Zen Garden showcased just how powerful and versatile CSS can be when combined with a solid semantic HTML structure.

Server-side languages like PHP, Java, and .NET overtook Perl as the predominant back-end processors, and the cgi-bin was tossed in the trash bin. With these server-side improvements, the first era of web applications began with content-management systems (particularly in the blogging space, with tools like Blogger, Grey Matter, Movable Type, and WordPress). In the mid-2000s, AJAX opened doors for asynchronous interaction between the front end and the back end. Pages could now update their content without having to reload. A crop of JavaScript frameworks like Prototype, YUI, and jQuery arose to help developers build more reliable client-side interaction across browsers that had wildly varying levels of standards support. Techniques like image replacement let crafty designers and developers show fonts of their choosing. And technologies like Flash made it possible to add animations, games, and even more interactivity.
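A minimal sketch of the pattern that AJAX made possible: updating part of a page without reloading it. The `/api/comments` endpoint, the `renderComments` helper, and the data shape are hypothetical, shown here only to illustrate the asynchronous round trip.

```javascript
// Build an HTML fragment from an array of comment objects.
// (A tiny stand-in for the client-side templating that AJAX-era
// libraries like Prototype and jQuery helped popularize.)
function renderComments(comments) {
  return comments
    .map((c) => `<li><strong>${c.author}</strong>: ${c.text}</li>`)
    .join("\n");
}

// In a browser, you would fetch fresh data and inject it into the page
// without a reload (endpoint and element id are hypothetical):
//
//   fetch("/api/comments")
//     .then((res) => res.json())
//     .then((comments) => {
//       document.querySelector("#comments").innerHTML =
//         renderComments(comments);
//     });

console.log(renderComments([{ author: "Ada", text: "Nice post!" }]));
// → <li><strong>Ada</strong>: Nice post!</li>
```

The era’s XMLHttpRequest calls were clunkier than the modern `fetch` shown here, but the idea was the same: request data, then patch the page in place.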

These new technologies, standards, and techniques greatly reenergized the industry. Web design flourished as designers and developers explored more diverse styles and layouts. But we still relied heavily on hacks. Early CSS was a huge improvement over table-based layouts when it came to basic layout and text styling, but its limitations at the time meant that designers and developers still relied on images for complex shapes (such as rounded or angled corners) and on tiled backgrounds for the appearance of full-length columns (among other hacks). Complicated layouts required all manner of nested floats, absolute positioning, or both. Flash and image replacement for custom fonts were a great start toward varying the typefaces from the big five, but both hacks introduced accessibility and performance issues. And JavaScript libraries made it easy for anyone to add a dash of interaction to pages, although at the cost of doubling or even quadrupling the download size of simple websites.
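As one concrete example of those hacks, here’s a sketch (selectors and filenames are illustrative) of the once-ubiquitous image-replacement technique next to what the same goal looks like today:

```css
/* Then: hide the real headline text off-screen and show a
   pre-rendered image of it in the typeface we actually wanted. */
h1.logo {
  width: 300px;
  height: 80px;
  background: url(logo.png) no-repeat;
  text-indent: -9999px;
}

/* Now: the same effect with no images and no hack. */
h1.logo {
  font-family: "Custom Web Font", sans-serif;
  border-radius: 8px; /* rounded corners without corner GIFs */
}
```

The first version kept the text in the document for search engines and screen readers, but it broke down for users with images disabled and added an extra download per headline—exactly the accessibility and performance trade-offs described above.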

    The web as software platform

The balance of power between the front end and the back end continued to shift, leading to the current web-application era. Between expanded server-side programming languages (which grew to include Ruby, Python, Go, and others) and newer front-end tools like React, Vue, and Angular, we could build fully capable software on the web. Along with these tools came others, such as shared package repositories, build automation, and collaborative version control. What was once primarily an environment for linked documents became a realm of infinite possibilities.

Mobile devices also increased in their capabilities, and they gave us access to the internet in our pockets. Mobile apps and responsive design opened up opportunities for new interactions anywhere and at any time.

This combination of powerful mobile devices and powerful development tools fostered the growth of social media and other centralized tools for people to connect and consume. As connecting with others directly on Twitter, Facebook, and even Slack became easier and more common, the desire for hosted personal sites waned. Social media provided connection on a global scale, with both its positive and negative effects.

Want a much more extensive history of how we got here, with some other takes on ways that we can improve? Check out Jeremy Keith’s “Of Time and the Web.” Or peruse the “Web Design History Timeline” at the Web Design Museum. Neal Agarwal also offers a fun tour through “Internet Artifacts.”

    Where we are now

    It seems like we’ve reached yet another significant turning point in the last couple of years. As social-media platforms fracture and wane, there’s been a growing interest in owning our own content again. There are many different ways to create a website, from the tried-and-true classic of hosting plain HTML files to static site generators to content management systems of all varieties. The fracturing of social media also comes with a cost: we lose crucial infrastructure for discovery and connection. The IndieWeb‘s Webmentions, RSS, ActivityPub, and other tools can assist with this, but they’re still largely underdeveloped and difficult to use for the less geeky. We can build amazing personal websites and add to them regularly, but without discovery and connection, it can sometimes feel like we may as well be shouting into the void.

Browser support for CSS, JavaScript, and other standards like web components has steadily increased, especially with efforts like Interop. New technologies gain support across the board in a fraction of the time that they used to. I frequently learn about a new feature, check its browser support, and discover that its coverage already exceeds 80%. Nowadays, the barrier to using newer techniques often isn’t browser support but simply the limits of how quickly designers and developers can learn what’s available and how to adopt it.
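One pattern that makes adopting those newer features low-risk is progressive enhancement with `@supports` (the selectors here are illustrative): older browsers keep the baseline, and newer ones opt in automatically.

```css
/* Baseline layout that every browser understands */
.cards { display: block; }

/* Enhancement for browsers that support grid;
   browsers that don't simply ignore this block */
@supports (display: grid) {
  .cards {
    display: grid;
    grid-template-columns: repeat(auto-fill, minmax(16rem, 1fr));
    gap: 1rem;
  }
}
```

No framework or polyfill required—the feature turns itself on as support lands.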

We can now prototype almost any idea with just a few commands and a few lines of code. All the tools that we now have available make it easier than ever to start something new. But the upfront cost that these frameworks save in initial delivery eventually comes due, as maintaining and upgrading them becomes part of our technical debt.

If we rely on third-party frameworks, adopting new standards can sometimes take longer, since we may have to wait for those frameworks to adopt them. The same frameworks that once helped us adopt new techniques sooner can become obstacles instead. These frameworks often come with performance costs too, forcing users to wait for scripts to load before they can read or interact with pages. And when scripts fail (whether due to poor code, network issues, or other environmental factors), users are frequently left with blank or broken pages.

    Where do we go from here?

Today’s hacks help to shape tomorrow’s standards. And there’s nothing inherently wrong with embracing hacks, for now, to move the present forward. Problems arise only when we refuse to acknowledge that they’re hacks or when we cling to them after better options emerge. So what can we do to create the future we want for the web?

Build for the long haul. Optimize for performance, for accessibility, and for the user. Weigh the costs of those convenient tools. They may make your job a little easier today, but how do they affect everything else? What’s the cost to users? To future developers? To the adoption of standards? Sometimes the convenience is worth it. Sometimes it’s just a hack that you’ve gotten used to. And sometimes it’s holding you back from even better options.

Start with standards. Standards continue to evolve over time, but browsers have done a remarkably good job of continuing to support older ones. The same isn’t always true of third-party frameworks. Sites built with even the hackiest of HTML from the ’90s still work just fine today. The same can’t always be said of sites built on frameworks even just a few years later.

Design with care. Consider the effects of each choice, whether your craft is code, pixels, or processes. The convenience of many a modern tool comes at the cost of not always understanding the underlying decisions that led to its design and not always considering the impact that those decisions can have. Use the time saved by modern tools to think more carefully and make decisions with care rather than rushing to “move fast and break things.”

Always be learning. When you’re always learning, you’re also growing. Sometimes it may be hard to pinpoint what’s worth learning and what’s just today’s hack. Even if you were to focus solely on learning standards, you might end up studying something that won’t matter next year. (Remember XHTML?) But ongoing learning creates new connections in your brain, and what you learn from one thing can inform other experiments down the road.

Play, experiment, and be weird! This web that we’ve created is the ultimate experiment. It’s the single largest human endeavor in history, and yet each of us can create our own pocket within it. Be brave and try something new. Build a playground for ideas. Conduct strange experiments in your own mad-science lab. Start your own small corner of the web. There’s no better place to be weird, to take risks, and to express our creativity.

    Share and amplify. Share what you think has worked for you as you experiment, play, and learn. Write on your own website, post on whichever social media site you prefer, or shout it from a TikTok. Write something for A List Apart! But take the time to amplify others too: find new voices, learn from them, and share what they’ve taught you.

Go forth and make

As designers and developers for the web (and beyond), we’re responsible for building the future every day, whether that takes the shape of personal websites, social media tools used by billions, or anything in between. Let’s imbue our values into the things that we create. Make the thing that only you are uniquely qualified to make. Then share it, improve it, remake it, or create something new. Learn. Make. Share. Grow. Rinse and repeat. And just when you think you’ve mastered the web, everything will change again.

  • Opportunities for AI in Accessibility

    Opportunities for AI in Accessibility

Joe Dolson’s recent article on the intersection of AI and accessibility resonated with me, both in its skepticism of AI in general and in its look at how AI has specifically been used. I’m quite skeptical of AI myself, despite my role at Microsoft as an accessibility technology strategist who helped manage the AI for Accessibility grant program. As with any tool, AI can be used in very positive, inclusive, and accessible ways, and it can also be used in destructive, exclusionary, and harmful ones. And there are a ton of uses somewhere in the mediocre middle as well.

I’d like you to consider this a “yes… and” piece to complement Joe’s post. I’m not trying to contradict anything that he’s saying; rather, I want to provide some context on initiatives and opportunities where AI can make a difference for people with disabilities. To be clear, I’m not saying that there aren’t real challenges or pressing problems with AI that need to be addressed; there are, and we needed to address them, like, yesterday. But I want to take a moment to talk about what’s possible, in hopes that we’ll get there one day.

Alt text

Joe’s article spends a lot of time examining how computer-vision models can generate alt text. He raises a number of legitimate points about the current state of affairs. And while computer-vision models continue to improve in the quality and richness of the detail in their descriptions, their results aren’t great. As he rightly argues, the current state of image analysis is pretty poor, especially for certain image types, in large part because of the context-blind analysis in today’s AI systems (a result of having separate “foundation” models for text analysis and image analysis). Today’s models also aren’t trained to distinguish between images that are contextually relevant (and should probably have descriptions) and those that are purely decorative (and might not need a description). Still, I think there’s potential in this space.

As Joe points out, human-in-the-loop authoring of alt text should absolutely be a thing. And if AI can pop in to offer a starting point for alt text, even if that starting point is a prompt saying, What is this BS? That’s not right at all… Let me try to offer a starting point, I see that as a win.

If we can specifically train a model to analyze image usage in context, it could help us more quickly determine which images are likely to be decorative and which ones likely convey information. That would help reinforce which situations call for image descriptions, and it would improve authors’ efficiency in making their sites more accessible.

While complex images, like graphs and charts, are challenging to describe in any sort of succinct way (even for humans), the image example shared in the GPT-4 announcement points to an interesting opportunity as well. Let’s say you came across a chart whose description was simply the title of the chart and the kind of visualization it was: Pie chart comparing smartphone use to feature phone use in US households earning under $30,000 annually. (That would be a pretty bad alt text for a chart because it leaves many unanswered questions about the data, but let’s roll with it.) If your browser knew that this image was a pie chart (because an onboard model concluded this), imagine a world where users could ask questions like these about the graphic:

    • Are there more smartphone users than feature phone users?
    • How many more are there?
    • Is there a group of people that doesn’t fall into either of these buckets?
    • How many people is that?

Setting aside the challenges of large language model (LLM) hallucinations for a moment, the opportunity to interrogate images and data in this way could be revolutionary for people who are blind or have low vision, as well as for people with various forms of color blindness, cognitive disabilities, and so on. It could also be useful in educational contexts to help people who can see these charts, as is, to better understand the data in them.

What if you could ask your browser to simplify a complicated chart? What if you could ask it to isolate a single line in a line graph? What if you could ask it to transpose the colors of the different lines to work better for the form of color blindness you have? What if you could ask it to swap colors for patterns? Given the chat-based interfaces and the ability of modern AI tools to manipulate images, that seems like a real possibility.

Now imagine a purpose-built model that could extract the information from that chart and convert it to another format. Perhaps it could convert that pie chart (or, better yet, a series of pie charts) into more usable (and useful) formats, like spreadsheets. That would be incredible!
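As a hypothetical sketch of that last step: suppose such a purpose-built model had already extracted a pie chart’s underlying data into a simple structure (the shape, and the numbers in it, are made up for illustration). Converting it to CSV makes the same information usable in a spreadsheet or readable row by row with a screen reader.

```javascript
// Hypothetical output from a chart-extraction model; the values
// here are invented sample data, not real statistics.
const extractedChart = {
  title: "Smartphone vs. feature phone use, US households under $30,000/yr",
  slices: [
    { label: "Smartphone", value: 71 },
    { label: "Feature phone", value: 23 },
    { label: "Neither", value: 6 },
  ],
};

// Flatten the extracted slices into a two-column CSV.
function chartToCsv(chart) {
  const header = "label,value";
  const rows = chart.slices.map((s) => `${s.label},${s.value}`);
  return [header, ...rows].join("\n");
}

console.log(chartToCsv(extractedChart));
```

The hard problem, of course, is the extraction itself; once the data is structured, transformations like this one are trivial.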

    Matching algorithms

When Safiya Umoja Noble chose to call her book Algorithms of Oppression, she hit the nail on the head. Although her book focused on how search engines can foster racism, I believe it’s equally true that all computer models have the potential to amplify conflict, bias, and intolerance. Whether it’s Twitter always showing you the latest tweet from a bored billionaire, YouTube sending us into a Q-hole, or Instagram warping our ideas of what natural bodies look like, we know that poorly authored and maintained algorithms are incredibly harmful. A lot of this stems from a lack of diversity among the people who shape and build them. But when these platforms are built with inclusion at their core, there’s real potential in algorithm development.

Take Mentra, for example. Mentra is an employment network for neurodivergent people. It matches job seekers with potential employers using an algorithm that considers more than 75 data points. On the job-seeker side, it considers each candidate’s strengths, their necessary and preferred workplace accommodations, environmental sensitivities, and so on. On the employer side, it considers each work environment, the communication factors related to each job, and more. As an organization run by neurodivergent folks, Mentra decided to flip the script on traditional employment sites: its algorithm proposes qualified candidates to companies, who can then reach out to job seekers that they’re interested in, reducing the emotional and physical labor on the job-seeker side.

Getting more people with disabilities involved in the development of algorithms can reduce the likelihood that those algorithms will harm their communities. That’s why diverse teams are so crucial.

Imagine if a social media company’s recommendation engine were tuned to analyze who you’re following and to prioritize follow recommendations for people who talk about similar things but who differ in some key ways from your existing sphere of influence. For instance, if you follow a group of nondisabled white men who talk about AI, it could suggest that you follow people who are disabled or who aren’t white and who also talk about AI. If you followed its recommendations, you might develop a fuller, more nuanced understanding of what’s happening in the AI field. These same systems should also use their understanding of biases about particular communities, including, for instance, the disability community, to make sure that they aren’t recommending that any of their users follow accounts that perpetuate biases against (or, worse, spew hate toward) those groups.

    Other ways that AI can assist people with disabilities

    I’m sure I could go on and on about using AI to assist people with disabilities, but I’m going to make this last section into a bit of a lightning round. In no particular order:

  • Voice preservation. You may have heard of the voice-preservation offerings from Microsoft, Acapela, and others, or you may have seen the VALL-E announcement or Apple’s Personal Voice announcement from Global Accessibility Awareness Day. It’s possible to train an AI model to replicate your voice, which can be a tremendous boon for people who have ALS (Lou Gehrig’s disease), motor neuron disease, or other medical conditions that can lead to an inability to talk. This same technology can also be used to create audio deepfakes, so it’s something that we need to approach responsibly, but it has truly transformative potential.
    • Voice recognition. Researchers like those working on the Speech Accessibility Project are compensating people with disabilities for helping to collect recordings of atypical speech. As I type, the project is recruiting people with Parkinson’s and related conditions, and it plans to expand as the work develops. This research will result in more inclusive data sets that will let more people with disabilities use voice assistants, dictation software, and voice-response services, and control their computers and other devices more effectively using only their voices.
    • Text transformation. The current generation of LLMs is quite capable of adjusting existing text without introducing hallucinations. This is incredibly empowering for people with cognitive disabilities who may benefit from text summaries or simplified versions of text, or even text that’s been prepared for Bionic Reading.

    The importance of diverse teams and data

Our differences matter. The intersections of the identities that we live in shape our lived experiences. These lived experiences, with all their complexities (and joys and pain), are valuable inputs to the software, services, and societies that we shape. Our differences need to be represented in the data that we use to train new models, and the people who contribute that valuable information need to be compensated for sharing it. Inclusive data sets yield more robust models, and those models foster more equitable outcomes.

    Want a model that doesn’t demean or patronize or objectify people with disabilities? Make sure that the training data includes information about disabilities written by people with a range of disabilities.

Want a model that doesn’t produce ableist language? You may be able to use existing data sets to build a filter that can intercept and remediate ableist language before it reaches readers. That said, when it comes to sensitivity reading, AI models won’t be replacing human copy editors anytime soon.

Want a coding copilot that gives you accessible recommendations? Train it on code that you know to be accessible.


I have no illusions about the ways that AI can and will be used to harm people today, tomorrow, and beyond. But I believe that we can acknowledge that while also making thoughtful, careful, and intentional changes to our approaches to AI, changes that reduce harm over time with an emphasis on accessibility (and inclusion more generally). Today, tomorrow, and well into the future.


    Thanks to Kartik Sawhney for assisting me with writing this article, Ashley Bischoff for her invaluable editorial assistance, and of course Joe Dolson for the prompt.

  • I am a creative.

    I am a creative.

I am a creative. What I do is alchemy. It is a mystery. I do not so much do it as let it be done through me.

I am a creative. Not all creative people like this label. Not all see themselves this way. Some creative people see science in what they do. That is who they are, and I respect it. Maybe I even envy them a little. But my way of thinking and being is different.

Apologizing and qualifying in advance is a distraction. That’s what my brain does to sabotage me. I set it aside for now. I can apologize and qualify later. After I’ve said what I came to say. Which is hard enough.

Except when it is easy and flows like a river of wine.

Sometimes it does come that way. Sometimes what I need to make arrives in an instant. I’ve learned not to say so right away, because when you admit that sometimes the idea just comes, and it is the best idea, and you know it is the best idea, people think you don’t work hard enough.

Sometimes I just work until the idea strikes me. Sometimes it arrives right away, but I don’t tell anyone for three weeks. Sometimes I blurt out the idea the moment it comes, unable to stop myself. Like a kid who found a prize in a box of Cracker Jack. Sometimes I get away with it, and yes, that is the best idea. Sometimes others disagree. Usually they don’t, and I regret having spent my joy too soon.

Joy should be saved for the meeting where it can matter. Not the informal pre-meeting, or the two other gatherings that precede that meeting. Nobody knows why we hold these meetings. We keep saying we’ll get rid of them, but we keep finding new reasons to have them. Sometimes they’re even good. Sometimes they distract from the real work, though. The ratio of when meetings are valuable to when they’re a sad distraction varies depending on what you do and where you do it. And on who you are and how you go about it. But I digress. I am a creative. That is the subject.

Often, a lot of diligent and persistent work ends in something that’s no good at all. I have to accept that and move on to the next task.

Don’t ask me about the process. I am a creative.

I am a creative. I have no control over my dreams. And I have no control over my best ideas.

I can pound ahead, filling in the blanks with words or pictures, and sometimes that works. I can go for a walk, and sometimes that works. There is a Eureka that has nothing to do with bubbling gases and boiling flasks. It may come while I’m making dinner. I often wake with a sense of direction. Just as often, as I wake and rejoin the world, the idea that might have saved me vanishes in a senseless wind of oblivion. For creativity, I believe, comes from that other place. The one we visit in dreams and, perhaps, before and after death. But I’m no poet; that’s for the poets to ponder. I am a creative. Theologians might march great armies across that creative globe, claiming it as truth. But that is another digression. And a sad one. Possibly on a much bigger issue than whether or not I am a creative. But it’s still a departure from what I came here to say.

Often the work is relief. Also suffering. You know the cliché of “the tortured artist”? It’s true even when the artist (place that noun in quotes) is attempting to write a soft drink jingle, a joke for a worn-out sitcom, or a budget request.

Some people who hate the idea of being called creative may be closeted creatives, but that’s between them and their gods. No offense meant. Your truths are also true. But mine is mine.

Creatives recognize creatives.

Like cons recognize cons, like real rappers recognize real rappers, like queer folks recognize queer folks, creatives recognize creatives. We hold great creators in high esteem. We respect, follow, and all but deify the great ones. Which is, of course, terrible. We’ve been warned. We know better. We know that people are just people. Like us, they are clay: they bicker, they get depressed, they regret their biggest decisions, they get weak and hungry, they can be cruel, and they can be as ridiculous as we can. But. But. But they make this amazing thing. They birth something that could not exist before them or without them. They are parents of thought. And, since it’s just lying there, I suppose I’ll add: parents of invention. Ba dum bum! Okay, got that out of the way. Moving on.

Creatives deprecate themselves by comparing their own small achievements to those of the great ones. Great film, but I’m no Miyazaki. Greatness is that thing over there. That is brilliance straight from the mind of God. This unsatisfying little thing I made? It practically fell off the back of the turnip truck. And the turnips weren’t even fresh.

Creatives know that they are second-rate at best. Even the original Mozarts believe that.

I am a creative. I haven’t worked in advertising in 30 years, but my old creative directors still haunt my dreams. They are right to do so. I am lazy and complacent, and my brain goes blank when it matters most. There is no cure for creative anxiety.

I am a creative. Every project I take on comes with a deadline chase that makes Indiana Jones look like a pensioner dozing in a deck chair. The longer I’ve been a creative, the faster I can do the work, and the longer I obsess over my ideas and spin in circles before I can get the job done.

I can do the work ten times faster than people who aren’t creative, or who have been creative for only a short while. Except that I procrastinate twice as long as they do before doing that work ten times faster. I am so certain of my ability to do a great job when I put my mind to it. I am addicted to the adrenaline rush of procrastination. And I am terrified of the leap.

I am not an artist.

I am a creative. Not an artist. Though, as a boy, I dreamed of one day becoming one. Because we are not Michelangelos and Warhols, some of us deprecate our gifts and dread our achievements. That is narcissism, but at least we don’t practice politics.

I am a creative. Although I believe in reason and science, I make decisions from my senses and instincts, and I sit in the aftermath of both the triumphs and the disasters.

I am a creative. Every word I’ve written here may offend other creatives who see things differently. Ask two creatives a question and you’ll get three opinions. Our disagreement, our passion about it, and our commitment to our own truths are, to me, the proof that we are creatives, however we each think about it.

I am a creative. I mourn my lack of taste in nearly all the areas of human knowledge, about which I know next to nothing. And in the things that matter most to me, or, more precisely, to my obsessions, I put my taste above almost everything. Without our obsessions, we’d probably have to spend most of our time looking ourselves in the eye, which is something almost none of us can do for long. No, really. Really, no. Because so much in existence is unbearable if you really look at it.

I am a creative. I believe that, like a family, when I am gone, some of the best parts of me will live on in the mind of at least one other person.

Working saves me from worrying about my work.

I am a creative. I fear that my little contribution will vanish.

I am a creative. I’m too busy making the next thing to spend too much time worrying about it, even though practically nothing I create achieves the level of success I imagine for it.

I am a creative. I believe the deepest magic lives in the process. I believe it so strongly that I’ve made the foolish decision to publish this essay without rereading or editing it. I swear I don’t normally do this. But I did it this time because I was more afraid of losing what I was trying to say than of you seeing through my sad gestures toward the sublime.

    There. I believe I’ve said it.

  • 5 Ways Google Search Console Can Help Your SEO Strategy

    5 Ways Google Search Console Can Help Your SEO Strategy

Jarret Redding’s article, 5 Ways Google Search Console Can Help Your SEO Strategy, can be found at Duct Tape Marketing.


    The Duct Tape Marketing Podcast with John Jantsch

    In this episode of the Duct Tape Marketing Podcast, I’m going solo. I’m John Jantsch, founder of Duct Tape Marketing, and I dig into one of the most misunderstood and underutilized marketing tools: Google Search Console. While many marketers chase rankings and pricey tools, they frequently overlook this free, data-rich system that can significantly improve your search visibility, content strategy, and overall digital marketing performance.

    I break down five practical ways to use Google Search Console to improve your SEO metrics, better understand keyword performance, and build an organic traffic engine rooted in actual user intent. Whether you’re a seasoned SEO professional or a new small-business owner, these tips will help you rethink your approach to search engine optimization with an emphasis on visibility, trust, and conversion.

    Key Takeaways:

    • Discover User Intent Through Performance Reports
      Google Search Console lets you see the actual search queries bringing visitors to your site. By filtering for pages with high impressions but a low click-through rate, you can spot missed opportunities for content optimization and adjust your metadata or headlines to increase relevance.

    • Mine Long-Tail Keywords for High-Intent Visitors
      Even when individual queries have low volume, they frequently reveal specific search intent. Optimizing content around these terms (via blog posts, FAQs, or support pages) can drive higher-converting organic traffic.

    • Track Local and Branded Searches
      Monitoring searches for your brand name or your competitors’ (e.g., “Your Business + reviews”) uncovers how users validate businesses. Use this information to create branded content such as testimonials, review roundups, and Q&As.

    • Focus on Visibility, Not Just Rankings
      Individual rankings fluctuate, but your website’s visibility and trust signals, such as total impressions, average CTR, and top-converting pages, provide a more accurate picture of how effective your SEO is. Think in terms of presence, not just position.

    • Use GSC Data to Guide AI and Content Creation
      Feed your top queries into tools like ChatGPT to create content briefs and ideas that align with authentic customer searches. This helps you create content that answers real-world questions and builds credibility.
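The first two takeaways are mechanical enough to sketch in code. The snippet below is a minimal illustration, not a Search Console integration: the rows are made-up examples shaped like the columns of a GSC Performance export (query, impressions, CTR), and the thresholds are arbitrary starting points you would tune for your own site.

```python
# Minimal sketch: mine Google Search Console query data for
# (1) high-impression / low-CTR queries and (2) long-tail queries.
# The rows are invented examples; field names are an assumption
# modeled on a GSC Performance report export.

rows = [
    {"query": "crm", "impressions": 500, "ctr": 0.002},
    {"query": "best crm for small plumbing business", "impressions": 60, "ctr": 0.05},
    {"query": "duct tape marketing", "impressions": 200, "ctr": 0.20},
    {"query": "how to set up google search console", "impressions": 150, "ctr": 0.013},
]

def find_opportunities(rows, min_impressions=100, max_ctr=0.02):
    """Queries that show up often in search but rarely get clicked."""
    hits = [r for r in rows
            if r["impressions"] >= min_impressions and r["ctr"] <= max_ctr]
    return sorted(hits, key=lambda r: r["impressions"], reverse=True)

def find_long_tail(rows, min_words=4):
    """Detailed, intent-rich queries of four or more words."""
    return [r for r in rows if len(r["query"].split()) >= min_words]

print([r["query"] for r in find_opportunities(rows)])
print([r["query"] for r in find_long_tail(rows)])
```

The two filters answer different questions: the first surfaces pages whose titles and metadata need work (they rank but don’t get clicked), while the second surfaces the specific questions worth dedicated posts or FAQ entries.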

    Chapters:

    • 00:09 SEO Is Not Dead
    • 02:53 Opportunities for Intent and Content
    • 05:12 Local and Branded Visibility Signals
    • 05:53 Measuring Trust and Visibility
    • 07:05 Fixing Technical Problems
    • 08:06 AI Content Guidance

    John Jantsch (00:02.039)

    Hello and welcome to the newest episode of the Duct Tape Marketing Podcast. This is John Jantsch, and no guest today; I’m doing another solo show, kind of on a rant here about the new SEO. SEO is still alive; the old playbook is dead. SEO is still a very, very reliable channel. We just have to think about it completely differently. So I want to talk about a tool in this episode of this series about the new SEO, and I know that this is going to…

    For some people, this is going to be dry and boring, but trust me, I’ll first off try to make it not boring, and secondly, this is important. You need to pay attention to this, because a lot of the reporting you receive, or that agencies provide, doesn’t really tell you much, which is one of the challenges in SEO. It maybe tells you rankings or some movement in keywords, but it doesn’t really tell you what to do.

    And if you know how to mine it, how to actually mine for intent, and how to improve click-through (the things that are going to actually lead to conversion), Google Search Console is the tool that tells you what to do. And let’s face it, that’s the point of all of this, right? Some of you old-schoolers who’ve been listening to me for a long time will remember that the data that’s in Google Search Console used to be in Google Analytics.

    We used to be able to see what searches we were ranking for and how much traffic each of those searches was generating. Now that’s just a big nothing; they removed it all from there without ever telling you why. But it still lives in Google Search Console. That’s why I believe it’s one of the most underused and underappreciated tools. And here’s the beauty: it’s free. There are numerous tools built on top of this data that cost $199 a month.

    If you just figure out how to actually use it, it’s a tool that can give you even better data and can inform decisions going forward so that you can improve your conversion, improve your business. So I’m going to go over five different ways that you might be using Google Search Console. And don’t worry if you’re listening to this and you’re like, “My head hurts; I’m not interested in learning a tool.” This is something that we do as part of our

    John Jantsch (02:24.79)

    strategy-first engagements, as part of building a search visibility system rather than SEO. SVS, search visibility system, is what I’m calling it, because that’s what really matters. That system is something that we can build for you. So don’t worry: if you hear something today and think, “That’s brilliant, but I’m not going to do it myself,” just contact us at Duct Tape Marketing and we will help you make this work for you. So, number one: use

    Google Search Console to discover intent and content opportunities. You need to first of all understand what people are actually searching for when they visit your website, and then you can align your future content with those real queries. There are a number of sections in Google Search Console; I won’t be doing a tutorial, and I want to concentrate on the things I believe you can do today. One of those sections is called Performance. So if you go to Performance,

    then search results, then queries, you’re going to be able to filter your pages, filtering for pages that have high impressions but a low click-through rate. It’ll be very obvious to you which those are. It displays the number of impressions each page received over the selected period, but it also shows you the percentage of the people who saw that page who clicked on it. So it surfaces the content that’s showing up but isn’t converting.

    And often that means your title or your metadata, or maybe the content itself, showed up because Google interpreted the intent, but the user didn’t think it matched what they were searching for. In essence, it just gives you a roadmap to the things you should be improving: you’re already showing up for those searches, and now you know you can make them better. But how? You can mine for long-tail searches. One of the things I find to be true is that people today have become extremely good at searching for

    exactly what they want. AI is making this even more the case, because you can practically type a book into AI and it will produce results. But people are doing longer searches, these very detailed searches. They frequently, in fact almost always, express their intent in that search. And even though there may be very little volume for any one of them, one of the things that I have found is that sometimes when you look at pages that are getting traffic, they might be ranking for 30 or 40 different search terms for that

    John Jantsch (04:49.102)

    page; there just isn’t much volume in any one of them. Even if it’s 10 searches a month, the intent is so high that if you start to really optimize your page for those, with blog posts, FAQs, even Google Business Profile Q&As, you start to capture that in more places. And that’s high intent.

    Number two: using Google Search Console to track local and branded visibility signals. One of the things that you want to be paying attention to is whether there are searches for your name or your competitors’ names, you know, plus “reviews” or plus a type of service. What that reveals is that people are attempting to validate businesses. So you create branded FAQs around those searches,

    and testimonials, content that actually addresses those searches. So mine all of your reviews and, you know, put those on your website, getting those kinds of search terms in there. Okay. Number three: use Google Search Console to measure the right metrics, not rankings but trust and visibility. So, stop chasing rankings

    and think in terms of search visibility, click-through rates, and your top-converting pages. As an agency, I’m going to start reporting on clients’ overall search presence, because to me, rankings bounce around every time there’s some update to the algorithm, partly because of the way people try to manipulate them. What matters is whether your overall presence is expanding: the number of queries ranking in the top 10, and

    your average click-through across all branded and non-branded terms. It really demonstrates how well your content earns users’ trust in search engines, and how quickly your brand impressions grow when you add new pages or sections. Really, overall presence needs to be the new metric. Because of how all of that adds up, it may sound a little fuzzy, but it’s the growth of it; I believe the key is in getting that number to go in the right direction.

    John Jantsch (07:15.854)

    Number four: Google Search Console can assist you with technical problems, so indexing, crawlability issues, load time for your website. You should definitely check out the pages Google excludes, and ask why. In some cases, you’ve instructed it to exclude those, but I’m finding that in many cases it can’t access entire pages or sections of websites. You know, are there URLs that really

    shouldn’t be there? Are incorrect URLs a sign that you’ve updated your website and given pages new names? I mean, people do things like that all the time. Is the sitemap updated, and is mobile performance stable? That last one is huge. They never tell you what the ranking factors are, but I’ve seen it myself: every time a site slows down, because we haven’t been paying attention, or we haven’t updated something, or somebody’s uploaded a giant image that makes the homepage take forever to load, the

    rankings tank. There’s clearly a direct tie. Okay, the last one I want to discuss is

    number five: using Google Search Console to guide your AI content, or at least your AI content optimization. You can input data directly from Google Search Console into ChatGPT and then type a prompt like, “Based on these queries from Google Search Console” (give it a list of your top ten, right) “help create content briefs”

    that address these questions clearly and build trust for a small business offering whatever service you offer. That kind of thing will give you such a head start when it comes to producing the right content, not just more content. And again, I’ve only scratched the surface of what I believe are some of the key things you should be focusing on with this incredibly underused tool.
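As a rough sketch of that step, the prompt the episode describes can be assembled mechanically from a list of top queries. Everything here (the query list, the prompt wording, and the business description) is illustrative, not taken from the episode:

```python
# Minimal sketch: assemble a content-brief prompt from top Search
# Console queries. Queries and wording are illustrative examples only.

def build_brief_prompt(queries, business="a small plumbing business"):
    """Number the queries and wrap them in a brief-writing request."""
    numbered = "\n".join(f"{i}. {q}" for i, q in enumerate(queries, 1))
    return (
        "Based on these queries from Google Search Console:\n"
        + numbered
        + "\nHelp create content briefs that answer each question "
        + f"clearly and build trust for {business}."
    )

top_queries = [
    "emergency plumber cost",
    "how to fix a leaking water heater",
]
print(build_brief_prompt(top_queries))
```

Generating the prompt from your actual query export, rather than typing queries by hand, keeps the resulting briefs anchored to what real users are searching for.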

    John Jantsch (09:21.538)

    This is a component of what we refer to as a search visibility system, something we’ve developed as part of our client strategy work to help them develop that entire content roadmap, to help them focus on the right content, to really focus on intent. And if you take nothing else away from today: we need to be spending most of our content effort around intent, around searches,

    branded searches, the things people are looking for that clearly indicate they’re trying to validate or buy something. Spend our time and effort there, on conversions. Don’t worry about traffic; worry about visibility and overall impressions. Okay, that’s it for today. Again, reach out to john at ducttapemarketing.com if you need any assistance or want to learn more. I’d love to show you what we do, and hopefully,

    you’ll keep listening to the Duct Tape Marketing Podcast, where I’ll continue to discuss the new SEO, among other things, and we’ll see you one of these days out there on the road.

  • From Beta to Bedrock: Build Products that Stick.

    From Beta to Bedrock: Build Products that Stick.

    Having worked as a solution designer for longer than I care to explain, I’ve lost count of the times I’ve seen promising ideas become useless within a few days.

    Financial products, my area of specialization, are no exception. Because people’s real, hard-earned money is on the line, user expectations are high, and the market is crowded, it’s tempting to throw as many features at the wall as possible and hope something sticks. However, this strategy is a recipe for disaster. Here’s why.

    The perils of feature-first development

    When you start developing a financial product from scratch, or are migrating existing client journeys from paper or phone channels to online banking or mobile apps, it’s easy to get swept up in the enthusiasm of developing innovative features. You might be thinking, “If I can just add one more thing that solves this particular user problem, they’ll love me!” But what happens when you eventually hit a roadblock because your security team doesn’t like it? When a battle-tested feature isn’t as popular as you anticipated, or when it fails due to unforeseen complexity?

    This is where the concept of the Minimum Viable Product (MVP) comes into play. Even if Jason Fried doesn’t usually use the term, his book Getting Real and his podcast Rework frequently discuss the idea. An MVP is a product that offers just enough value to keep your users interested, but not so much that it becomes difficult to maintain. Although the idea seems simple, it requires a razor-sharp eye, a brutal edge, and the courage to stand your ground, because the “Columbo effect” (someone always saying “just one more thing…” to add) makes it easy to cave.

    The issue with most finance apps is that they frequently turn out to be reflections of the company’s internal politics rather than an experience created for the customer. This means priority is given to delivering as many features and functionalities as possible to satisfy the demands of competing internal departments, as opposed to crafting a compelling value proposition focused on what people in the real world actually want. As a result, these products can very quickly become a mixed bag of confusing, disjointed, and ultimately unsatisfying customer experiences: a feature salad, you might say.

    The importance of bedrock

    So what’s a better course of action? How can we create products that are user-friendly and solid, and that, most importantly, stick?

    This is where the concept of “bedrock” comes into play. Bedrock is the core of your product, the part that really matters to customers. It’s the fundamental building block that creates value and maintains relevance over time.

    In the retail banking industry, where I work, the bedrock must be in and around the standard servicing journeys. People open a new account once in a blue moon, but they check their balance every day. They apply for a credit card every year or two, but they review their balance and pay their bills at least once a month.

    The key is identifying the main tasks that people want to complete, and then persistently striving to make them simple, reliable, and trustworthy.

    But how do you reach bedrock? By embracing the MVP approach, giving simplicity top priority, and working toward a distinct value proposition. This means avoiding pointless extras, putting your customers first, and making the most of what matters to them.

    It also requires some nerve, as your coworkers might not always agree with your vision at first. In some cases, it might even mean making it clear to customers that you won’t be coming over to their house to cook their dinner. And sometimes you may need the occasional piece of “opinionated user-interface design” (i.e., a clunky workaround for edge cases) to test a concept, or to buy yourself room to work on more crucial things.

    Practical methods for creating financial products that stick

    So what are the main lessons I’ve learned from my own research and practice?

    1. What issue are you attempting to resolve, and for whom? Before beginning any project, make sure your vision is completely clear, and make certain it aligns with the goals of your business.
    2. Avoid piling on features; instead, focus on getting the core right first. Choose one feature that actually adds value, and build from there.
    3. Give clarity precedence over complexity; when it comes to financial products, eliminate unwanted details and concentrate on what matters most.
    4. Accept constant iteration: bedrock is an ongoing process rather than a fixed destination. Continuously collect customer feedback, make product improvements, and keep moving in that direction.
    5. Stop, look, and listen: you shouldn’t just test your product during delivery; you must also test it consistently in the field. Use it yourself. Run A/B tests. Gather user opinions. Speak to users and make adjustments accordingly.

    The bedrock dilemma

    This is an intriguing dilemma: you sacrifice some potential short-term growth in favor of long-term stability. But the payoff is worthwhile, because products built with an emphasis on bedrock will outlive and outperform their rivals over time and provide users with lasting value.

    So how do you begin your journey to bedrock? Take it slowly. Start by identifying the underlying needs that your customers actually care about. Focus on developing and improving a single, potent feature that delivers real value. And most importantly, test constantly, because whether you attribute the saying to Abraham Lincoln, Alan Kay, or Peter Drucker, it holds: the best way to predict the future is to invent it.

  • Disney+ New Releases: May 2025

    Disney+ New Releases: May 2025

    May 4 marks Star Wars Day, so Disney+ has some exciting new titles and offers to celebrate all month long. The final six episodes of Andor are scheduled for release on the service this month, with episodes 7-9 arriving on May 6 and episodes 10-12 on May 13. By far, this compelling drama is…

    The post Disney+ New Releases: May 2025 appeared first on Den of Geek.

  • Amazon Prime Video New Releases: May 2025

    Amazon Prime Video New Releases: May 2025

    This month, Prime Video has a lot to offer moviegoers. Another Simple Favor, the sequel to the cult favorite A Simple Favor, will be available on Prime Video on May 1. Blake Lively and Anna Kendrick return as Emily Nelson and Stephanie Smothers, this time at a lavish wedding in Capri, Italy.

    The post Amazon Prime Video New Releases: May 2025 appeared first on Den of Geek.

  • Max Minghella Explains Nick’s Choice in The Handmaid’s Tale: ‘He’s a Really Good Survivor’

    Max Minghella Explains Nick’s Choice in The Handmaid’s Tale: ‘He’s a Really Good Survivor’

    Warning: contains spoilers for The Handmaid’s Tale season six, episode six, “Shock.” Since Nick Blaine’s earliest days on the show, the internal workings of his mind have been a mystery. Is he a prisoner of Gilead or one of its men? The character has risen dramatically through the nation’s ranks over the past six seasons…

    The post Max Minghella Explains Nick’s Choice in The Handmaid’s Tale: ‘He’s a Really Good Survivor’ appeared first on Den of Geek.