Category: Blog

  • Mobile-First CSS: Is It Time for a Rethink?

    Mobile-First CSS: Is It Time for a Rethink?

    The mobile-first design approach is great: it focuses on what really matters to the user, it’s well practiced, and it’s been a standard design pattern for years. So developing your CSS mobile-first should also be great, too…right?

    Well, not necessarily. Classic mobile-first CSS development is based on the principle of overwriting style declarations: you begin your CSS with default style declarations, and overwrite and/or add new styles as you add breakpoints with min-width media queries for larger viewports (for a good overview see “What is Mobile First CSS and Why Does It Rock?”). But all those exceptions create complexity and inefficiency, which in turn can lead to an increased testing effort and a code base that’s harder to maintain. Not exactly the qualities any of us want.

    Mobile-first CSS may still be the best option for your projects, but first you need to evaluate how well it suits the visual design and user interactions you’re working on. To help you get started, here’s how I go about tackling the factors you need to watch for, and I’ll discuss some alternative solutions if mobile-first doesn’t seem to suit your project.

    Advantages of mobile-first

    Some of the things to like with mobile-first CSS development, and why it’s been the de facto development practice for so long, make a lot of sense:

    Development hierarchy. One thing you definitely get from mobile-first is a good development order; you simply focus on the mobile view and get developing.

    Tried and tested. It’s a tried-and-tested methodology that’s worked for years for a reason: it solves a problem really well.

    Prioritizes the mobile view. The mobile view is the simplest and arguably the most important, as it encompasses all the key user journeys and often accounts for a higher proportion of user visits (depending on the project).

    Prevents desktop-centric development. As development is done using desktop computers, it can be tempting to initially focus on the desktop view. But thinking about mobile from the start prevents us from getting trapped later on; no one wants to spend their time retrofitting a desktop-focused site to work on mobile devices!

    Drawbacks of mobile-first

    Setting style declarations at lower breakpoints and then overwriting them at higher breakpoints brings its own problems:

    More complexity. The farther up the breakpoint hierarchy you go, the more unnecessary code you inherit from lower breakpoints.

    Higher CSS specificity. Styles that have been reverted to their browser default value in a class name declaration now have higher specificity. This can be a headache on large projects when you want to keep the CSS selectors as simple as possible.

    Requires more regression testing. Changes to the CSS at a lower view (such as adding a new style) require all higher breakpoints to be regression tested.

    The browser can’t prioritize CSS downloads. At wider breakpoints, classic mobile-first min-width media queries don’t leverage the browser’s capability to download CSS files in priority order.

    The problem of property value overrides

    There is nothing inherently wrong with overwriting values; CSS was designed to do just that. Still, inheriting incorrect values is laborious and inefficient, which is counterproductive. It can also lead to increased style specificity when you have to overwrite styles to reset them back to their defaults, something that may cause issues later on, especially if you are using a combination of bespoke CSS and utility classes. We won’t be able to use a utility class for a style that has been reset with higher specificity.

    With this in mind, I’m developing CSS with a focus on the default values much more these days. Since there’s no specific order, and no chains of specific values to keep track of, this frees me to develop breakpoints simultaneously. I concentrate on finding common styles and isolating the specific exceptions in closed media query ranges (that is, any range with a max-width set). 

    This technique opens up some opportunities, as you can treat each breakpoint as a blank slate. If a component’s layout appears to be based on Flexbox at all breakpoints, it’s fine and can be coded in the default style sheet. But if it looks like Grid would be much better for large screens and Flexbox for mobile, these can both be implemented entirely independently when the CSS is encapsulated in closed media query ranges. Also, developing simultaneously requires you to have a thorough understanding of any given component in all breakpoints up front. This can help surface design flaws earlier in the development process. We don’t want to get stuck down a rabbit hole building a complex component for mobile, and then get the designs for desktop and find they are equally complex and incompatible with the HTML we created for the mobile view!

    Although this strategy won’t work for everyone, I urge you to try it. There are plenty of tools available to support concurrent development, including Responsively App, Blisk, and many others.

    Having said that, I don’t feel the order itself is particularly relevant. Stick to the classic development order if you like to concentrate on the mobile view, understand the requirements for other breakpoints, and prefer to work on multiple devices at once. The key is to find common styles and exceptions so that you can include them in the appropriate stylesheet, which is a manual tree-shaking procedure! Personally, I find this a little easier when working on a component across breakpoints, but that’s by no means a requirement.

    Closed media query ranges in practice

    In classic mobile-first CSS we overwrite the styles, but we can avoid this by using media query ranges. To illustrate the difference (I’m using SCSS for brevity), let’s assume there are three viewport sizes:

    • Mobile: smaller than 768px
    • Tablet: from 768px to less than 1024px
    • Desktop: 1024px and larger

    Take a simple example where a block-level element has a default padding of “20px,” which is overwritten at tablet to be “40px” and set back to “20px” on desktop.

    Classic min-width mobile-first

    .my-block {
      padding: 20px;

      @media (min-width: 768px) {
        padding: 40px;
      }

      @media (min-width: 1024px) {
        padding: 20px;
      }
    }

    Closed media query range

    .my-block {
      padding: 20px;

      @media (min-width: 768px) and (max-width: 1023.98px) {
        padding: 40px;
      }
    }

    The subtle difference is that the mobile-first example sets the default padding to “20px” and then overwrites it at each breakpoint, setting it three times in total. In contrast, the second example sets the default padding to “20px” and only overrides it at the relevant breakpoint where it isn’t the default value (in this instance, tablet is the exception).

    The goal is to: 

    • Only set styles when needed. 
    • Not set them with the expectation of overwriting them later on, again and again. 

    To this end, closed media query ranges are our best friend. If we need to make a change to any given view, we make it in the CSS media query range that applies to the specific breakpoint. We’ll be much less likely to introduce unwanted alterations, and our regression testing only needs to focus on the breakpoint we have actually edited. 

    Taking the above example, if we find that the spacing of .my-block on desktop is already accounted for by the margin at that breakpoint, and we want to remove the padding altogether, we can do this by wrapping the mobile padding in a closed media query range.

    .my-block {
      @media (max-width: 767.98px) {
        padding: 20px;
      }

      @media (min-width: 768px) and (max-width: 1023.98px) {
        padding: 40px;
      }
    }

    The browser default padding for our block is “0,” so instead of adding a desktop media query and using unset or “0” for the padding value (which we would need with mobile-first), we can wrap the mobile padding in a closed media query (since it is now also an exception) so it won’t get picked up at wider breakpoints. At the desktop breakpoint, we won’t need to set any padding style, as we want the browser default value.
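    For contrast, here’s a sketch (assuming the same breakpoints as above) of how that change would look if we kept the classic mobile-first approach instead: an extra desktop query is needed purely to reset the padding back to the browser default.

    ```scss
    .my-block {
      padding: 20px; // mobile default

      @media (min-width: 768px) {
        padding: 40px; // tablet exception
      }

      @media (min-width: 1024px) {
        // extra override needed only to restore the browser default
        padding: 0;
      }
    }
    ```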

    Bundling versus separating the CSS

    Back in the day, keeping the number of requests to a minimum was very important because browsers limited the number of concurrent requests (typically to around six). As a consequence, using image sprites and CSS bundling was the norm, with all the CSS being downloaded in one go, as a single stylesheet with the highest priority.

    With HTTP/2 and HTTP/3 now on the scene, the number of requests is no longer the big deal it used to be. This enables us to use a media query to break CSS into multiple files. The obvious benefit of this is that the browser can now request the CSS it currently requires with a higher priority than the CSS it doesn't. This is more performant and can reduce the overall time page rendering is blocked.

    What version of HTTP do you use?

    Go to your website and open your browser's dev tools to find out which version of HTTP you're using. Next, select the Network tab and make sure the Protocol column is visible. If "h2" is included in the protocol list, that indicates that HTTP/2 is being used.

    Note: if the Protocol column isn’t visible in your browser’s dev tools, right-click any column header (such as Name) and enable Protocol, then reload the page.

    Also, if your website is still using HTTP/1... WHY?! What are you waiting for? Browser support for HTTP/2 is excellent.

    Splitting the CSS

    Separating the CSS into individual files is a worthwhile task. Linking the separate CSS files using the relevant media attribute allows the browser to identify which files are needed immediately (because they’re render-blocking) and which can be deferred. Based on this, it allocates each file an appropriate priority.

    In the following example of a website visited on a mobile breakpoint, we can see the mobile and default CSS are loaded with “Highest” priority, as they are currently needed to render the page. The remaining CSS files (print, tablet, and desktop) are still downloaded, but with “Lowest” priority, in case they are needed later.

    With bundled CSS, the browser will need to download and parse the entire CSS file before rendering can begin.

    By contrast, with the CSS separated into different files and linked with the relevant media attribute, the browser can prioritize the files it currently needs. Using closed media query ranges allows the browser to do this at all widths, as opposed to classic mobile-first min-width queries, where a desktop browser would have to download all the CSS with Highest priority. We can’t assume that desktop users always have a fast connection; in many rural areas, for instance, internet connection speeds are still slow.

    Depending on project requirements, the media queries and the number of separate CSS files will vary from one project to the next, but the example below shows what this might look like.

    Bundled CSS
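    A minimal sketch of the bundled version (the site.css filename is borrowed from the deployment example later in the article):

    ```html
    <!-- All CSS bundled into one render-blocking file -->
    <link href="site.css" rel="stylesheet">
    ```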



    This single file contains all the CSS, including all media queries, and it will be downloaded with Highest priority.

    Separated CSS
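    A sketch of five separated files, using the breakpoints defined earlier (mobile.css appears later in the article; the other filenames are placeholders):

    ```html
    <!-- Each file carries a media attribute so the browser can
         prioritize only what the current viewport needs -->
    <link href="default.css" rel="stylesheet">
    <link href="mobile.css" media="screen and (max-width: 767.98px)" rel="stylesheet">
    <link href="tablet.css" media="screen and (min-width: 768px) and (max-width: 1023.98px)" rel="stylesheet">
    <link href="desktop.css" media="screen and (min-width: 1024px)" rel="stylesheet">
    <link href="print.css" media="print" rel="stylesheet">
    ```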



    Separating the CSS and specifying a media attribute value on each link tag allows the browser to prioritize what it currently needs. Out of the five files listed above, two will be downloaded with Highest priority: the default file, and the file that matches the current media query. The others will be downloaded with Lowest priority.

    Depending on the project’s deployment strategy, a change to one file (mobile.css, for example) would only require the QA team to regression test on devices in that specific media query range. Compare that to the prospect of deploying the single bundled site.css file, an approach that would normally trigger a full regression test.

    Moving on

    The adoption of mobile-first CSS was a significant development milestone because it allowed front-end developers to concentrate on mobile web applications rather than creating websites for desktop use and attempting to retrofit them to work on other devices.

    I don't think anyone wants to return to that development model again, but it's important we don't lose sight of the issue it highlighted: that things can easily get convoluted and less efficient if we prioritize one particular device—any device—over others. For this reason, it seems natural to focus on the CSS in its own right, always mindful of what is the default setting and what's an exception. Since working this way, I've noticed subtle simplifications in both my own CSS and that of other developers, and that the work is also a little more organized and efficient.

    In general, simplifying CSS rule creation whenever we can is ultimately a cleaner approach than going around in circles of overrides. However, whatever methodology you choose, it needs to suit the project. Mobile-first may or may not turn out to be the best option for the situation at hand, but first you need to fully understand the trade-offs you're stepping into.

  • Humility: An Essential Value

    Humility: An Essential Value

    Humility, a designer’s essential value...that has a nice ring to it, doesn’t it? What about humility as a business manager’s essential value? Or a doctor’s? Or a teacher’s? They all sound great. When humility is our guiding light, the path is always open for fulfillment, growth, connection, and engagement. We’re going to explore why in this article.

    That said, this is an article for designers and developers, and to that end, I’d like to start with a story—well, a journey, really. It’s a personal one, and I’m going to make myself vulnerable along the way. I call it:

    The Tale of Justin’s Preposterous Pate

    When I was coming out of art school, a long-haired, goateed neophyte, print was a known quantity to me; design on the web, however, was riddled with complexities to navigate and discover, a problem to be solved. Though I had formal training in typography, layout, and visual design, what most intrigued me was how these traditional skills could be applied to a young digital landscape. This theme would ultimately shape the rest of my career.

    So rather than graduate and go into print like many of my peers, I devoured HTML and JavaScript books into the wee hours of the morning and taught myself how to code. I wanted—nay, needed—to better understand the underlying implications of my design decisions when rendered in a browser.

    The late 1990s and early 2000s were the so-called “Wild West” of web design. Designers at the time were all figuring out how to apply layout and visual communication to the digital landscape. What were the rules? How could we break them and still engage, entertain, and convey information? On a broader level, how did my values, which include humility, respect, and connection, fit into all of that? I was eager to find out.

    Though I’m talking about a different era, those are timeless connections between one’s core values and the world of design. What are your core passions, or values, that transcend medium? The main themes remain the same: what fulfills you holds true independent of the physical or digital worlds.

    First with tables, animated GIFs, and Flash, then with Web Standards, divs, and CSS, there was personality, raw unbridled creativity, and unique approaches to presentation that often defied any semblance of a visible grid. Splash screens and “browser requirement” pages aplenty. Usability and accessibility were typical casualties of such creativity; those paramount facets of any digital design were largely (and, in hindsight, unfairly) disregarded in favor of experimentation.

    For instance, this iteration of my personal portfolio site (“the pseudoroom”) from that time was experimental, if a little overt, in how it conveyed the idea of a living sketchbook. Quite skeuomorphic. On this one, I worked with fellow designer and dear friend Marc Clancy, now a co-founder of the creative project organizing app Milanote, to sketch out and then play with various user interactions. Then I’d break it all down and code it into a functioning layout.

    Along with design portfolio pieces, the site even offered free downloads of Mac OS customizations: desktop wallpapers that were effectively design experiments, custom-designed typefaces, and desktop icons.

    GUI Galaxy was a design, pixel art, and Mac-centric news portal that some graphic designer friends and I developed around the same time.

    Design news portals were incredibly popular at the time, publishing tweet-sized, small-format excerpts of relevant news from the categories I mentioned earlier. If you took Twitter, curated it down to a few categories, and wrapped it in a custom-branded experience, you’d have a design news portal from the late 90s / early 2000s.

    As designers, we had grown, and we built a bandwidth-sensitive, award-winning, much more accessibility-conscious website. Still ripe with experimentation, yet more mindful of equitable engagement. There are a few content panes here, with both Mac-focused and general news (tech, design) to be seen. We also offered many of the custom downloads I mentioned earlier on my portfolio site, but branded and themed to GUI Galaxy.

    A global design + illustration + news author collaboration made up the presentation layer of the website; its backbone was a homegrown CMS. And the collaborative effort here, in addition to the experimentation with a “brand” and content delivery, hit at my core. We were creating something larger than any one of us and building a global audience.

    Collaboration and connection transcend medium in their impact, immensely fulfilling me as a designer.

    Now, why am I taking you on this trip through design memory lane? Two reasons.

    First of all, there’s a reason for the nostalgia for that design era (the “Wild West” era, as I put it): the inherent exploration, personality, and creativity that dominated many design portals and personal portfolio websites. Ultra-finely detailed pixel art UI, custom illustration, bespoke vector graphics, all underpinned by a strong design community.

    The web design industry has stagnated in recent years. I suspect there’s a strong chance you’ve seen a site whose structure looks something like this: a hero image / banner with text overlaid, perhaps with a lovely rotating carousel of images (laying the snark on heavy there), a call to action, and three columns of sub-content directly beneath. Perhaps with icons, from a stock icon library, that only vaguely relate to their respective content.

    Design, as it’s applied to the digital landscape, is in dire need of thoughtful layout, typography, and visual engagement that goes hand-in-hand with all the modern considerations we now know are paramount: usability; accessibility; load times and bandwidth-sensitive content delivery; a responsive presentation that meets people wherever they are. We must be mindful of, and respectful toward, those concerns—but not at the expense of creativity in visual communication, and not by replicating cookie-cutter layouts.

    Pixel Issues

    Websites during this period were often designed and built on Macs whose OS and desktop looked something like this. (This is Mac OS 7.5, but 8 and 9 weren’t much different.)

    How could any single icon, at any point, stand out and grab my attention? This fascinated me. In this example, the user’s desktop is tidy, but think of a more realistic example with icon pandemonium. Or, let’s say an icon was a part of a larger system grouping ( fonts, extensions, control panels ): how did it maintain cohesion within a group as well?

    These were 32 x 32 pixel creations, utilizing a 256-color palette, designed pixel-by-pixel as mini mosaics. This seemed to me to be the embodiment of digital visual communication under such absurd restrictions. And often, ridiculous restrictions can yield the purification of concept and theme.

    So I started to research and do my homework. I was a student of this new medium, hungry to dissect, process, discover, and make it my own.

    I wanted to see how I could use that 256-color palette to push the boundaries of a 32×32 pixel grid, expanding upon the idea of exploration. Those ridiculous constraints forced a clarity of concept and presentation that I found incredibly appealing. I was thrust into the digital gauntlet because of it. And so, in my dorm room into the wee hours of the morning, I toiled away, bringing conceptual sketches into mini mosaic fruition.

    These are some of my icon creations, made with ResEdit, the only program I had at the time: a clunky, built-in Mac OS utility not really made for what we were using it for. At the center of this entire endeavor: research. Challenge. Problem-solving. Again, these core connection-based values are agnostic of medium.

    There’s one more design portal I want to talk about, which also serves as the second reason for my story to bring this all together.

    This is Kaliber10000, or K10k for short. Founded in 1998 by Michael Schmidt and Toke Nygaard, it was the design news portal on the web during this period. With its pixel art-fueled presentation and the ultra-focused care given to every aspect of every detail, it was the place to be, my friend. Many of the more influential designers of the time were invited to be news authors on the site. With respect where respect is due, GUI Galaxy’s concept was inspired by what these folks were doing.

    For my part, the combination of my web design work and pixel art exploration began to earn me some notoriety in the design scene. K10k eventually took notice and added me as one of their very limited group of news authors.

    Between my personal work and side projects—and now this inclusion—I was on the map in the design community. My design work also began to appear on other design news portals, and to be published in domestic and international magazines and various printed collections. With that degree of success while still in my early twenties, something else happened:

    Not a year out of school, I turned into a colossal asshole. The press and the praise became what fulfilled me, and they went straight to my head. They inflated my ego. I genuinely felt superior to my fellow designers.

    The casualties? My design process and its evolution; both stagnated.

    I felt so supremely confident in my abilities that I effectively stopped researching and discovering. Where I used to allow myself to iterate through concepts and sketches, I jumped straight into Photoshop. I drew my inspiration from the smallest of sources (and with blinders on). Any criticism of my work from my peers was often vehemently dismissed. The most tragic loss: I had lost touch with my values.

    My ego nearly destroyed some of my friendships and blossoming professional relationships. I was toxic when talking about design and in collaborations. But thankfully, those same friends gave me a priceless gift: candor. They called me out on my unhealthy behavior.

    Though I initially rejected what they had to say, I eventually had a chance to reflect on it in depth, and in time I was able to accept it, process it, and course-correct. The realization was humbling, but the re-awakening was necessary. I let go of the “reward” of adulation and re-centered on what stoked the fire for me in art school. Most importantly, I returned to my core values.

    Always Students

    Following that brief downward spiral, my personal and professional design journey moved forward. And as I matured, I was able to self-reflect to facilitate further growth and course correction as needed.

    Let’s take the Large Hadron Collider as an example. The LHC was designed “to help answer some of the fundamental open questions in physics, which concern the basic laws governing the interactions and forces among the elementary objects, the deep structure of space and time, and in particular the interrelation between quantum mechanics and general relativity.” Thank you, Wikipedia.

    Around fifteen years ago, in one of my earlier professional roles, I designed the interface for the application that generated the LHC’s particle collision diagrams. These diagrams are often regarded as works of art unto themselves because they depict what is actually happening inside the Collider during any given particle collision event.

    Designing the interface for this application was a fascinating process for me, in that I worked with Fermilab physicists to understand what the application was trying to achieve, but also how the physicists themselves would be using it. In this role, I cut my teeth on usability testing, working with the Fermilab team to iterate on and improve the interface. The way they spoke and the things they talked about were like an alien language to me. And by humbling myself and adopting the mindset that I was but a student, I made myself available to become a part of their world and forge that vital connection.

    I also had my first ethnographic observation experience: visiting Fermilab to observe how the physicists used the tool in their own environment, on their own terminals. For example, one takeaway was that, due to the level of ambient light-driven contrast within the facility, the data columns ended up using white text on a dark gray background instead of black text on white, letting the physicists scan through large amounts of data while reducing eye strain. And since Fermilab and CERN are government entities with rigorous accessibility standards, my knowledge in that realm grew as well. Accessible design proved to be another crucial form of communication.

    So, back to those core drivers of my visual problem-solving soul and ultimate fulfillment: discovery, exposure to new media, observation, human connection, and evolution. Before those values could lead the way, I checked my ego at the door.

    An evergreen willingness to listen, learn, understand, grow, evolve, and connect yields our best work. I want to call attention to the words “grow” and “evolve” in particular. If we are always students of our craft, we are continually making ourselves available to evolve. Yes, we may have years of practical design experience under our belts. Or the focused lab sessions of a UX bootcamp. Or a decorated work portfolio. Or, ultimately, decades of a career behind us.

    However, remember that “experience” does not equate to “expert.”

    The moment we close our minds via an inner monologue of “knowing it all,” or by branding ourselves a “#thoughtleader” on social media, the designer we are becomes the designer we will forever be. The designer we could become will never exist.

  • Personalization Pyramid: A Framework for Designing with User Data

    Personalization Pyramid: A Framework for Designing with User Data

    In today’s data-driven environment, it’s becoming more and more common for a UX practitioner to be asked to design a personalized digital experience, whether it’s a public website, user portal, or native application. Yet while there continues to be no shortage of marketing hype around personalization platforms, we still have very few standardized approaches for implementing personalized UX.

    That’s where we come in. Having completed dozens of personalization projects over the past few years, we gave ourselves a goal: could we create a holistic personalization framework specifically for UX practitioners? The Personalization Pyramid is a designer-centric model for standing up a human-centered personalization program, spanning data, segmentation, content delivery, and overall goals. By using this approach, you will be able to understand the core components of a contemporary, UX-driven personalization system (or at the very least understand enough to get started).

    Getting Started

    For the purposes of this article, we’ll assume you’re already familiar with the fundamentals of digital personalization. If you need a primer, this is a good one: Website Personalization Planning. While UX projects in this area can take on many different forms, they often stem from similar starting points.

    Common scenarios for beginning a personalization project:

    • Your organization or client has purchased a content management system (CMS), marketing automation platform (MAP), or related technology that supports personalization
    • The CMO, CDO, or CIO has identified personalization as a goal
    • Customer data is disjointed or ambiguous
    • You are running some isolated targeting campaigns or A/B testing
    • Stakeholders disagree on the personalization approach
    • Mandated customer privacy regulations (e.g., GDPR) require revisiting existing user targeting practices

    Regardless of where you begin, a robust personalization system requires the same fundamental building blocks. We’ve captured these as the “levels” of the pyramid. Whether you’re a UX designer, researcher, or strategist, understanding the core components can help make your contribution more effective.

    From the top down, the levels include:

    1. North Star: What larger strategic goal is the personalization initiative pursuing?
    2. Goals: What are the specific, measurable outcomes of the system?
    3. Touchpoints: Where will the personalized experience be served?
    4. Contexts and Campaigns: What personalized content will the user see?
    5. User Segments: What constitutes a unique, usable audience?
    6. Actionable Data: What reliable and authoritative data is captured by our technical platform to drive personalization?
    7. Raw Data: What wider set of data is conceivably available (already in our setting) to enable personalization?

    We’ll go through each of these levels in turn. To help make this more concrete, we created an accompanying deck of cards to illustrate specific examples from each level. We’ve found them helpful in personalization brainstorming sessions, and will include examples for you here.

    Starting at the Top

    The elements of the pyramid are as follows:

    North Star

    Whether big or small, your personalization strategy aims at a general north star. The North Star defines the (one) overall mission of the personalization program. What do you want to accomplish? North Stars cast a shadow: the bigger the star, the bigger the shadow. Examples of North Stars might include:

    1. Function: Personalize based on basic user inputs. Examples: “Raw” notifications, basic search results, system user settings and configuration options, general customization, basic optimizations
    2. Feature: Self-contained personalization componentry. Examples: “Cooked” notifications, advanced optimizations (geolocation), basic dynamic messaging, customized modules, automations, recommenders
    3. Experience: Personalized user experiences across multiple user flows and touchpoints. Examples: Email campaigns, landing pages, advanced messaging (i.e. C2C chat) or conversational interfaces, longer user flows and content-intensive optimizations (localization)
    4. Product: Highly differentiated personalized product experiences. Example: Standalone, branded experiences with personalization at their core, like Spotify’s “algotorial” playlists such as Discover Weekly

    Goals

    Like any good UX design effort, personalization should align with customer goals. Goals are the tactical and measurable metrics that can prove the overall program is effective. Start with your existing analytics and measurement system, as well as indicators you can benchmark against. In some cases, new targets may be appropriate. The most important thing to remember is that personalization is a means to an end, not an end in itself. Popular goals include:

    • Conversion
    • Time on task
    • Net promoter score (NPS)
    • Customer satisfaction

    Touchpoints

    Touchpoints are where the personalization happens, and this will be one of your biggest areas of responsibility as a UX designer. The touchpoints available to you will depend on how your personalization and associated technology features are instrumented, and they should be rooted in improving a user’s experience at a particular point in the journey. Touchpoints can be multi-device (mobile, in-store, website), but they can also be more specific (web banner, web pop-up, etc.). Here are a few examples:

    Channel-level touchpoints

    • Email: Role
    • Email: Time of open
    • In-store display (JSON endpoint)
    • Native app
    • Search

    Wireframe-level Touchpoints

    • Web overlay
    • Web alert bar
    • Web banner
    • Web content block
    • Web menu

    If you’re designing for web interfaces, for example, you will likely need to include personalized “zones” in your wireframes. The content for these zones can then be populated programmatically in your touchpoints, based on our next step: contexts and campaigns.

    Contexts and Campaigns

    Once you’ve identified some touchpoints, you can consider the kind of personalized content a user will receive at each. Many personalization tools refer to these as “campaigns” (for example, a campaign serving a web banner to new visitors to the website). These are displayed to specific user segments programmatically, as defined by user data. At this stage, we find it helpful to consider two separate models: a context model and a content model. The context model helps you consider the user’s level of engagement at the moment of personalization, such as whether they are casually browsing or doing a deep dive; think of it in terms of information-retrieval behaviors. The content model can then guide you in deciding which kind of personalization suits that context (for instance, an “Enrich” campaign that features related articles might be a suitable supplement to existing content).

    Personalization Context Model:

    1. Browse
    2. Skim
    3. Nudge
    4. Feast

    Content model for personalization:

    1. Alert
    2. Make Easier
    3. Cross-Sell
    4. Enrich

    If you’d like to read more about each of these models, check out Colin’s Personalization Content Model and Jeff’s Personalization Context Model.
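    As a hypothetical illustration of how the two models can work together (the pairings below are my own assumptions, not prescriptions from either model’s author), you could sketch the context-to-content mapping in a few lines:

```python
# Illustrative sketch: pairing the personalization context model
# (how the user is engaging right now) with the content model
# (what kind of personalized content to show). The pairings are
# assumptions for demonstration, not rules from the article.

CONTEXTS = ["Browse", "Skim", "Nudge", "Feast"]
CONTENT_MODELS = ["Alert", "Make Easier", "Cross-Sell", "Enrich"]

# Example pairings: a deep-diving ("Feast") user may welcome an
# "Enrich" campaign of related articles, while a skimming user
# may only tolerate a lightweight "Alert".
SUGGESTED_PAIRINGS = {
    "Browse": ["Cross-Sell", "Enrich"],
    "Skim": ["Alert"],
    "Nudge": ["Make Easier"],
    "Feast": ["Enrich"],
}

def suggest_content(context: str) -> list[str]:
    """Return candidate content models for a given engagement context."""
    if context not in CONTEXTS:
        raise ValueError(f"unknown context: {context}")
    return SUGGESTED_PAIRINGS[context]
```

    Capturing the mapping as data rather than scattered conditionals makes it easy for designers and engineers to review and revise the pairings together.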

    User Segments

    User segments can be created prescriptively or adaptively, based on user research (e.g., via rules and logic tied to observed user behaviors or via A/B testing). At minimum, you will need to consider how to treat the anonymous or unknown visitor, the guest or returning visitor for whom you may have a stateful cookie (or another post-cookie identifier), and the authenticated visitor who is logged in. Here are some examples from the personalization pyramid:

    • Unknown
    • Guest
    • Authenticated
    • Default
    • Referred
    • Role
    • Cohort
    • Unique ID
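    A minimal sketch of how a few of these segments might be resolved at request time, assuming hypothetical identifier names (this is not code from the article):

```python
# Illustrative sketch: map whatever identifiers are available at
# request time to the broadest matching segment. The function and
# parameter names are assumptions for demonstration purposes.

from typing import Optional

def resolve_segment(user_id: Optional[str] = None,
                    cookie_id: Optional[str] = None,
                    referrer: Optional[str] = None) -> str:
    """Resolve a visitor to an example segment from the pyramid."""
    if user_id:       # logged-in visitor
        return "Authenticated"
    if cookie_id:     # returning visitor with a stateful cookie
        return "Guest"
    if referrer:      # first visit, arrived via a known source
        return "Referred"
    return "Unknown"  # no identifying signal at all
```

    The ordering matters: the most specific signal wins, so an authenticated user is never downgraded to a guest just because a cookie is also present.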

    Actionable Data

    Every organization with any digital presence has data. It’s important to ask, though, what data you can ethically collect on users, how inherently reliable and valuable it is, and how to put it to use (sometimes referred to as “data activation”). Fortunately, the tide is turning toward first-party data: a recent study by Twilio estimates that some 80% of businesses are already using at least some type of first-party data to personalize the customer experience.

    First-party data has a number of benefits for the user experience: it is relatively simple to collect, more likely to be accurate, and less susceptible to the “creep factor” of third-party data. So a key part of your UX strategy should be to determine the best form of data collection for your audiences. Here are a few illustrations:

    There is a progression of profiling when it comes to recognizing different audiences and their signals and making decisions about them. It tends to move from the general to the granular as user familiarity, confidence, and data volume increase over time.

    While some combination of implicit and explicit data is generally a prerequisite for any implementation (more commonly referred to as first-party and third-party data), machine-learning efforts are typically not cost-effective straight out of the box. That’s because optimization requires a strong content repository and data backbone. These approaches should nonetheless be considered as part of the larger roadmap, and they may indeed help accelerate the organization’s overall progress. Typically you’ll work with key stakeholders and product owners to create a profiling model: a scalable, multifaceted approach to profiling that defines how profiles, profile keys, profile cards, and pattern cards are configured.

    Pulling it Together

    The cards serve as the foundation for an inventory of sorts (we provide blanks for you to tailor your own): a set of potential levers and motivations for the kinds of personalization activities you aspire to deliver. But they become even more valuable when grouped together.

    In assembling a card “hand,” you can begin to trace the entire trajectory from leadership focus down through strategic and tactical execution. It also serves as the foundation for the workshops that both co-authors have conducted to build a program backlog, which would make a good topic for another article.

    In the meantime, note that while each colored class of card is helpful to survey in understanding the range of choices potentially at your disposal, the real value lies in threading them together and making concrete decisions about for whom this decisioning will be made: where, when, and how.

    Lay Down Your Cards

    Any effective personalization plan must take into account near-, mid-, and long-term objectives. Even with leading CMS platforms like Sitecore and Adobe, or the most exciting composable CMS DXP out there, there is simply no “easy button” by which a personalization program can be stood up and immediately deliver meaningful results. That said, all personalization activities follow a common grammar, similar to how every sentence contains nouns and verbs. These cards attempt to map that territory.

  • User Research Is Storytelling

    User Research Is Storytelling

    I’ve been fascinated by movies since I was a child. I loved the heroes and the excitement—but most of all the stories. I aspired to be a filmmaker, and I believed that I’d get to do the things that Indiana Jones did and go on fascinating adventures. My friends and I came up with movie ideas to shoot, but they never went any further. I ended up in the user experience (UX) field instead. Today, I realize that there’s an element of storytelling to UX—I just hadn’t considered it before: user research is storytelling. And to get the most out of user research, you must tell a compelling story that involves stakeholders, including the product team and decision-makers, and piques their interest in learning more.

    Think of your favorite film. It more than likely follows a three-act narrative structure: the setup, the conflict, and the resolution. The first act shows what exists now; it helps you get to know the characters and the challenges and problems that they face. The second act sets the scene for the conflict and introduces the action; here, difficulties grow or get worse. The third and final act is the resolution: this is where the issues are resolved and the characters learn and change. This structure, in my opinion, is also a great way to think about user research, and it can be particularly useful for explaining user research to others.

    Use story as a framework when conducting research.

    It’s sad to say, but many have come to view research as expendable. Research is frequently one of the first things to go when budgets or deadlines are tight. Instead of investing in research, some product teams rely on designers or—worse—their own judgment to make the “right” choices for users based on experience or accepted best practices. That may work for some teams, but this approach can easily miss the chance to solve users’ real problems. As practitioners who aim to be user-centered, this is something we should avoid. User research enhances design: it keeps it on track, pointing to problems and opportunities. Being aware of the problems with your product and acting on them can help you stay ahead of your competitors.

    In the three-act structure, each act corresponds to a part of the research process, and each part is important to telling the whole story. Let’s take a look at the three acts and how they relate to user research.

    Act one: setup

    The setup is all about understanding the context, and this is where foundational research comes in. Foundational research (also called generative, discovery, or exploratory research) helps you understand users and identify their problems. Like in the movies, you’re learning about the problems users face, what options are available to them, and how those problems affect them. To do foundational research, you might conduct contextual inquiries or diary studies (or both!), which can help you identify both problems and opportunities. It doesn’t need to be a big investment in time or money.

    Erika Hall describes how lightweight ethnography can be as straightforward as spending 15 minutes with a customer and asking them to “walk me through your morning yesterday.” That’s it. Ask that one question. Shut up and listen to them for 15 minutes. Do everything in your power to keep yourself and your interests out of it. Bam, you’re doing ethnography. Hall notes that this will almost certainly prove quite interesting, and that in the very unlikely event that you didn’t learn anything new or useful, you can carry on with increased confidence in your approach.

    I think this makes a lot of sense. And I love that it makes user research so accessible. You just need to recruit participants and do it! You don’t need to produce a lot of documentation. This can offer a wealth of knowledge about your users, and it’ll help you better understand them and what’s going on in their lives. Understanding where users are coming from is what act one is really all about.

    Jared Spool talks about the importance of foundational research and how it should form the bulk of your research. If you can supplement what you’ve learned in foundational studies with any other user data you can obtain, such as surveys or analytics, so much the better. Together, all this information paints a clearer picture of the state of things and its shortcomings. And that’s the start of a gripping story: it’s the point where you realize that the main characters—the users in this case—are facing problems that they need to overcome. This is where you begin to develop empathy for the characters and root for their success, much like in the movies. And hopefully stakeholders are now doing the same. Their business may be losing money because users can’t complete certain tasks, which may be their motivation. Or perhaps they genuinely empathize with users’ problems. Either way, act one is your key vehicle for piquing stakeholders’ interest and investment.

    When stakeholders begin to understand the value of foundational research, it can open doors to more opportunities that involve users in the decision-making process. And that can shift the product team’s focus toward improving things for users. This benefits everyone—users, the product, and stakeholders. It’s like a film winning an Oscar: it frequently leads to a favorable reception and success for your product. And it can become an opportunity for stakeholders to repeat this process with other products. The secret to this approach is storytelling; knowing how to tell a compelling story is how you entice stakeholders to invest in more research.

    This brings us to act two, where you iteratively evaluate a design or idea to see whether it addresses the problems.

    Act two: conflict

    Act two is all about digging deeper into the problems that you identified in act one. To evaluate a potential solution (such as a design), you typically conduct directional research, such as usability tests, to see whether it addresses those problems. The issues might include unmet needs or problems with a flow or process that’s tripping users up. More issues may come up along the way, much like in act two of a movie: it’s here that you learn more about the characters as they grow and develop through this act.

    According to Jakob Nielsen, five users are usually enough for usability testing, because that number of users can generally identify the majority of the issues: “As you add more and more users, you learn less and less because you will keep seeing the same things again and again… After the fifth user, you are wasting your time by observing the same findings repeatedly but not learning much new.”
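    Nielsen’s claim rests on a simple model he published with Tom Landauer: the share of usability problems found with n test users is 1 − (1 − L)^n, where L is the proportion of problems a single user surfaces (about 31% on average in their projects). A quick check of the arithmetic (the 31% figure is Nielsen and Landauer’s average, not a number from this article):

```python
def problems_found(n: int, l: float = 0.31) -> float:
    """Share of usability problems uncovered by n users, per the
    Nielsen/Landauer model, where l is the share of problems a
    single test user typically surfaces."""
    return 1 - (1 - l) ** n

# Five users uncover roughly 85% of problems; doubling the
# participants to ten adds comparatively little.
print(round(problems_found(5), 2))   # ~0.84
print(round(problems_found(10), 2))  # ~0.98
```

    The steep diminishing returns are why Nielsen recommends several small tests over one large one.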

    There are parallels with storytelling here too, if you try to tell a story with too many characters, the plot may get lost. With fewer participants, each user’s struggles will be more easily recalled and shared with other parties when discussing the research. This can help convey the issues that need to be addressed while also highlighting the value of doing the research in the first place.

    Usability tests have been conducted in person for decades, but remote testing can also be done using software like Microsoft Teams, Zoom, or other teleconferencing tools. This approach has become increasingly popular since the beginning of the pandemic, and it works well. You might think of in-person usability tests as a form of live theater and remote testing as the movies. There are advantages and disadvantages to each. In-person usability research is a much richer learning experience: stakeholders can experience the sessions together, and you get real-time reactions, including surprises, disagreements, and discussions about what they’re seeing. Much like going to a play, where audiences take in the stage, the costumes, the lighting, and the actors’ interactions, in-person research lets you see users up close, including their body language, how they interact with the moderator, and how the environment is set up.

    If conducting usability testing in a lab is like watching a play that is staged and controlled, then testing in the field is like improv: anything can happen, and any two sessions may be very different from one another. You can take usability testing into the field by creating a replica of the space where users interact with the product and conducting your research there, or by meeting users at their location. With either option, you get to see how things work in context. Things come up that wouldn’t have in a lab environment, and the conversation can shift in entirely different directions. As a researcher, you have less control over how these sessions unfold, but that can sometimes help you understand users even better. Meeting users where they are can provide clues to the external forces affecting how they use your product. In-person usability tests offer a level of detail that is frequently absent from remote testing.

    That’s not to say that the “movies”—remote sessions—aren’t a good option. Remote sessions can reach a wider audience. They allow many more stakeholders to be involved in the research and to see what’s going on, and they open the door to a much wider range of participants. But with any remote session there is the potential for time wasted if participants can’t log in or get their microphone working.

    Whether conducted remotely or in person, usability testing lets you ask real users questions to understand their thoughts and their understanding of the solution. This can help you not only identify problems but also glean why they’re problems in the first place. You can also test your own hypotheses and determine whether your reasoning holds. By the end of the sessions, you’ll have a much clearer picture of how usable the designs are and whether they work for their intended purposes. Act two is the heart of the narrative, where the excitement is, but there are also potential surprises. This is equally true of usability tests: sometimes participants say unexpected things that alter the way you look at the problem, which can lead to unexpected turns in the story.

    Unfortunately, user research is sometimes seen as expendable, and too frequently usability testing is the only research method that some stakeholders believe they need. In fact, if the designs that you’re evaluating in the usability test aren’t grounded in a solid understanding of your users (foundational research), there’s not much to be gained by doing usability testing in the first place. That’s because you’re narrowing the area of focus without considering the needs of the users. As a result, there’s no way of knowing whether the designs might solve a problem that users have; out of that context, a usability test yields only feedback on a particular design.

    On the other hand, if you only do foundational research, you might have set out to solve the right problem, but you won’t know whether the thing that you’re building will actually solve it. This demonstrates the value of conducting both foundational and directional research.

    In act two, stakeholders will—hopefully—get to watch the story unfold in the user sessions, which creates the conflict and tension in the current design by surfacing their highs and lows. And in turn, this can encourage stakeholders to take action on the issues that arise.

    Act three: resolution

    The third act is about resolving the issues raised by the first two acts, whereas the first two are about comprehending the context and the tensions that can compel action. While it’s important to have an audience for the first two acts, it’s crucial that they stick around for the final act. That includes all members of the product team, including developers, UX experts, business analysts, delivery managers, product managers, and any other interested parties. It allows the whole team to hear users ‘ feedback together, ask questions, and discuss what’s possible within the project’s constraints. Additionally, it enables the UX design and research teams to clarify, suggest alternatives, or provide more context for their choices. So you can get everyone on the same page and get agreement on the way forward.

    This act is primarily told through voiceover with some audience participation. The researcher is the narrator, who paints a picture of the issues and what the future of the product could look like given the things that the team has learned. They provide the stakeholders with their suggestions and direction for developing this vision.

    In the Harvard Business Review, Nancy Duarte offers an approach to structuring presentations so that they follow a persuasive story. According to Duarte, the most effective presenters use the same techniques as great storytellers: by reminding people of the status quo and then revealing a better way, they create a conflict that needs to be resolved. “That tension helps them persuade the audience to adopt a new mindset or behave differently.”

    This type of structure aligns well with research results, particularly results from usability tests. It provides proof of “what is”—the issues you’ve identified—and of “what could be”—your recommendations on how to address them—alternating back and forth between the two.

    You can reinforce your recommendations with examples of things that competitors are doing that could address these issues, or with examples where competitors are gaining an edge. Or they can be visual, like quick sketches of how a new design could look that solves a problem. These can help generate conversation and momentum. You conclude the session by bridging the gaps and offering suggestions for improvement: this is where you reiterate the main themes or problems and what they mean for the product—the denouement of the story. This stage provides stakeholders with next steps and, hopefully, the motivation to take those steps as well!

    While we are nearly at the end of this story, let’s reflect on the idea that user research is storytelling. The three-act structure of user research contains all the components of a good story:

      Act one: You meet the protagonists ( the users ) and the antagonists ( the problems affecting users ). The plot begins here. In act one, researchers might use methods including contextual inquiry, ethnography, diary studies, surveys, and analytics. These techniques can produce personas, empathy maps, user journeys, and analytics dashboards as output.
      Act two: Next, there’s character development. The protagonists face problems and difficulties that they must overcome, and there is conflict and tension. In act two, researchers might use methods including usability testing, competitive benchmarking, and heuristic evaluation. The output of these can include usability-findings reports, UX strategy documents, usability guidelines, and best practices.
      Act three: The protagonists triumph and you see what a better future looks like. Researchers may use techniques like storytelling, presentation decks, and digital media in act three. The output of these can be: presentation decks, video clips, audio clips, and pictures.

    The researcher plays a variety of roles, including producer, director, and storyteller. The participants have small but significant roles (in the research). And the stakeholders are the audience. But the most important thing is to get the story right and to use storytelling to tell users’ stories through research. In the end, the stakeholders should leave with a goal and an eagerness to fix the product’s flaws.

    So the next time that you’re planning research with clients or speaking to stakeholders about research that you’ve done, think about how you can weave in some storytelling. User research is ultimately a win-win for everyone; all you need to do is pique stakeholders’ interest in how the story ends.

  • To Ignite a Personalization Practice, Run this Prepersonalization Workshop

    To Ignite a Personalization Practice, Run this Prepersonalization Workshop

    Picture this. You’ve joined a squad at your company that’s designing new product features with an emphasis on automation or AI. Or perhaps your company has just implemented a personalization engine. Either way, you’re designing with data. Now what? When it comes to designing for personalization, there are many cautionary tales, no overnight successes, and few guidelines for the perplexed.

    The personalization space is a liminal one, caught between the dream of getting it right and the worry of it going wrong (like when we encounter “persofails” such as a company’s constant pleas for regular customers to purchase additional toilet seats). It’s an especially confusing place to be for a design professional without a map, a compass, or a strategy.

    There are no Lonely Planet guides or tour groups for those of you who want to personalize, because successful personalization depends so much on each organization’s talent, technology, and market position.

    But you can make sure that your team has packed its bags sensibly.

    There’s a DIY approach to increase your chances of success, or at the very least to disarm your boss’s irrational exuberance. Before the project begins, you’ll need to plan properly.

    It’s known as prepersonalization.

    Behind the music

    Consider the DJ feature on Spotify, which was introduced last month.

    We’re used to seeing the polished final result of a personalization feature. But before the year-end award, the making-of backstory, or the behind-the-scenes success story, a personalized feature had to be conceived, budgeted, and prioritized. Before any personalization feature goes live in your product or service, it exists amid a backlog of worthy ideas for expressing customer experiences more dynamically.

    So how do you decide where to place your personalization bets? How do you design tailored interactions that don’t trip up users or—worse—breed mistrust? We’ve found that many well-budgeted, established programs that sustain their continued investment initially required one or more workshops to align key technology users and stakeholders, and to make the effort matter.

    We’ve witnessed this evolution up close with our clients, from big tech to emerging companies. In our experience working on personalization efforts small and large, a program’s track record—and its capacity to weather tough questions, work steadily toward shared answers, and manage its design and engineering efforts—turns on how successfully these prepersonalization activities play out.

    Effective workshops consistently save time, money, and overall well-being by separating successful future endeavors from unsuccessful ones.

    A personalization practice involves a multiyear effort of testing and feature development. It’s not a tech stack switch-flip. It’s best managed as a backlog that often evolves through three steps:

    1. customer experience optimization ( CXO, also known as A/B testing or experimentation )
    2. always-on automations ( whether rules-based or machine-generated )
    3. mature features or standalone product development (such as Spotify’s DJ experience)
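    The first step, customer experience optimization via A/B testing, is commonly built on deterministic bucketing so that a given user always lands in the same variant across sessions. This is a generic sketch with assumed names, not an implementation the authors prescribe:

```python
# Illustrative sketch of hash-based experiment assignment: hashing
# the experiment and user IDs together yields a stable, stateless
# variant assignment. Names here are assumptions for demonstration.

import hashlib

def assign_variant(user_id: str, experiment: str,
                   variants: tuple[str, ...] = ("control", "treatment")) -> str:
    """Deterministically assign a user to an experiment variant."""
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    bucket = int(digest, 16) % len(variants)
    return variants[bucket]
```

    Because assignment is a pure function of the IDs, no per-user state needs to be stored, and two experiments with different names split the same audience independently.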

    This is why we created our progressive personalization framework, and why we’re field-testing an accompanying deck of cards: we believe there’s a base grammar, a set of “nouns and verbs,” that your organization can use to design experiences that are customized, personalized, or automated. You don’t necessarily need our cards, but we strongly recommend creating something similar, whether digital or physical.

    Set your kitchen timer

    How long does it take to cook up a prepersonalization workshop? The activities we suggest for the full assessment can (and frequently do) stretch over weeks; for the core workshop itself, we recommend aiming for two to three days. Here’s a summary of our broad approach, along with details on the most crucial first-day activities.

    The full arc of the wider workshop is threefold:

    1. Kickstart: This sets the terms of your engagement as you focus on your organization’s and your team’s readiness and motivation.
    2. Plan your work: This is the heart of the card-based workshop activities, where you specify a plan of attack and the scope of work.
    3. Work your plan: This stage consists of enabling team members to individually pitch their own pilots, each including a proof-of-concept project, a business case, and an operating model.

    Give yourself at least a day, split into two large time blocks, to power through a concentrated version of those first two phases.

    Kickstart: Whet your appetite

    We call the first activity the “landscape of connected experience.” It looks at the possibilities for personalization at your company. A connected experience, in our parlance, is any UX requiring the orchestration of multiple systems of record on the backend. That could be a marketing-automation platform paired with a content-management system, or a digital-asset manager combined with a customer-data platform.

    Spend this activity identifying examples of connected-experience interactions that you admire, find familiar, or even dislike, drawn from both consumer and business-to-business contexts. These should cover a representative range of personalization patterns, including automated app-based interactions (such as onboarding sequences or wizards), notifications, and recommenders. Our cards contain a catalog of these; here’s a list of 142 different interactions to jog your thinking.

    This sets the table for the discussion: what are the possible paths for the practice in your organization? For a broader perspective, here’s a long-form primer and a strategic framework.

    Assess each example that you discuss for its complexity and the level of effort you estimate it would take for your team to deliver that feature (or something similar). In our cards, we categorize connected experiences as functions, features, experiences, complete products, and portfolios; size your own build accordingly. This helps draw attention to the benefits of ongoing investment, as well as the gap between what you deliver today and what you want to deliver in the future.

    Next, have your team plot each idea on the following 2×2 grid, which lays out the four enduring arguments for a personalized experience. This is crucial because it emphasizes how personalization can affect your own ways of working as well as your external customers. It’s also a reminder ( which is why we used the word argument earlier ) of the broader effort beyond these tactical interventions.

    Have each team member decide where they would like to place your company’s emphasis for your product or service. Naturally, you can’t prioritize them all. The goal here is to show how different departments may view their own benefits from the effort, which can vary from one department to the next. Documenting your desired outcomes shows how the team aligns internally across representatives from different departments or functional areas.

    The third and final Kickstart activity is about filling in the personalization gap. Is your customer journey well documented? Will data and privacy protection be a significant challenge? Do you have content metadata needs that you have to address? ( We’re pretty sure you do; it’s just a matter of acknowledging the magnitude of that need and finding a solution. ) In our cards, we’ve noted a number of program risks, including common team dispositions. For instance, our Detractor card lists six intractable stakeholder attitudes that prevent progress.

    Effectively collaborating and managing expectations is critical to your success. Consider the potential obstacles to your advancement in the future. Press the participants to name specific steps to overcome or mitigate those barriers in your organization. According to research, personalization initiatives face a number of common obstacles.

    At this point, you’ve hopefully discussed sample interactions, emphasized a key area of benefit, and flagged key gaps. Good—you’re ready to move on.

    Hit that test kitchen

    What will you need next to bring your personalized recipes to life? Personalization engines, which are robust software suites for automating and delivering dynamic content, can intimidate their new users. Their capabilities are broad and powerful, and they give you many ways to organize your approach. Which raises the question: where do you begin when configuring a connected experience?

    The key here is to avoid treating the installed software ( as one of our client executives humorously put it ) like some sort of dream kitchen. These software engines are more like test kitchens where your team can begin devising, tasting, and refining the snacks and meals that will become a part of your personalization program’s regularly evolving menu.

    Over the course of the workshop, you’ll create that final menu: a prioritized backlog of personalized interactions. And creating “dishes” is how individual team stakeholders will construct personalized interactions that serve their needs or the needs of others.

    The dishes will be made from recipes, which have predetermined ingredients.

    Verify your ingredients

    You’ll ensure that you have everything you need to create your desired interaction (or that you can determine what needs to be added to your pantry, like a good product manager) and that you have validated with the right stakeholders present. These ingredients include the audience that you’re targeting, content and design elements, the context for the interaction, and your measure for how it’ll come together.

    This isn’t just about discovering requirements. Documenting your personalizations as a series of if-then statements lets the team:

    1. compare findings to a unified approach for developing features, similar to how artists paint with the same color palette,
    2. specify a consistent set of interactions that users find uniform or familiar,
    3. and establish consistency across key performance indicators and supporting metrics.

    This helps you streamline your designs and your technical efforts while you deliver a shared palette of core motifs of your personalized or automated experience.
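    As an illustration, one of those if-then recipe cards can be captured as plain data plus a single evaluation rule. This is only a sketch: the field names and values (audience, trigger, and so on) are invented for the example, not taken from any particular personalization engine.

    ```javascript
    // One "recipe" as data: who, what, when, why.
    // All field names and values are illustrative, not from a real engine.
    const winback = {
      name: "Winback automation",
      audience: "lapsed-subscriber", // who
      content: "promo-offer-email",  // what
      trigger: "renewal-failed",     // when
      goal: "renewals-recovered"     // why (the measure)
    };

    // The if-then statement itself: if a visitor in the target audience
    // produces the triggering event, then deliver the content and note
    // which metric the interaction should be measured against.
    function evaluate(recipe, event) {
      if (event.segment === recipe.audience && event.type === recipe.trigger) {
        return { deliver: recipe.content, measure: recipe.goal };
      }
      return null; // no match: nothing personalized happens
    }
    ```

    Writing recipes down in a uniform shape like this is what makes them comparable and prioritizable: every card answers the same four questions, so the team can line them up side by side.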

    Create your recipe.

    What ingredients matter to you? Consider the construct “who-what-when-why”:

    • Who are your key audience segments or groups?
    • What kind of content will you provide for them, what design elements, and under what circumstances?
    • And for which business and user benefits?

    Five years ago, we created these cards and card categories. We regularly play-test their fit with conference audiences and clients. And there are still fresh possibilities. But they all follow an underlying who-what-when-why logic.

    In the cards in the accompanying photo below, you can roughly follow along from left to right with three examples drawn from subscription-based reading apps.

    1. Nurture personalization: When a guest or an unknown visitor interacts with a product title, a banner or alert bar appears that makes it easier for them to encounter a related title they may want to read, saving them time.
    2. Welcome automation: An email is sent to a newly registered user to highlight the breadth of the content catalog and convert them to happy subscribers.
    3. Winback automation: Before their subscription lapses or after a recent failed renewal, a user is sent an email that gives them a promotional offer to suggest that they reconsider renewing or to remind them to renew.

    We’ve also found that cocreating the recipes themselves can sometimes be the most effective way to start brainstorming about what these cards might be for your organization. Start with a set of blank cards, and begin labeling and grouping them through the design process, eventually distilling them to a refined subset of highly useful candidate cards.

    The workshop’s later stages shift from the cookbook toward more nuanced customer-journey mapping. Individual “cooks” will pitch their recipes to the team, using a common jobs-to-be-done format so that measurability and results are baked in, and from there the resulting collection will be prioritized for final design and delivery to production.

    Better kitchens require better architecture

    Simplifying a customer experience is a complicated effort for those on the inside delivering it. Beware of anyone who suggests otherwise. That said, complicated problems can be hard to solve, but they are addressable with rules and recipes.

    When a team isn’t designing with its best data, personalization becomes a punch line. Like a sparse pantry, every organization has metadata debt to go along with its technical debt, and this creates a drag on personalization effectiveness. For instance, your IA directly affects the quality of your AI’s output. Spotify’s poster-child prowess today was unfathomable before it acquired a seemingly modest metadata startup that now powers its underlying information architecture.

    If you can’t stand the heat…

    Personalization technology opens a doorway into a confounding sea of possible designs. Only a disciplined, highly collaborative approach can bring the necessary focus and intention. So banish the dream kitchen. Instead, head to the test kitchen to save time, preserve job security, and channel the creative concepts that come from your organization’s cooks. There are meals to serve and mouths to feed.

    This organizational rigor gives you a fighting chance at long-term success. Wiring up your information layer isn’t an overnight affair. But if your team works from the same cookbook and the same recipes, you’ll be standing on solid ground. We designed these activities to make your organization’s needs concrete and clear, long before the hazards pile up.

    There are real costs associated with purchasing this kind of technology and designing the product, but your time is well spent sizing up and confronting your unique situation and digital capabilities. Don’t squander it. The proof is in the pudding, as they say.

  • The Wax and the Wane of the Web

    The Wax and the Wane of the Web

    Just when you think you’ve got everything figured out, it all changes. Just as you start to get the hang of shots, diapers, and regular naps, it’s time for solid foods, potty training, and sleeping through the night. Once those are sorted out, school and scheduled activities come next. The cycle goes on and on.

    The same holds true for those of us working in design and development. Having worked on the web for nearly 30 years at this point, I’ve seen the familiar wax and wane of ideas, techniques, and technologies. Just as we developers and designers settle into a comfortable pattern, a brand-new technology or idea emerges to shake things up and change the landscape entirely.

    How we got here

    I built my first website in the mid-’90s. Design and development on the web back then was a free-for-all, with few established norms. For any layout aside from a single column, we used table elements, often with empty cells containing a single pixel spacer GIF to add empty space. We styled text with numerous font tags, nesting the tags every time we wanted to vary the font style. And we had only three or four typefaces to choose from: Arial, Courier, or Times New Roman. When Verdana and Georgia came out in 1996, we rejoiced because our options had nearly doubled. The only safe colors to choose from were the 216 “web safe” colors known to work across platforms. The few interactive elements (like contact forms, guest books, and counters) were mostly powered by CGI scripts (predominantly written in Perl at the time). Achieving any kind of unique look involved a pile of hacks all the way down. Interaction was often limited to specific pages in a site.
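    For readers who never had the pleasure, here is a rough reconstruction of what that era’s markup looked like: a layout table holding a gutter column open with a stretched one-pixel spacer GIF, and nested font tags styling the text. The specific file names, colors, and widths are made up for illustration; the technique is the one described above.

    ```html
    <!-- Two-column layout, mid-'90s style: a table for structure, a 1x1
         transparent spacer.gif stretched to hold the gutter open, and
         nested <font> tags (no CSS) to style the text. -->
    <table width="600" border="0" cellpadding="0" cellspacing="0">
      <tr>
        <td width="180" bgcolor="#CCCCFF">
          <font face="Arial" size="2">Navigation links here</font>
        </td>
        <td width="20"><img src="spacer.gif" width="20" height="1" alt=""></td>
        <td width="400">
          <font face="Times New Roman" size="3">
            <font color="#333366"><b>Welcome to my homepage!</b></font>
          </font>
        </td>
      </tr>
    </table>
    ```

    Every visual decision lived in the markup itself, which is exactly why maintaining these layouts was so painful.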

    The rise of web standards

    At the turn of the century, a new cycle started. Crufty code littered with table layouts and font tags waned, and a push for web standards waxed. Newer technologies like CSS got more widespread adoption by browser makers, developers, and designers. This shift toward standards didn’t happen accidentally or overnight. It took active engagement between the W3C and browser vendors and heavy evangelism from folks like the Web Standards Project to build standards. A List Apart and books like Designing with Web Standards by Jeffrey Zeldman played key roles in teaching developers and designers why standards are important, how to implement them, and how to sell them to their organizations. And approaches like progressive enhancement introduced the idea that content should be available for all browsers—with additional enhancements available for more advanced browsers. Meanwhile, sites like the CSS Zen Garden showcased just how powerful and versatile CSS can be when combined with a solid semantic HTML structure.

    Server-side languages like PHP, Java, and .NET overtook Perl as the primary back-end languages, and the cgi-bin was tossed into the trash bin. With these more capable server-side languages, the first era of web applications arrived, powered by content-management systems (especially those used for blogs, like Blogger, Grey Matter, Movable Type, and WordPress). In the mid-2000s, AJAX opened the door to asynchronous interaction between the front end and the back end. Pages could now update their content without reloading. A raft of JavaScript frameworks like Prototype, YUI, and jQuery arose to help developers build more reliable client-side interaction across browsers with wildly varying levels of standards support. Techniques like image replacement let crafty designers and developers use fonts of their choosing. And technologies like Flash made it possible to add animations, games, and even more interactivity.

    These new techniques, standards, and capabilities greatly accelerated the industry’s growth. Web design flourished as designers and developers explored more varied styles and layouts. But we still relied heavily on hacks. Early CSS was a huge improvement over table-based layouts for basic layout and text styling, but its limitations at the time meant that designers and developers still leaned on images for complex shapes (such as rounded or angled corners) and on tiled backgrounds for the appearance of full-length columns (among other hacks). Complicated layouts required all kinds of nested floats or absolute positioning (or both). Flash and image replacement for custom fonts were a great start toward varying the typefaces beyond the big five, but both hacks introduced accessibility and performance issues. And JavaScript libraries made it easy to add a dash of interactivity to pages, albeit at the cost of doubling or even quadrupling the download size of simple websites.

    The web as software platform

    The interplay between the front end and the back end continued to grow, which led to the development of the current era of modern web applications. Between expanded server-side programming languages (which kept growing to include Ruby, Python, Go, and others) and newer front-end tools like React, Vue, and Angular, we could build fully capable software on the web. Along with these tools, there were additional options, such as shared package libraries, build automation, and collaborative version control. What was once primarily an environment for linked documents became a realm of infinite possibilities.

    Mobile devices increased in their capabilities as well, and they gave us access to the internet while we were traveling. Mobile apps and responsive design opened up opportunities for new interactions anywhere and any time.

    This fusion of potent mobile devices and potent development tools contributed to the growth of social media and other centralized tools for people to use and interact with. As it became easier and more common to connect with others directly on Twitter, Facebook, and even Slack, the desire for hosted personal sites waned. Social media provided connections on a global scale, with both positive and negative outcomes.

    Want a more extensive history of how we got here, with some other takes on ways that we can improve? Jeremy Keith wrote “Of Time and the Web.” Or check out the “Web Design History Timeline” at the Web Design Museum. Neal Agarwal also gives a fun tour through “Internet Artifacts.”

    Where we are now

    It seems like we’ve reached yet another significant turning point in recent years. As social-media platforms fracture and wane, there’s been a growing interest in owning our own content again. From the tried-and-true classic of hosting plain HTML files to static site generators and content management systems of all kinds, there are many different ways to create websites. The fracturing of social media also comes with a cost: we lose crucial infrastructure for discovery and connection. Webmentions, RSS, ActivityPub, and other IndieWeb tools can be useful in this regard, but they’re still largely underdeveloped and difficult to use for the less geeky. We can build amazing personal websites and add to them regularly, but without discovery and connection, it can sometimes feel like we may as well be shouting into the void.

    Especially with efforts like Interop, browser support for CSS, JavaScript, and other standards like web components has increased. New technologies gain support across the board in a fraction of the time that they used to. When I first learn about a new feature, I frequently check its browser support and discover that coverage is already over 80%. Nowadays, the barrier to using newer techniques often isn’t browser support but simply the limits of how quickly designers and developers can learn what’s available and how to adopt it.

    We can now prototype almost any idea with just a few commands and a few lines of code. All the tools now available make it easier than ever to start something new. But the upfront cost that these frameworks save us in the short term comes due later, as upkeep and maintenance become part of our technical debt.

    If we rely on third-party frameworks, adopting new standards can sometimes take longer, since we may have to wait for those frameworks to adopt those standards. These frameworks, which once helped us adopt new techniques sooner, have in this respect become obstacles. The same frameworks often come with performance costs too, forcing users to wait for scripts to load before they can read or interact with pages. And when scripts fail (whether through poor code, network issues, or other environmental factors), there’s often no fallback, leaving users with blank or broken pages.

    Where do we go from here?

    Hacks of today help to shape standards for the future. And there’s nothing inherently wrong with embracing hacks—for now—to move the present forward. Problems only arise when we refuse to acknowledge that they are hacks or when we choose not to replace them. So what can we do to create the future we want for the web?

    Build for the long haul. Optimize for performance, for accessibility, and for the user. Weigh the costs of those convenient tools. They may make your job a little easier today, but how do they affect everything else? What’s the cost to users? To future developers? To the adoption of standards? Sometimes the convenience is worth it. Sometimes it’s just a hack that you’ve gotten used to. And sometimes it’s holding you back from even better options.

    Start with standards. Standards continue to evolve over time, but browsers have done a remarkably good job of continuing to support older ones. The same isn’t always true of third-party frameworks. Sites built with even the hackiest of HTML from the ’90s still work just fine today. The same can’t always be said of sites built on frameworks even a few years back.

    Design with care. Whether your craft is code, pixels, or processes, consider the effects of each choice. The convenience of many a modern tool comes at the cost of not always understanding the underlying decisions that shaped it and not always considering the impact those decisions can have. Rather than rushing to “move fast and break things,” use the time saved by modern tools to design with more care and consideration.

    Always be learning. If you’re always learning, you’re also growing. Sometimes it’s hard to pinpoint what’s worth learning and what’s just today’s hack. Even if you focused solely on learning standards, you might spend time on something that won’t matter next year. (Remember XHTML?) But continual learning opens up new connections in your brain, and the techniques you learn today may inform the experiments you try tomorrow.

    Play, experiment, and be weird! This web that we’ve created is the ultimate experiment. It’s the single largest human endeavor in history, and yet each of us can create our own pocket within it. Be courageous and try new things. Build a playground for ideas. Make goofy experiments in your own mad-science lab. Start your own small business. There has never been a place with more room to be creative, take risks, and discover what we’re capable of.

    Share and amplify. As you play, experiment, and learn, share what has worked for you. Write on your own website, post on whichever social media site you prefer, or shout it from a TikTok. Write something for A List Apart! But take the time to amplify others too: find new voices, learn from them, and share what they’ve taught you.

    Go forth and make

    As designers and developers for the web (and beyond), we’re responsible for building the future every day, whether that takes the shape of personal websites, social media tools used by billions, or anything in between. Let’s imbue our values into the things that we create. Make that thing that only you are uniquely qualified to make. Then share it, make it better, make it again, or make something new. Learn. Make. Share. Grow. Rinse and repeat. And just when you think you’ve got the web figured out, everything will change.

  • Opportunities for AI in Accessibility

    Opportunities for AI in Accessibility

    I thoroughly enjoyed reading Joe Dolson’s recent article on the intersection of AI and accessibility, particularly his skepticism of AI in general and of the ways many have been using it. In fact, I’m very skeptical of AI myself, despite my role at Microsoft as an accessibility innovation strategist who helps run the AI for Accessibility grant program. Like any tool, AI can be used in very constructive, inclusive, and accessible ways, and it can also be used in destructive, exclusive, and harmful ones. And there are a ton of uses somewhere in the mediocre middle as well.

    I’d like you to consider this a “yes… and” piece to complement Joe’s post. Rather than rebutting his points, I want to call out some areas where I see potential for AI to produce real, positive impact for people with disabilities. To be clear, I’m not saying that there aren’t real challenges or pressing problems with AI that need to be addressed; there are, and we’ve needed to address them, like, yesterday. I just want to take a moment to talk about what’s possible, in hopes that we’ll get there one day.

    Alt text

    Joe’s article spends a lot of time on computer-vision models’ ability to generate alt text. He raises a number of legitimate points about the current state of affairs. And while computer-vision models continue to improve in the quality and accuracy of their descriptions, the results aren’t great yet. As he rightly points out, the current state of image analysis is pretty poor, especially for certain image types, in large part due to the lack of context in which images are analyzed (a result of having separate “foundation” models for text analysis and image analysis). Today’s models also aren’t trained to distinguish between images that are contextually relevant (and should probably have descriptions) and those that are purely decorative (and might not need a description). Still, I think there’s potential in this space.

    As Joe points out, human-in-the-loop authoring of alt text should absolutely be a given. And if AI can jump in to offer a starting point for alt text (even if that first draft makes you say, “What is this BS? That’s not right at all… let me fix it”), I think that’s a win.

    If we can specifically train a model to consider image use in context, it could help us more quickly distinguish between images that are likely decorative and those that likely require a description. That will help clarify which situations call for image descriptions, and it will make authors more effective at making their sites accessible.

    While complex images—like graphs and charts—are challenging to describe in any sort of succinct way (even for humans), the image example shared in the GPT-4 announcement points to an interesting opportunity as well. Let’s say you came across a chart whose alt text merely stated its title and the type of visualization it was: “Pie chart comparing smartphone use to feature phone use in US households making under $30,000 annually.” (That would be a pretty bad alt text for a chart because it would typically leave many questions about the data unanswered, but let’s assume that that was the description in place.) If your browser knew that the image was a pie chart (because an onboard model concluded this), imagine a world where users could ask questions like these about the image:

    • Do more people use smartphones or feature phones?
    • How many more?
    • Is there a group of people that doesn’t fall into either of these buckets?
    • How many is that?

    Pause for a moment. The opportunity to interrogate images and data in this way could be revolutionary for people who are blind or have low vision, as well as for people with various forms of color blindness, cognitive disabilities, and so on (setting aside the challenges of large language model (LLM) hallucinations for the moment). It could also be useful in educational contexts, helping people who can see these charts, as is, to understand the data in them.

    What if you could ask your browser to simplify a complex chart? What if you could ask it to isolate a single line in a line graph? What if you could ask it to transpose the colors of the different lines to work better for the form of color blindness you have? What if you could ask it to swap colors for patterns? Given the chat-based interfaces and the image-manipulation abilities of today’s AI tools, that seems like a real possibility.

    Now imagine a purpose-built model that could extract the information from that chart and convert it to another format. Perhaps it could convert that pie chart (or, better yet, a series of pie charts) into more usable (and useful) formats, like spreadsheets, for instance. That would be incredible!

    Matching algorithms

    When Safiya Umoja Noble titled her book Algorithms of Oppression, she hit the nail on the head. While her book focused on the ways that search engines reinforce racism, I think it’s equally true that all computer algorithms have the potential to amplify conflict, bias, and intolerance. Whether it’s Twitter always showing you the latest tweet from a bored billionaire, YouTube sending us into a Q-hole, or Instagram warping our ideas of what natural bodies look like, we know that poorly authored and maintained algorithms are incredibly harmful. A lot of this stems from a lack of diversity among the people who design and build them. When these platforms are built with inclusivity at their core, though, there’s real potential for algorithm development to help people with disabilities.

    Take Mentra, for example. They are an employment network for neurodivergent people. They employ an algorithm to match job seekers with potential employers based on over 75 data points. On the job-seeker side, it considers each candidate’s strengths, their necessary and preferred workplace accommodations, environmental sensitivities, and so on. On the employer side, it considers each work environment, the communication factors related to each job, and other factors. As a company run by neurodivergent folks, Mentra decided to flip the script on typical employment sites: they use their algorithm to propose available candidates to companies, who can then reach out to the job seekers they’re interested in, reducing the emotional and physical labor on the job-seeker side of things.

    When more people with disabilities are involved in developing algorithms, this can lower the likelihood that these algorithms will harm their communities. That’s why diverse teams are so crucial.

    Imagine that a social media company’s recommendation engine was tuned to analyze who you’re following and to prioritize follow recommendations for people who talked about similar things but who were different in some key ways from your existing sphere of influence. For example, if you were following a bunch of nondisabled white men who talk about AI, it might suggest that you follow people who also talk about AI but who are disabled or aren’t white. If you followed its recommendations, you might gain a wider perspective on what’s happening in the AI field. These same systems should also use their understanding of biases against particular communities—including, for instance, the disability community—to make sure that they aren’t recommending that any of their users follow accounts that perpetuate biases against (or, worse, spew hate toward) those groups.

    Other ways that AI can assist people with disabilities

    I’m sure I could go on and on about using AI to assist people with disabilities, but I’m going to make this last section into a bit of a lightning round. In no particular order:

    • Voice preservation. You may have seen the voice-banking offerings from Microsoft, Acapela, or others, or the announcements of VALL-E or Apple’s Personal Voice for Global Accessibility Awareness Day. It’s possible to train an AI model to replicate your voice, which can be a tremendous boon for people who have ALS (Lou Gehrig’s disease), motor neuron disease, or other medical conditions that can lead to an inability to talk. This tech has the potential to be truly transformative, which is exactly why we need to approach it responsibly: the same capabilities can be used to create audio deepfakes.
    • Voice recognition. Researchers like those working on the Speech Accessibility Project are paying people with disabilities for their help in collecting recordings of atypical speech. As I type, they are actively recruiting people with Parkinson’s and related conditions, and they intend to expand the list as the project progresses. This research will produce more inclusive data sets, which will enable more people with disabilities to use voice assistants, dictation software, and voice-response services, and to control their computers and other devices using only their voices.
    • Text transformation. The most recent generation of LLMs is quite capable of transforming existing text without introducing hallucinations. This is incredibly empowering for people with cognitive disabilities, who may benefit from text summaries or simplified versions of text, or even text that’s been formatted for Bionic Reading.

    The importance of diverse teams and data

    We must recognize that our differences matter. Our lived experiences are shaped by the intersections of the identities we inhabit. These lived experiences—with all their complexities (and joys and pains)—are valuable inputs to the software, services, and societies that we shape. Our differences need to be represented in the data that we use to train new models, and the folks who contribute that valuable information need to be compensated for sharing it. Inclusive data sets yield more robust models that foster more equitable outcomes.

    Want a model that doesn’t demean or patronize or objectify people with disabilities? Make sure that you have content about disabilities that’s authored by people with a range of disabilities, and that it’s well represented in the training data.

    Want a model that doesn’t use ableist language? You may be able to use existing data sets to build a filter that can intercept and remediate ableist language before it reaches readers. That said, AI models won’t be replacing human copy editors for sensitivity reads anytime soon.

    Want a coding copilot that gives you accessible recommendations from the jump? Train it on code that you know to be accessible.


    I have no doubt that AI can and will cause harm—today, tomorrow, and long into the future. But I also believe that we can acknowledge that and, with thoughtfulness, care, and intention, work to reduce that harm over time. Today, tomorrow, and well into the future.


    Many thanks to Kartik Sawhney for supporting the development of this article, Ashley Bischoff for providing me with invaluable editorial support, and, of course, Joe Dolson for the prompt.

  • I am a creative.

    I am a creative.

    I am a creative. What I do is alchemy. It is a mystery. I do not so much do it as let it be done through me.

    I am a creative. Not all creative people like this label. Not everyone sees themselves this way. Some creative people see science in their work. That is their truth, and I respect it. Maybe I even envy them a little. But my process is different; my being is different.

    Apologizing and qualifying in advance is a distraction. It’s what my brain does to sabotage me. I set it aside for now. I can come back later to apologize and qualify. After I’ve said what I came to say. Which is enough.

    Except when it is easy and flows like a river.

    Sometimes it does come easy. Sometimes what I need to make arrives in a flash. I’ve learned not to say it at that moment, because people rarely do the work to accept that the idea is the best idea, even when you know it’s the best idea.

    Sometimes I just work until the idea arrives. Sometimes it arrives right away, but I sit on it for three days before telling anyone. Sometimes I get so excited about an idea that has just come along that I blurt it out, unable to stop myself, like a child who has found a prize in a box of Cracker Jack. Sometimes I get away with this. Yes, that is the best idea, and others agree. But often they don’t, and I regret wasting my joy.

    Passion should be saved for the meeting where it will matter. Not for the informal gathering two gatherings before that meeting. Nobody knows why we hold these gatherings. We keep saying we’re going to get rid of them, but we just keep finding different ways to have them. Sometimes they’re even good. But sometimes they distract from the real work. The ratio of when meetings are valuable to when they are a sad distraction varies depending on what you do and where you do it. And on who you are and how you go about it. But I digress. I am a creative. That is the topic.

    Often, many hours of hard, diligent work end in something that is barely useful. I have to accept that and move on to the next task.

    Don’t ask about the process. I am a creative.

    I am a creative. I have no control over my dreams. And I have no control over my best ideas.

    I can chip away, surround myself with references or images, and sometimes that works. I can go for a walk, which sometimes works. I can be making dinner, with oil sizzling and pots bubbling and no connection to the problem at all. Often I wake up knowing what to do. Almost as often, the idea that might have saved me vanishes in a mindless haze of forgetting as I wake and rejoin the world. For creativity, I believe, comes from that other place. The one we enter in dreams, and perhaps before and after life. But those are questions for theologians, and I am not one of them. I am a creative. The theologians are welcome to build vast edifices in their visionary world, which they insist is real. But that is another distraction. And a sad one. Whether or not I am creative may bear on a much bigger question. But that’s not where I was headed with this.

    Often the outcome is avoidance. And suffering. You know the cliché of the tortured artist? That cliché is accurate, even when the artist is trying to create a soft drink jingle, a callback in a worn-out sitcom, or a budget request.

    Some people who detest the idea of being called creative may be closeted creatives, but that’s between them and their gods. There is no act here. Your truths are valid too. But mine is mine.

    Creatives know creatives.

    Cons know cons, just as queers know queers, just as real rappers know real rappers, and creatives know creatives. We revere the great ones, follow them, practically deify them. Of course, it is terrible to worship any human being. We have been warned. We know better. We know that people are just people. They are clay, like us: they squabble, they get depressed, they regret their most important decisions, they are weak and hungry, they can be cruel, and they can be as ridiculous as we can. But. But. But they make something wondrous. They bring into being something that did not exist before them and might never have existed without them. They are the mothers of invention. And I suppose I should add the fathers of invention, too, to be fair. Okay, that’s done. Moving on.

    We creatives belittle our own small achievements because we compare them with those of the great ones. A wonderful film? I’m no Miyazaki. That is greatness. That is greatness straight out of the mouth of God. This meager little creation of mine? It practically fell off the back of the turnip truck. And the turnips weren’t even fresh.

    A creative knows that they are, at best, a minor talent. Even the Mozarts believe that, too.

    I am a creative. I haven’t worked in advertising in 30 years, but my former creative directors still sit in judgment over my decisions. They are right to do so. When it really counts, my mind goes blank, because I’m too slow and complacent. There is no medication to treat creative block.

    I am a creative. Every project I take on comes with a deadline that would make Indiana Jones look like an old man snoring in a deck chair. The more creative the work, the faster I do it, and the longer I pace and gaze blankly before beginning it.

    I can work ten times faster than those who aren’t creative, and those who have only a dash of creativity in their work. Except that I spend twice as long as they do putting the job off before working ten times as fast. When I finally put my mind to it, I am that confident in my ability to do a fantastic job. I am addicted to the procrastination rush. And the leap still terrifies me.

    I am not an artist.

    I am a creative. Not an artist. Though as a child, I dreamed that one day I would become one. Some of us belittle our gifts and begrudge our own achievements because we are not Michelangelos and Warhols. That is narcissism, but at least we aren’t in politics.

    I am a creative. Despite my belief in reason and science, I make my decisions by gut feel, and live with the consequences of both the triumphs and the disasters.

    I am a creative. Every word I’ve written here may annoy other creatives who see things differently. Ask two creatives a question and you’ll get three answers. However we may think about it, our arguing about it, our passion for it, and our fidelity to our own truth are, at least in my opinion, the surest signs that we are creatives.

    I am a creative. I regret my lack of taste in nearly all the areas of human knowledge, of which I know very little. And I place my taste above all other things in the areas most dear to my soul, or perhaps more precisely, to my passions. Without my passions, I would probably have to spend most of my time looking myself in the eye, which is something almost none of us can do for very long. No, really. Really, no. Because so much in life is unbearable if you really look at it.

    I am a creative. I believe that when I am gone, some of the best parts of me will remain in the mind of at least one other person, the way family does.

    Working keeps me from worrying about my work.

    I am a creative. I fear that my small gift will disappear.

    I am a creative. I’m too busy making the next thing to dwell on that for long, especially since practically nothing I create achieves the level of success I imagine for it.

    I am a creative. I believe there is great mystery in the process. I believe it so strongly that I am actually foolish enough to publish an essay I dictated into a tiny machine without reviewing or editing it. I swear I don’t do this often. But I did it just now, because as afraid as I may be of you seeing through my sad gestures toward the beautiful, I was even more afraid of forgetting what I was saying.

    There. I believe I’ve said it.

  • From Beta to Bedrock: Build Products that Stick.

    From Beta to Bedrock: Build Products that Stick.

    Having worked as a solution designer for longer than I care to admit, I’ve lost count of the times I’ve seen promising ideas turn useless within a few days.

    Financial products, the industry in which I work, are no exception. Because people’s real, hard-earned money is on the line, user expectations are high, and the market is crowded, it’s tempting to throw as many features at the wall as possible and hope something sticks. However, this strategy is a recipe for disaster. Here’s why:

    The drawbacks of feature-first development

    It’s easy to get swept up in the excitement of building new features when you start developing a financial product from scratch, or when you’re migrating existing customer journeys from paper or telephony channels to online banking or mobile apps. You may think, “If I just add one more thing that solves this particular user problem, they’ll love me!” But what happens when a feature gets blocked by your security team? When users simply don’t like it? When a battle-tested feature isn’t adopted as widely as you anticipated, or fails due to unforeseen complexity?

    This is where the concept of the Minimum Viable Product (MVP) comes into play. Jason Fried doesn’t usually call it that, but his podcast Rework and his book Getting Real frequently address the idea. An MVP is a product that offers just enough value to keep your users engaged, but not so much that it becomes difficult to maintain. Although the idea sounds simple, it requires a razor-sharp eye, a ruthless edge, and the courage to stand up for your position, because it is easy to fall for the “Columbo Effect,” where there is always “just one more thing…” to add.

    The problem with most finance apps is that they frequently turn out to be reflections of the company’s internal politics rather than an experience created purely for the customer. Priority goes to delivering as many features and functions as possible to satisfy the requirements and wishes of competing internal departments, as opposed to crafting a compelling value proposition focused on what people in the real world actually want. As a result, these products can very quickly become a jumble of confusing, overlapping, and ultimately disappointing customer experiences: a feature salad, you might say.

    The importance of bedrock

    So what’s a better course of action? How can we create products that are useful, solid, and, most importantly, stick?

    This is where the concept of “bedrock” comes into play. Bedrock is the core of your product that really matters to people. It’s the fundamental building block that creates value and maintains relevance over time.

    In the world of retail banking, where I work, the bedrock lies in and around the standard servicing journeys. People open a new current account once in a blue moon, but they check their existing one daily. They take out a credit card every year or two, but they check their balance and pay their bills at least once a month.

    The key is to identify the core tasks people want to complete, and then relentlessly strive to make them simple, reliable, and trustworthy.

    How do you reach bedrock, though? By embracing the MVP approach, giving clarity top priority, and working toward a distinct value proposition. This means avoiding pointless extras and putting your customers’ core needs first.

    It also requires some nerve, as your colleagues might not always share your vision right away. In some cases, it might even mean making it clear to customers that you won’t be coming over to their house to cook their dinner. And sometimes you may need the occasional piece of “opinionated user interface design” (i.e., a clunky workaround for edge cases) to test a concept, or to give yourself room to work on something more crucial.

    Practical lessons for building financial products that stick

    So what are the main lessons I’ve drawn from my own research and practice?

    1. Start with a clear “why”: What problem are you trying to solve? Who is it for? Make sure your purpose is unmistakable before beginning any work, and make certain it aligns with the goals of your business.
    2. Don’t pile on too many features at once; focus on getting the core right first. Choose one thing that actually adds value, and build from there.
    3. When it comes to financial products, clarity wins over complexity. Eliminate unnecessary detail and concentrate on what matters most.
    4. Embrace constant iteration: bedrock is a dynamic process rather than a fixed destination. Continuously collect customer feedback, make product improvements, and keep advancing in that direction.
    5. Stop, look, and listen: test your product frequently in the field, not just as part of the shipping process. Use it yourself. Run A/B tests. Gather user feedback. Talk to users and make adjustments accordingly.
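
    The advice in point 5 to run A/B tests can be sketched in code. The snippet below is purely illustrative and not from the original post; the function name, experiment names, and variant labels are my own. It shows one common, minimal way to assign users to test variants deterministically, so a returning user always sees the same version without any stored state.

```python
import hashlib

def ab_bucket(user_id: str, experiment: str,
              variants=("control", "treatment")) -> str:
    """Deterministically assign a user to an experiment variant.

    Hashing the user ID together with the experiment name gives each
    user a stable bucket per experiment, so repeat visits see the same
    variant, and different experiments split users independently.
    """
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    return variants[int(digest, 16) % len(variants)]

# A given user always lands in the same bucket for a given experiment.
print(ab_bucket("user-42", "simplified-payments"))
```

    One design note: hashing rather than random assignment means you don’t need a database of assignments, which keeps the experiment logic stateless and easy to audit.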

    The “bedrock dilemma”

    Here lies an intriguing trade-off: sacrificing some short-term growth potential in favor of long-term stability. But the payoff is worth it, because products built with a focus on their core will outlive and outperform their competitors, and provide people with lasting value over time.

    So how do you begin your quest for bedrock? Think of it as a gradual process. Start by identifying the underlying needs your customers actually care about. Focus on developing and refining a single, powerful function that delivers real value. And most importantly, pursue it obsessively, because whether you credit the line to Abraham Lincoln, Alan Kay, or Peter Drucker, you can’t deny it: the best way to predict the future is to invent it.

  • Stranger Things: The First Shadow Teases Season 5 Secrets

    Stranger Things: The First Shadow Teases Season 5 Secrets

    A famous character of the stage once remarked there are more things in heaven and earth than are dreamt of in your philosophy. It’s a truism which holds for our world, as well as that of Hawkins, Indiana. Sure, Lucas, Dustin, Eleven, and the rest of the gang might have faced the Demogorgon in the […]

    The post Stranger Things: The First Shadow Teases Season 5 Secrets appeared first on Den of Geek.

    This article contains full spoilers for every Final Destination movie, INCLUDING Bloodlines.

    For more than a decade, we thought we’d finally made it. It’s been 14 years since the last Final Destination film, the last time Death started killing off those who escaped its plan in exceedingly gruesome fashion. We thought we were free to go to theaters in safety once more. But as the mortician William Bludworth, played by the late great Tony Todd, has taught us, there’s no escaping Death.

    The franchise is back with one of its best entries: Final Destination Bloodlines, written and directed by newcomers to the franchise Zach Lipovsky and Adam Stein. Bloodlines has a shinier look and a different approach, focusing on a family instead of a group of random teens. But it follows the well-established principles of a Final Destination movie, especially in its incredible kills.

    In celebration of Bloodlines bringing Final Destination back to screens, we’re ranking all of Death’s achievements across the franchise. Because Final Destination movies are ultimately about good, gory fun, we’re ranking them from the most boring to the most enjoyably incredible.

    Like Death itself, we do have a few rules here. We aren’t counting any deaths in the premonitions that open each movie, nor the mass casualties that occur in the actual events, which means that you won’t see the infamous pile-up from Final Destination 2 or the incredible tower sequence that opens Bloodlines. Also we’re focusing on Death’s kills, so kills done by human beings don’t count. Even with those restrictions, Final Destination gives us plenty of memorable kills, as Death always makes a show of getting even.

    40. Alex Browning’s Off-Screen Demise (Final Destination 2)

    Is it a mark of respect that the first movie’s protagonist Alex Browning (Devon Sawa) doesn’t die on screen? Or is it the ultimate insult that we learn via newspaper clipping in Final Destination 2 that he was knocked in the head with a brick? Interpretations may vary, but no one can disagree that Alex’s death deserves the bottom spot.

    39. Dennis Lapman Gets a Wrenching Headache (Final Destination 5)

    Played by comedy great David Koechner, paper plant boss Dennis Lapman of Final Destination 5 has one of the gnarliest premonition deaths. Dangling off a collapsing bridge, Dennis almost pulls himself back up when he’s doused with hot tar, burning alive as he lets go and drops to the water. That incredible end makes his actual expiration all the worse, as he goes out when a loose wrench on a shop floor gets hurled into his head, no real setup involved.

    38. Wendy Cristensen, Julie Cristensen, and Kevin Fischer Crash Off-Screen (Final Destination 3)

    With the exception of the original Final Destination, the protagonists end their films thinking they’ve beaten Death only to realize that the Grim Reaper has one more trick up his sleeve, and the movies end with shocking cuts. The worst of them comes in Final Destination 3, one of the weaker entries overall, in which Wendy Cristensen (Mary Elizabeth Winstead), her sister Julie (Amanda Crew), and pal Kevin Fischer (Ryan Merriman) all perish in a train crash.

    Technically we see them meet their end in impressive carnage, but that all happens in a premonition, which this list rules out. So we have to go with the death that happens onscreen—well, on soundtrack, as the movie cuts to black with the sound of the crash.

    37. Janet Cunningham, Lori Milligan, Nick O’Bannon Death By X-Ray Truck (The Final Destination)

    Easily the worst of the series, the fourth entry The Final Destination also ends with a sudden attack on the protagonists. In this case, Nick O’Bannon (Bobby Campo), his love interest Lori Milligan (Shantel VanSanten), and her friend Janet Cunningham (Haley Webb) meet in a coffee shop to celebrate life, only for a truck to crash into the building. It’s a lot like the third movie’s ending, but at least this movie gives us neat x-rays to look at and imagine what horrible things happened to our heroes.


    36. George Lanter and the Very Quiet Ambulance (The Final Destination)

    Played by the great Mykelti Williamson, George Lanter is the only character who acts like a human being in The Final Destination. So it’s a bit lame that the movie kills him off with a gag when he steps onto the road and gets flattened by an oncoming ambulance. He mentions “deja vu” right before it happens because his end is a callback to a similar one from the first film, which we’ll get to shortly. It’s an unimaginative death and a mean joke at the expense of a likable character, which lands it toward the bottom of the list.

    35. Nadia Monroy Makes Nick’s Dream a Reality (The Final Destination)

    For the most part, this list is ignoring both the premonitions and the mass casualties that occur after a premonition. The one exception comes with Nadia Monroy (Stephanie Honoré) of The Final Destination, who dies in the immediate aftermath of a premonition. After Nick has a vision of a massive Nascar wreck, he panics, which gets a group of people kicked out of the race just as the accident begins. As the survivors try to make sense of what happened, a tire flies out of the stadium and through Nadia’s head, replicating her death from the vision.

    34. Perry Malinowski Salutes the Flag (Final Destination 3)

    Final Destination loves its out-of-nowhere surprise kills. A character thinks they’re safe, they make some ironic statement and, bam, they’re immediately dead. Usually, these kills aren’t nearly as funny or clever as the movies think they are, especially compared to the elaborate sequences that have become the franchise’s calling card. One of the worst comes when Perry Malinowski (Maggie Ma) is unceremoniously offed by a loose horse that snaps a flagpole, sending part of it through her chest, a forgettable death for a forgettable character. Horse looks cool though.

    33. Darlene Campbell Stays at the Cabin (Final Destination Bloodlines)

    Although not as meta as, say, a Scream movie, the characters in Final Destination Bloodlines know how Final Destination movies work. To the filmmakers’ credit, that knowledge adds tension to the movie, underscoring how knowing the rules doesn’t give them the power to evade Death. Nowhere is that clearer than at the climax of Bloodlines, when Darlene Campbell (Rya Kihlstedt), a mother who has estranged herself from her children, decides to hide in her own mother’s bunker, thereby stalling Death’s hit list and saving her children. Noble though the sentiment may be, Darlene’s proclamation of love for her children distracts her, and she gets smashed by a falling pole, rendering her heroism moot.

    32. Carter Horton Finally Sees the Sign (Final Destination)

    Played by Kerr Smith, Carter Horton is the onscreen antagonist of the first film, an annoying preppie who bullies Alex and the others and somehow gets to survive. So while we don’t actually see Carter get killed before the screen cuts to closing credits, his demise does rank above those from the third and fourth movies just because we wanted to see this guy get it for so long.

    31. Samantha Lane Has Her Eye on a Stone (The Final Destination)

    The overwhelming majority of Final Destination victims are obnoxious, good-looking teens who mostly deserve to die. Wife and mother Samantha Lane (Krista Allen) certainly isn’t a saint, but she doesn’t irritate us like every other jerk in The Final Destination. So we’re a bit annoyed that she gets such a cruel death, when a lawn mower kicks up a rock that flies through her eye while her young kids watch in horror. The kill does get a few extra points, however, for all of the playfulness before it actually happens, as Death sets up a few options to off Samantha before finally picking the rock.

    30. Ian McKinley Splits the Fair (Final Destination 3)

    The franchise has never done great with its human antagonists, the regular guys who get tired of all the dying and take things into their own hands by killing the other characters. Ian McKinley (Kris Lemche) stands out a little bit more than the others, and so does his death. Instead of showing all the things that could off him, the camera simply follows Ian through a crowd while he rants about his immortality. That’s a bit dull, but it pays off when a firework shoots by him, apparently sparing him, only for the explosion to knock over a cherry picker that splits him in half. That extra beat is enough to make his sudden surprise kill a bit more satisfying.

    29. Stefani and Charlie Reyes in a Logjam (Final Destination Bloodlines)

    Although a bit glossier and a bit kinder to its characters, Final Destination Bloodlines follows the beats of most entries in the franchise. In fact, its final moment, in which protagonists Stefani (Kaitlyn Santa Juana) and Charlie Reyes (Teo Briones) realize that they did not, in fact, stop Death and are about to die, feels like a callback to the infamous log premonition in Final Destination 2. However, Bloodlines ups the stakes with a lucky penny leading to a train derailment. The shot of Stefani and Charlie goes bigger than any of the other movies’ shock endings, though it’s undone somewhat by the cheap effects when two logs come loose from the train car and flatten our heroes.

    28. Sam Lawton and Molly Harper Die in a Callback (Final Destination 5)

    Final Destination 5 has the best ending of the series, in which protagonists Sam Lawton (Nicholas D’Agosto) and Molly Harper (Emma Bell) survive the ordeal and board a plane to celebrate. It’s only then that we realize that the movie has taken place in 2000 and that they’re boarding Flight 180, the one that explodes at the start of the first movie. Thus we have to watch as the characters who have gone through so much die, but we also get to see the original disaster that started it all. Molly splatters when she gets sucked out of the plane and sliced by the wing, but Sam’s death isn’t that spectacular outside of the fact that he burns up in the same manner as Alex did in his vision.

    27. Tod Waggner Hung Out to Dry (Final Destination)

    The first “real” death of the series, Tod Waggner’s (Chad E. Donella) end feels like a first draft of the spectacular kills to come. When water leaks from a toilet, Tod slips in the tub and gets a laundry cord wrapped around his neck. Tod’s desperate attempts to stand up and save himself, frustrated by the slick tub floor, give the death a level of pathos rarely seen in the series, but outside of that, it’s a fairly rote kill for the overall franchise.

    26. Iris Campbell Gets to the Point (Final Destination Bloodlines)

    Bloodlines gives Tony Todd a glorious final scene as Bludworth, but it’s the elderly Iris Campbell (Gabrielle Rose) who tells her granddaughter Stefani the rules of Death’s design. Throughout the exposition dump, the camera points to various classic setups, but Iris catches them all. So when Death does finally take her, using a flying fire extinguisher to send a weathervane point through her face, it’s because Iris wants to show Stefani how Death operates. That intentionality makes Iris’ end stand out, even if it isn’t the most elaborate on this list.

    25. Rory Peters Goes Fencing (Final Destination 2)

    Final Destination 2 has the best premonition in the series, an incredible accident and pile-up filled with ghastly incidents. Toward the climax of the movie, that road destruction gets sort of recreated when a series of events launched by a car crash suddenly kill off other characters. It’s mostly fun, and wide shots let us see Death’s composition, but it’s hard to get too excited when stoner Rory Peters (Jonathan Cherry) gets split into thirds by flying fencing.

    24. Clear Rivers and Eugene Dix Go Up in Flames (Final Destination 2)

    It was a nice reveal to show Clear Rivers (Ali Larter) had survived even the post-credit carnage of the first Final Destination to provide information to the victims of the second film. But that surprise was completely undercut by the film then killing Clear in a sudden hospital explosion, taking teacher Eugene (T.C. Carson), one of the more compelling characters in the movie, out along with her. Multi-victim kills always feel like a bit of a cheat, but at least this one had a nice build-up.

    23. Carter Daniels’ Hate Crime Backfires (The Final Destination)

    The Final Destination‘s unlikable cast goes to the extreme when white supremacist Carter (Justin Welborn) singles out George Lanter as the cause of his wife’s demise. So it’s especially satisfying when Carter, in the midst of burning a cross on George’s lawn, gets dragged behind his truck and burned alive. Carter may not get the most creative of kills, but rarely do we see such an awful person get their full and just reward like that.

    22. Isaac Palmer Meets the Buddha (Final Destination 5)

    Unlike most entries, Final Destination 5 limited its nastiness to one character, and even then, actor P. J. Byrne knows how to find light notes in his depiction of smarmy exec Isaac Palmer. Byrne sleazes it up as Isaac steals a spa coupon from a recently deceased co-worker, leers at the spa workers, and then condescends to the worker who treats him. From then on, it’s a classic Final Destination sequence, as a fallen candle ignites spilled oil to send Isaac pin-first onto the floor, where he crawls away until he inadvertently pulls a Buddha statue down on his head, his karma fully earned.

    21. Kat Jennings and the Jaws of Death (Final Destination 2)

    Nervous wreck Kat Jennings (Keegan Connor Tracy) gets one of the better sudden deaths in the series, largely because Death puts all the pieces in place for a symphony of chaos and then sets it off suddenly. Kat initially survives the car crash, avoiding the pointy pipe that ran through her back window and continues to stick out behind her head. When firefighters use the jaws of life to pry open her car door, however, the impact is enough to set off the airbags, slamming Kat’s head into the spike and setting off more carnage.

    20. Lewis Romero Loses Weight in the Gym (Final Destination 3)

    A lot of the kills on this list are preceded by a character declaring their immortality, but few do it with as much aplomb as Final Destination 3‘s aggro jock Lewis Romero (Texas Battle). Like many others, Lewis responds to Death’s machinations by asserting his own free will… loudly. At the end, he does it while pumping iron in the gym, and his protestations shake the walls, knocking free swords used as part of his team’s decor. The swords cut the bands of his machine as they fall, freeing the weights to smash his head. Given that it was his actions that made the swords drop, Lewis did kind of control his own fate.

    19. Nora Carpenter and the Creepy Hook Hand (Final Destination 2)

    Of all the kills on this list, the death of nervous mom Nora Carpenter (Lynda Boyd) seems the easiest to avoid. Well, at first anyway, when she rushes into an elevator and gets her hair caught on a hook, part of the prosthetic limbs that a creepy guy holds in a box. If Nora just settled down for a moment, or if the creepy guy would put as much effort into untangling her as he does smelling her hair, then she probably could have wrestled free before the elevator decapitated her. All that aside, it’s a pretty amazing and gory kill, one that has enough shock value to overcome any logistical leaps.

    18. Erin Ulmer Gets Nailed in the Head (Final Destination 3)

    The Final Destination movies are big on dying, but not so big on suffering, which is a good thing. We don’t want to think of these people as human beings, because that would ruin the fun of watching them go out. Erin Ulmer’s (Alexz Johnson) end in Final Destination 3 veers a bit too much toward suffering, as the camera holds on her as she moans in her last moments. Up until that point, though, the scene has fun with misdirection, making us think that we’re about to see Ian McKinley get crushed by boards until Erin gets knocked into a nail gun, which perforates the back of her head.

    17. Jonathan Groves Takes a Bath (The Final Destination)

    On one hand, Jonathan Groves (Jackson Walker) feels like he was added to The Final Destination late in production, because the producers found the movie was running a bit too short. Groves does show up in the opening crash scene, but we lose track of him and assume he’s dead until Nick sees him on the news. But we can forgive the shoehorning for the purely absurd way that Groves goes out, with an overfilled bathtub from the hospital floor above crashing down onto his bed.

    16. Nathan Sears and Flight 180’s Landing (Final Destination 5)

    In addition to its fantastic kills, Final Destination 5 also has the most well-rounded characters in the series, characters like junior executive Nathan Sears (Arlen Escarpeta). Nathan is fundamentally a nice guy, but he gets caught up in a dispute with an older union leader, a dispute that ends when the leader accidentally dies during a fight. Thinking that was Death coming for him, Nathan comes to the leader’s wake to pay respects, secure in the belief that Death has skipped him. That assumption adds some pathos to the moment when gear from Flight 180 falls from the sky and crushes him, proof that Death takes the good and the bad alike.

    15. Frankie Cheeks Trapped in the Drive Thru (Final Destination 3)

    Frankie Cheeks (Sam Easton) is one of the most unlikable characters in the franchise (which is saying something) and we don’t even know that he’s dead until after it happens. So why does it rank relatively high on this list? Because of the way it’s set up, looking very much like protagonists Wendy and Kevin are going to get killed in an unbelievable but well-orchestrated drive-through accident. While our heroes escape in time, a collision still occurs, sending a huge engine fan into the back of Frankie’s head. At first it seems like the duo passed their death onto an innocent bystander until we see a bloody necklace in the shape of a naked lady, and we all breathe a sigh of relief that Frankie Cheeks walks the Earth no more.

    14. Tim Carpenter Gets Squished By Glass (Final Destination 2)

    Tim Carpenter may be the weirdest character in the entire series. The script says he’s 15, and actor James Kirk sometimes plays him as a teen and sometimes as an eight-year-old, which ends up feeling like he’s the MadTV character Stuart. That childlike nature leads to Tim’s end when, like a dumb kid, he just decides to chase after some pigeons because… they were there? The pigeons take flight, knocking a giant pane of glass off of a crane and sending the glass on top of Tim, smooshing the little weirdo.

    13. Andy Kewzer Blown Into a Fence (The Final Destination)

    The biggest problem with The Final Destination is its reliance on CG blood, a scourge of 2000s horror. Still, sometimes the kills are so outrageous that we can forgive the poor effects. Such is the case when mechanic Andy Kewzer (Andrew Fiscella) gets blown into a chain link fence. It looks silly when his body collapses into goopy chunks, but the setup is satisfying, as is the sight of him getting blasted out of his garage into the instrument of his doom.

    12. Terry Chaney Hit By a Silent Bus (Final Destination)

    For the first viewers of Final Destination, Terry Chaney (Amanda Detmer) had the standout death. Freaked out by Alex’s talk of Death coming for them all, Terry tells her friends to drop dead, steps into the street and gets splattered by a bus. It’s a funny moment, as long as you don’t think about it for a second (none of her friends have peripheral vision? The bus driver doesn’t see the gesticulating lady backing into the street?), and it got cheers in the theater. Over time, however, the sudden shock death has become a series trope, dulling the impact (pun intended) of Terry’s end.

    11. Howard Campbell Gets a Trim (Final Destination Bloodlines)

    Patriarch Howard Campbell (Alex Zahara) gets the first classic-style death in Bloodlines, and what a glorious one it is. Occurring after the film has clearly laid out Death’s rules and process, the filmmakers luxuriate in the setup, taking time to highlight all of the things that could kill someone in Campbell’s well-appointed suburban backyard: a rake under a ripping trampoline, a shard of glass in an iced drink, a hose about to explode. After several minutes of anticipation, all of those things come together to set off something we never saw coming, an electric self-propelled lawnmower, which runs over the face of the prone Howard.

    10. Billy Hitchcock Loses His Cool, Also His Head (Final Destination)

    Iconic as it may be, Terry’s isn’t the best sudden shock death in the first Final Destination movie. That honor belongs to New York Rangers superfan Billy Hitchcock (Seann William Scott), who also dies without much obvious setup from Death. Billy goes after he and Alex confront the ever-jerky Carter, who decides to defy Death by parking on train tracks. Carter survives, but Billy can’t take it and starts having an angry meltdown, a meltdown cut short when the train kicks up a piece of shrapnel and sends it flying through Billy’s neck.

    9. Valerie Lewton Learns About Kitchen Safety (Final Destination)

    Tod may be the first death in the Final Destination series, but Valerie Lewton (Kristen Cloke) gets the first great death of the franchise. Still shaken up over the explosion of Flight 180, teacher Mrs. Lewton spills some alcohol on the ground while making dinner. When her cooking goes awry, the alcohol ignites, setting her house ablaze. But it’s not the fire that kills her. Rather she dies when she accidentally pulls a knife down from the counter, which embeds itself in her chest.

    8. Evan Lewis Slips on Spaghetti (Final Destination 2)

    Sometimes Death orchestrates events in such an improbable manner that we can almost see a physical hand onscreen, manipulating events. Sometimes dumb people do dumb things and pay for it. It’s the latter event that brings down lottery-winning bro Evan Lewis (David Paetkau) in Final Destination 2, who just tosses a pot of spaghetti out the window. That decision proves disastrous when Death’s meddling leads to a fire in Evan’s apartment. Evan climbs out to make an escape, but he slips on his own spaghetti, which leaves him vulnerable to the falling ladder that pierces his eye.

    7. Brian Gibbons’ BBQ Bomb (Final Destination 2)

    Although it’s a sudden kill with little setup, the death of Brian Gibbons (Noel Fisher) ranks so high because of how funny it is. At the end of the movie, survivors Kimberly Corman (A.J. Cook) and Thomas Burke (Michael Landes) join the Gibbons family at a BBQ where they all let off a bit of steam. No sooner does Brian joke about his and his father’s near-death experience than the grill he’s using explodes, sending his severed arm flying through the air. The arm lands on his mother’s plate, a darkly funny beat that makes it one step better than the average out-of-nowhere kills in the series.

    6. Erik and Bobby Campbell Bond in the Hospital (Final Destination Bloodlines)

    Erik Campbell (Richard Harmon) is truly a unique character in the Final Destination franchise. First of all, he seems to survive his own elaborate death, a hilarious incident in a tattoo parlor (featured heavily in teasers). Secondly, he and his brother Bobby (Owen Patrick Joyner) actually like each other, which makes their end all the more poignant.

    Off of Bludworth’s information, Erik decides to send the highly allergic Bobby into anaphylaxis so he can revive him, thus satisfying Death. But Erik gets too cute with his plan, and his action accidentally turns on and revs up an MRI machine in the room where the brothers are working. The intensified magnetic pull first pulls in and crushes Erik, with his piercings in front and a wheelchair in back, and then snags a coil from a vending machine, sending it through Bobby’s head.

    5. Olivia Castle’s Laser-Guided Fall (Final Destination 5)

    Okay, technically Olivia Castle (Jacqueline MacInnes Wood) dies when she falls out of a window. But that’s not the part that sticks out in our mind. Instead we remember everything before that moment, when Olivia gets laser eye surgery. As if torn from the worst thoughts of anyone about to get the surgery, we watch as Death shorts out the laser while the tech is out of the room and starts burning out Olivia’s eye. No sooner does she escape than she slips on her beloved teddy bear and falls through the window, a somehow merciful end to the suffering.

    4. Ashley Freund & Ashlyn Halperin’s Tanning Session Gone Wrong (Final Destination 3)

    As this list shows, great Final Destination deaths fall into one of three categories: memorably mean, patently absurd, or impeccably designed. Ashley Freund (Chelan Simmons) and Ashlyn Halperin (Crystal Lowe) are the prime examples of the first category. A pair of stock mean mall girls, Ashley and Ashlyn go to their favorite tanning spa, giant-size sodas in hand. Death ups the condensation on the drinks, which creates enough water to short out the beds, which turns up the heat, while a fallen shelf keeps them trapped inside. The sight of them burning alive is nasty enough, but the real kicker is the match cut at the end, which replaces two tanning beds with two coffins.

    3. Julia Campbell Takes Out the Trash (Final Destination Bloodlines)

    Final Destination movies love a good fake-out, and Bloodlines has the best one yet. Armed with knowledge from Iris, Stefani walks down a suburban street with a skeptical Erik, Death’s next probable victim. As the two walk, Stefani points out all of the things that could kill him: leaves from a blower, a soccer ball kicked by kids, a trash compactor. But to Erik’s mocking glee, nothing happens. Nothing, that is, until Erik’s sister Julia (Anna Lore) goes for a run. In the background, and out of focus, all of those things come together to knock Julia into a roadside dumpster, which is then emptied into the garbage truck where Julia is compacted while Stefani watches.

    2. Hunt Wynorski’s Guts in a Pool Pump (The Final Destination)

    The best patently absurd kill in the entire franchise befalls obnoxious bro Hunt Wynorski (Nick Zano). After getting into an altercation with a little kid at a public pool, Hunt sits down to catch some rays when he hears his lucky coin fall into the water. Hunt dives in after it, just as Death starts messing with the equipment, causing the pump to malfunction and raise the pressure. The pump traps Hunt at the bottom and he gestures wildly for help, but no one sees him. Instead of drowning, Hunt gets his guts sucked out through his butt, a kill so wonderful that we don’t even care about the CGI viscera that caps off the scene.

    1. Candice Hooper Doesn’t Stick the Landing (Final Destination 5)

    Easily the most glorious and well-composed kill of the entire franchise occurs early in Final Destination 5, when a standard routine for gymnast Candice Hooper (Ellen Wroe) goes horribly wrong. Director Steven Quale takes the time to show viewers the tools and space in which Death works, highlighting dripping water, a shaking girder, spilled dust, and other elements, before bringing them together as Candice goes through her flips. As a result, we understand every step in the system of catastrophes that leads to a ghastly end, with Candice’s crumpled body shuddering on the gym floor.

    The post Final Destination Kills Ranked from the Short and Sweet to Spectacularly Brutal appeared first on Den of Geek.