Category: Blog


  • The Wax and the Wane of the Web

    The Wax and the Wane of the Web

Just when you think you’ve got it all figured out, everything changes. That’s been my experience, at least. Just as you start to get the hang of feedings, diapers, and regular naps, it’s time for solid foods, potty training, and sleeping through the night. Once you’ve mastered those, it’s time for school and sleepovers. The cycle goes on and on.

The same is true for those of us working in design and development. Having worked on the web for nearly 30 years at this point, I’ve seen the familiar wax and wane of concepts, approaches, and technologies. Just as we developers and designers settle into a comfortable pattern, a brand-new technology or idea emerges to shake things up and reshape the landscape.

How we got here

    I built my first website in the mid-’90s. Design and development on the web back then was a free-for-all, with few established norms. For any layout aside from a single column, we used table elements, often with empty cells containing a single pixel spacer GIF to add empty space. We styled text with numerous font tags, nesting the tags every time we wanted to vary the font style. And we had only three or four typefaces to choose from: Arial, Courier, or Times New Roman. When Verdana and Georgia came out in 1996, we rejoiced because our options had nearly doubled. The only safe colors to choose from were the 216 “web safe” colors known to work across platforms. The few interactive elements (like contact forms, guest books, and counters) were mostly powered by CGI scripts (predominantly written in Perl at the time). Achieving any kind of unique look involved a pile of hacks all the way down. Interaction was often limited to specific pages in a site.

The rise of web standards

At the turn of the century, a new cycle started. Crufty code littered with table layouts and font tags waned, and a push for web standards waxed. Newer technologies like CSS gained more widespread adoption by browser makers, developers, and designers. This shift toward standards didn’t happen accidentally or overnight. It took active engagement between the W3C and browser vendors and heavy evangelism from folks like the Web Standards Project to build standards. A List Apart and books like Designing with Web Standards by Jeffrey Zeldman played key roles in teaching developers and designers why standards are important, how to implement them, and how to sell them to their organizations. And approaches like progressive enhancement introduced the idea that content should be available to all browsers, with additional enhancements for more capable ones. Meanwhile, sites like the CSS Zen Garden showcased just how powerful and versatile CSS can be when combined with a solid semantic HTML structure.

Server-side languages like PHP, Java, and .NET overtook Perl as the primary back-end languages, and the cgi-bin was tossed in the trash bin. With these more capable server-side languages, the first era of web applications arrived, starting with content-management systems (particularly in the blogging space, with tools like Blogger, Grey Matter, Movable Type, and WordPress). In the mid-2000s, AJAX opened the door to asynchronous interaction between the front end and back end. Pages could now update their content without having to reload. JavaScript libraries like Prototype, YUI, and jQuery arose to help developers build more reliable client-side interactions across browsers that had wildly varying levels of standards support. Techniques like image replacement let crafty designers and developers use fonts of their choosing. And technologies like Flash made it possible to add animations, games, and even more interactivity.

These new techniques, standards, and frameworks reenergized the industry in many ways. Web design flourished as designers and developers explored more diverse styles and layouts. But we still relied on a lot of hacks. Early CSS was a huge improvement over table-based layouts when it came to basic layout and text styling, but its limitations at the time meant that designers and developers still relied heavily on images for complex shapes (such as rounded or angled corners) and tiled backgrounds for the appearance of full-length columns (among other hacks). Complex layouts required all manner of nested floats or absolute positioning (or both). Flash and image replacement for custom fonts was a great start toward varying designs from the big five fonts, but both hacks introduced accessibility and performance issues. And JavaScript libraries made it easy to add a dash of interactivity to pages, although at the cost of doubling or even quadrupling the download size of simple websites.

    The web as software platform

    The interplay between the front end and the back end continued to grow, which led to the development of the current era of modern web applications. Between expanded server-side programming languages ( which kept growing to include Ruby, Python, Go, and others ) and newer front-end tools like React, Vue, and Angular, we could build fully capable software on the web. Along with these tools, there were additional options, such as shared package libraries, build automation, and collaborative version control. What was once primarily an environment for linked documents became a realm of infinite possibilities.

    Mobile devices increased in their capabilities as well, and they gave us access to the internet while we were traveling. Mobile apps and responsive design opened up opportunities for new interactions anywhere and any time.

    This fusion of potent mobile devices and potent development tools contributed to the growth of social media and other centralized tools for people to use and interact with. As it became easier and more common to connect with others directly on Twitter, Facebook, and even Slack, the desire for hosted personal sites waned. Social media provided connections on a global scale, with both positive and negative outcomes.

Want a much more extensive history of how we got here, with some other takes on ways that we can improve? Check out “Of Time and the Web” by Jeremy Keith. Or peruse the “Web Design History Timeline” at the Web Design Museum. Neal Agarwal also offers a fun tour through “Internet Artifacts.”

    Where we are now

It seems like we’ve reached yet another significant turning point in the last couple of years. As social-media platforms fracture and wane, there’s been a growing interest in owning our own content again. There are many different ways to create a website, from the tried-and-true classic of hosting plain HTML files to static site generators to content management systems of all varieties. The fracturing of social media also comes with a cost: we lose crucial infrastructure for discovery and connection. Webmentions, RSS, ActivityPub, and other IndieWeb tools can help with this, but they’re still largely underdeveloped and difficult to use for the less geeky. We can build amazing personal websites and add to them regularly, but without discovery and connection, it can sometimes feel like we may as well be shouting into the void.

Especially with efforts like Interop, browser support for CSS, JavaScript, and other standards like web components has increased. New technologies gain support across the board in a fraction of the time that they used to. I frequently learn about a new feature, check its browser support, and discover that its coverage is already above 80%. Nowadays, the barrier to using newer techniques often isn’t browser support but simply the limits of how quickly designers and developers can learn what’s available and how to adopt it.

With a few commands and a few lines of code, we can prototype almost any idea. All the tools that we now have available make it easier than ever to start something new. But the upfront cost that these frameworks save us in the beginning eventually comes due, as their upkeep and maintenance become part of our technical debt.

If we rely on third-party frameworks, adopting new standards can sometimes take longer, since we may have to wait for those frameworks to adopt them. The very frameworks that once helped us adopt new techniques sooner can turn into obstacles. These same frameworks often come with performance costs too, forcing users to wait for scripts to load before they can read or interact with pages. And when scripts fail (whether due to poor code, network issues, or other environmental factors), there’s often no fallback, leaving users with blank or broken pages.

    Where do we go from here?

The hacks of today help shape the standards of tomorrow. And there’s nothing inherently wrong with embracing hacks, for now, to move the present forward. Problems only arise when we’re unwilling to admit that they’re hacks or when we hold onto them long after better, standardized options arrive. So what can we do to create the future we want for the web?

Build for the long haul. Optimize for performance, for accessibility, and for the user. Weigh the costs of those convenient tools. They may make your job a little easier today, but how do they affect everything else? What’s the cost to users? To future developers? To the adoption of standards? Sometimes the convenience is worth it. Sometimes it’s just a hack that you’ve grown accustomed to. And sometimes it’s holding you back from even better options.

Start with the basics. Standards continue to evolve over time, but browsers have done a remarkably good job of continuing to support older standards. Sites built with even the hackiest of HTML from the ’90s still work just fine today. The same can’t always be said of sites built with third-party frameworks even a few years later.

Design with care. Whether your craft is code, pixels, or processes, consider the effects of each choice. The convenience of many a modern tool comes at the cost of not always understanding the underlying decisions that went into its design and not always considering the impact that those decisions can have. Use the time that modern tools save to think more deeply and make decisions with care, rather than rushing to “move fast and break things.”

Always be learning. If you’re always learning, you’re also always growing. Sometimes it’s hard to pinpoint what’s worth learning and what’s just today’s hack. You might even focus solely on learning standards and still end up studying something that won’t matter next year. (Remember XHTML?) But constant learning opens up new connections in your brain, and the techniques you learn one day may inform different experiments down the road.

Play, experiment, and be weird! This web that we’ve built is the ultimate experiment. It’s the single largest human endeavor in history, and yet each of us can create our own pocket within it. Be brave and try something new. Build a playground for ideas. Make odd experiments in your own mad science lab. Start your own small business. There’s no better place to explore, to take risks, and to express our creativity.

Share and amplify. As you experiment, play, and learn, share what’s worked for you. Write on your own website, post on whichever social media site you prefer, or shout it from a TikTok. Write something for A List Apart! But take the time to amplify others too: find new voices, learn from them, and share what they’ve taught you.

Go forth and make it happen.

As designers and developers for the web (and beyond), we’re responsible for building the future every day, whether that takes the shape of personal websites, social media tools used by billions, or anything in between. Let’s imbue our values into everything that we make. Create that thing that only you are uniquely qualified to make. Then share it, improve it, remake it, or make something new from it. Learn. Make. Share. Grow. Rinse and repeat. And just when you think you’ve got it all figured out, everything will change again.

  • Opportunities for AI in Accessibility

    Opportunities for AI in Accessibility

I was genuinely moved by Joe Dolson’s recent piece on the intersection of AI and accessibility, both by the skepticism he has about AI in general and about how many people are using it. I’m quite skeptical of AI myself, despite my role at Microsoft as an accessibility technology strategist helping run the AI for Accessibility grant program. Like any tool, AI can be used in very constructive, inclusive, and accessible ways, and it can also be used in destructive, exclusionary, and harmful ones. And there are a ton of uses somewhere in the murky middle as well.

I’d like you to consider this a “yes… and” piece to complement Joe’s post. I’m not trying to contradict anything he’s saying; rather, I want to provide some context on initiatives and opportunities where AI can make a difference for people with disabilities. I’m not saying that there aren’t real challenges or pressing problems with AI that need to be addressed; there are. I just want to take some time to talk about what’s possible, in hopes that we’ll get there one day.

Alt text

Joe’s article spends a lot of time examining how computer-vision models can generate alternative text. He raises a lot of legitimate points about the current state of things. And while computer-vision models continue to improve in the quality and richness of the detail in their descriptions, their results aren’t great. As he rightly points out, the current state of image analysis is pretty poor, especially for certain image types, in large part because of the lack of surrounding context when interpreting images (a result of having separate “foundation” models for text analysis and image analysis). Today’s models also aren’t trained to distinguish between images that are contextually relevant (and that should probably have descriptions) and those that are purely decorative (and that might not need a description at all). Still, I think there’s potential in this space.

As Joe points out, human-in-the-loop authoring of alt text should absolutely be a thing. And if AI can pop in to offer a starting point for alt text, even if that starting point prompts a reaction of “That’s not right at all… let me rewrite it,” I think that’s a win.

If we could specifically train a model to analyze image usage in context, it could help us more quickly identify which images are likely to be decorative and which likely need a description. That would help clarify which situations call for image descriptions, and it would boost authors’ efficiency in making their sites more accessible.
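As a rough illustration of the kind of contextual signals such a system might weigh, here’s a minimal Python sketch that flags likely-decorative images using cheap heuristics. The signal list, thresholds, and file names are all invented for illustration; a real system would learn these signals from context rather than hard-code them.

```python
from html.parser import HTMLParser

# Filename fragments that often indicate page chrome rather than content.
# This list is an assumption for illustration, not a vetted data set.
DECORATIVE_HINTS = ("spacer", "divider", "bullet", "corner", "gradient")

def likely_decorative(attrs: dict) -> bool:
    """Return True if an <img> looks decorative based on cheap signals."""
    if attrs.get("role") == "presentation":
        return True                      # author already marked it decorative
    if attrs.get("aria-hidden") == "true":
        return True
    src = attrs.get("src", "").lower()
    if any(hint in src for hint in DECORATIVE_HINTS):
        return True                      # filename suggests page chrome
    try:
        w = int(attrs.get("width", "0"))
        h = int(attrs.get("height", "0"))
        if 0 < w <= 4 or 0 < h <= 4:
            return True                  # tiny images are almost always spacers
    except ValueError:
        pass
    return False

class ImgAuditor(HTMLParser):
    """Collect (src, needs_description) pairs from an HTML fragment."""
    def __init__(self):
        super().__init__()
        self.report = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attrs = dict(attrs)
            self.report.append((attrs.get("src", ""),
                                not likely_decorative(attrs)))

auditor = ImgAuditor()
auditor.feed('<p><img src="spacer.gif" width="1" height="1">'
             '<img src="q3-sales-chart.png" width="640" height="480"></p>')
print(auditor.report)
# [('spacer.gif', False), ('q3-sales-chart.png', True)]
```

A trained model would replace the heuristics here with an actual judgment about the surrounding prose, but even a crude first pass like this could help triage which images deserve an author’s attention.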

While complex images, like graphs and charts, are challenging to describe in any sort of succinct way (even for humans), the image example shared in the GPT4 announcement points to an interesting opportunity as well. Let’s say you came across a chart whose alt text merely stated the chart’s title and the kind of visualization it was: “Pie chart comparing smartphone usage to feature phone usage in US households making under $30,000 annually.” (That would be a pretty lousy alt text for a chart, since it would tend to leave many questions about the data unanswered, but let’s assume that was the description in place.) If your browser knew that the image was a pie chart (because an onboard model concluded this), imagine a world where users could ask questions like these about the graphic:

• Do more people use smartphones or feature phones?
    • How many more are there?
    • Is there a group of people that don’t fall into either of these buckets?
    • What number is that?

Setting aside the realities of large language model (LLM) hallucinations, where a model just makes up plausible-sounding “facts,” the opportunity to interrogate images and data in this way could be revolutionary for blind and low-vision folks, as well as for people with various forms of color blindness, cognitive disabilities, and so on. It could also be useful in educational contexts to help people who can see these charts, as is, to better understand the data in them.

What if you could ask your browser to simplify a complex chart? What if you could ask it to isolate a single line in a line graph? What if you could ask your browser to transpose the colors of the different lines to work better for the form of color blindness you have? What if you could ask it to swap colors for patterns? Given the chat-based interfaces we have and our current ability to manipulate images in today’s AI tools, that seems within reach.

Now imagine a purpose-built model that could extract the information from that chart and convert it to another format. For example, perhaps it could convert that pie chart (or, better yet, a series of pie charts) into more accessible (and useful) formats, like spreadsheets. That would be amazing!
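To make the last step of that concrete, here’s a hedged sketch of turning already-extracted chart data into a spreadsheet-friendly format. The extraction itself is the hard, model-driven part; the category labels and figures below are invented placeholders, not real survey data.

```python
import csv
import io

# Hypothetical output of a chart-extraction model for the pie chart
# described above. These numbers are placeholders for illustration only.
extracted = {
    "chart_title": "Phone usage in US households making under $30,000 annually",
    "series": [("Smartphone", 71), ("Feature phone", 23), ("No phone", 6)],
}

def chart_to_csv(chart: dict) -> str:
    """Convert extracted chart data into a CSV string a spreadsheet can open."""
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(["Category", "Percent"])
    for label, value in chart["series"]:
        writer.writerow([label, value])
    return buf.getvalue()

print(chart_to_csv(extracted))
```

Once the data exists in a tabular form like this, it can feed a screen reader, a braille display, or any downstream tool the user prefers, which is exactly why the conversion step matters.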

    Matching algorithms

When Safiya Umoja Noble titled her book Algorithms of Oppression, she hit the nail on the head. While her book focused on how search engines can reinforce racism, I think it’s fair to say that all computer models have the potential to reinforce conflict, prejudice, and intolerance. Whether it’s Twitter always showing you the latest tweet from a bored billionaire, YouTube sending us into a Q-hole, or Instagram warping our ideas of what natural bodies look like, we know that poorly authored and maintained algorithms are incredibly harmful. A lot of this stems from a lack of diversity among the people who shape and build them. But there’s still a lot of potential for algorithms to do good when these platforms are built with inclusion in mind.

Take Mentra, for example. They are an employment network for neurodivergent people. They match job seekers with potential employers using an algorithm that considers over 75 data points. On the job-seeker side, it considers each candidate’s strengths, their necessary and preferred workplace accommodations, environmental sensitivities, and so on. On the employer side, it considers each workplace, the communication environment, and other factors. As a company run by neurodivergent folks, Mentra decided to flip the script on typical employment sites. They use their algorithm to propose available candidates to companies, who can then connect with job seekers they’re interested in, reducing the emotional and physical labor on the job-seeker side of things.

    When more people with disabilities are involved in developing algorithms, this can lower the likelihood that these algorithms will harm their communities. Diverse teams are crucial because of this.

Imagine if a social media company’s recommendation engine were tuned to analyze who you’re following and to prioritize follow recommendations for people who talked about similar things but who were different in some key ways from your existing sphere of influence. For example, if you were to follow a bunch of nondisabled white male academics who talk about AI, it could suggest that you follow academics who are disabled, aren’t white, or aren’t male and who also talk about AI. If you took its recommendations, perhaps you’d gain a more holistic and nuanced understanding of what’s happening in the AI field. These same systems should also use their understanding of biases about particular communities, including, for instance, the disability community, to make sure that they aren’t recommending that any of their users follow accounts that perpetuate biases against (or, worse, spew hate toward) those groups.

    Other ways that AI can assist people with disabilities

I’m sure I could go on and on with examples of how AI could be used to help people with disabilities, but I’m going to make this last section into a bit of a lightning round. In no particular order:

• Voice preservation. You may have seen the voice-banking offerings from Microsoft, Acapela, or others, or the announcements of VALL-E or Apple’s Personal Voice feature around Global Accessibility Awareness Day. It’s possible to train an AI model to replicate your voice, which can be a tremendous boon for people who have ALS (Lou Gehrig’s disease), motor-neuron disease, or other medical conditions that can lead to an inability to talk. This tech has the potential to be truly transformative, but because it can also be used to create audio deepfakes, we need to approach it responsibly.
• Voice recognition. Researchers like those working on the Speech Accessibility Project are compensating people with disabilities for helping to collect recordings of atypical speech. As I type, they are actively recruiting people with Parkinson’s and related conditions, and they plan to expand the effort as the project progresses. This research will enable more people with disabilities to use voice assistants, dictation software, and voice-response services, and to control their computers and other devices using only their voices.
• Text transformation. The current generation of LLMs is quite capable of adjusting existing text without introducing much in the way of hallucination. This can be incredibly empowering for people with cognitive disabilities who may benefit from text summaries or simplified versions of text, or even text that’s been reformatted for bionic reading.

    The importance of diverse teams and data

We must acknowledge that our differences matter. Our lived experiences are shaped by the intersections of the identities we occupy. These lived experiences, with all their complexities (and joys and pains), are valuable inputs to the software, services, and societies that we shape. Our differences need to be represented in the data we use to train new models, and the folks who contribute that valuable information need to be compensated for sharing it. Inclusive data sets yield more robust models, which in turn foster more equitable outcomes.

    Want a model that doesn’t demean or patronize or objectify people with disabilities? Make sure that you include information about disabilities that has been written by people with a variety of disabilities in the training data.

Want a model that avoids ableist language? You could use existing data sets to build a filter that intercepts and remediates ableist language before it reaches readers. That said, when it comes to sensitivity reading, AI models won’t be replacing human copy editors anytime soon.
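A first pass at such a filter can be sketched without any model at all: a lookup of flagged terms with suggested alternatives. The word list below is a tiny invented illustration; a real filter would need a vetted, community-authored term list, context awareness, and human review.

```python
import re

# Illustrative term list only. A production filter would source its terms
# from disability-community style guides, not a hard-coded dict.
SUGGESTIONS = {
    "wheelchair-bound": "wheelchair user",
    "suffers from": "has",
    "crazy": "wild",
}

def flag_ableist_language(text: str) -> list[tuple[str, str]]:
    """Return (flagged term, suggested alternative) pairs found in the text."""
    findings = []
    for term, suggestion in SUGGESTIONS.items():
        if re.search(rf"\b{re.escape(term)}\b", text, flags=re.IGNORECASE):
            findings.append((term, suggestion))
    return findings

print(flag_ableist_language("She is wheelchair-bound and suffers from migraines."))
# [('wheelchair-bound', 'wheelchair user'), ('suffers from', 'has')]
```

This is where an LLM could add value over plain lookup: rewording a flagged sentence in context rather than just substituting a phrase, with a human making the final call.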

Want a coding copilot that gives you accessible recommendations from the jump? Train it on code that you know to be accessible.


I have no doubt that AI has the potential to harm people today, tomorrow, and long into the future. But I also believe that we can acknowledge this and make thoughtful, considered, and intentional changes to our approaches to AI that will reduce harm over time. Today, tomorrow, and well into the future.


    Many thanks to Kartik Sawhney for supporting the development of this article, Ashley Bischoff for providing me with invaluable editorial support, and, of course, Joe Dolson for the prompt.

  • The Ultimate Buyer’s Guide for Transitioning to Fractional CMO Services

    The Ultimate Buyer’s Guide for Transitioning to Fractional CMO Services

Read more about John Jantsch’s The Ultimate Buyer’s Guide for Transitioning to Fractional CMO Services at Duct Tape Marketing.

Contents: Introduction to the Fractional CMO Model · Major Fractional CMO Training Companies · Program Comparison Table · Framework Evaluations · The 4 Fractional CMO Models · Decision-Making Advice · Application Essentials · Making the Final Decision · Conclusion · More Resources

Introduction: The Growing Fractional CMO Landscape. The fractional CMO model offers high-level marketing strategy without the expense of a full-time hire. […]


Introduction: The Growing Fractional CMO Landscape

Without the expense of a full-time hire, the fractional CMO model offers high-level marketing strategy. This guide compares the various training options and advises you on the best course of action.

    Understanding the Fractional CMO Model

A fractional CMO is a part-time executive-level marketing leader. Businesses benefit from cost-effective leadership, and consultants get to play high-value roles.

“A fractional CMO is a part-time marketing executive hired by companies to drive strategy without the cost of a full-time hire.” – Casey Stanton, CMOx

    Major Fractional CMO Training Companies

Duct Tape Marketing

offers the Strategy First Leadership Accelerator, with an emphasis on positioning consultants as strategic leaders.

    CMOx

provides an audit-and-implementation system with an emphasis on tactical execution.

    DigitalMarketer

provides digital execution systems and the Customer Value Journey to engage and convert customers.

Program Comparison Table

Provider | Framework | Client Engagement | Best For | Investment | Community
Duct Tape Marketing | Strategy First | Strategy-based retainer | Strategic Consultants | $9,000+ | ✔ 400+ Experts
CMOx | Functional Marketing | Tactical Plan and Audit | Implementation Experts | $10,000 | ✔ Facebook Group
DigitalMarketer | Customer Value Journey | 90-day Onboarding Retainer | Digital Professionals | Varies | ✔ Certified Partner Network

    Framework evaluations

• Strategy First (DTM): Strategy first, implementation later
• Functional Marketing (CMOx): Audit-driven, systemized execution
• Customer Value Journey (DM): Nurture through a digital sales funnel

The 4 Fractional CMO Models

1. Independent: one consultant per client roster
2. Agency: CMO as a service via account directors
3. Collective: individual consultants under a shared brand
4. Productized company: team-based delivery model

    Decision-Making Advice

• Assess your expertise, niche, and client goals.
• Make sure programs provide systems, resources, and support.
• Avoid rigid programs and overhyped promises.

    Application Essentials

• Define packages: retainer, hybrid, and strategy-only
• Use pricing tiers: $5k–$15k projects, $3k–$15k retainers
• Acquire clients: content, referrals, outreach
• Deliver using phased plans: Assess, Plan, and Execute

    Making the Final Decision

Choose the program that best fits your business goals, your comfort with the framework, and your ROI expectations. Community support is crucial.

“It’s time to stop selling your time and start selling your expertise.” – John Jantsch

    Conclusion

For consultants and agencies, fractional CMO work represents a strategic shift. Choose the right program, implement a consistent framework, and responsibly grow your impact and income.

More Resources

• Books: The Fractional CMO Method, Duct Tape Marketing
• Communities: DTM Network, CMOx Group
• Podcasts: Duct Tape Marketing Podcast, The Fractional CMO Show
• Tools: Strategy First templates, CVJ maps, audit docs
  • From Beta to Bedrock: Build Products that Stick.

    From Beta to Bedrock: Build Products that Stick.

Having worked as a product designer for longer than I care to admit, I’ve lost count of the number of times I’ve seen promising products go from useful to useless in a matter of months.

Financial products, which are my area of expertise, are no exception. Because people’s real, hard-earned money is on the line, user expectations are high, and the market is crowded, it’s tempting to throw as many features at the wall as possible and hope something sticks. However, this strategy is a recipe for disaster. Here’s why.

The pitfalls of feature-first development

When you start building a financial product from scratch, or when you’re migrating existing user journeys from paper or telephony channels to online banking or mobile apps, it’s easy to get swept up in the excitement of building shiny new features. You may think, “If I can just add one more thing that solves this particular user problem, they’ll love us!” But what happens when you hit a roadblock from your security team? Or when a hard-fought feature fails to win users over, or fails because of unanticipated complexity?

This is where the concept of the Minimum Viable Product (MVP) comes in. Jason Fried doesn’t usually refer to it that way, but his podcast Rework and his book Getting Real frequently touch on this idea. An MVP is a product that offers just enough value to keep your users engaged, but not so much that it becomes difficult to maintain. The idea sounds simple, but it takes a razor-sharp eye, a ruthless edge, and the courage to stand your ground, because it’s easy to fall for “the Columbo effect,” where there’s always “just one more thing…” to add.

The issue with most finance apps is that they often end up as reflections of the company’s internal politics rather than experiences designed for the customer. The priority becomes shipping as many features and functions as possible to satisfy the needs and wants of competing internal departments, as opposed to a clear value proposition focused on what people in the real world actually want. As a result, these products can quickly become a muddled mess of confusing, overlapping, and ultimately unlovable customer experiences: a feature salad, you might say.

The importance of bedrock

So what’s a better approach? How can we build products that are useful, stable, and, most importantly, sticky?

This is where the concept of “bedrock” comes in. Bedrock is the core of your product that really matters to people. It’s the fundamental layer that creates value and stays relevant over time.

In the world of retail banking, where I work, the bedrock has to be in and around the everyday servicing journeys. People don’t open a new account every five minutes, but they do check their existing accounts daily. They might take out a credit card every year or two, but they check their balance and pay their bills at least once a month.

The key lies in identifying the core tasks that people need to complete, and then relentlessly striving to make them simple, reliable, and trustworthy.

So how do you reach bedrock? By sticking to the MVP mindset, prioritizing simplicity, and working incrementally toward a clear value proposition. That means skipping the pointless extras and putting your customers first: making the core journeys work brilliantly for them.

It also takes some nerve, as your colleagues might not always buy into your vision right away. And counterintuitively, it can occasionally even mean making it clear to customers that you won’t be coming to their house to make their breakfast. Sometimes you need to use “opinionated user interface design” (i.e., clunky workarounds for edge cases) to test a concept or to buy yourself some more time to work on something else.

Practical strategies for building financial products that stick

So what are the main lessons I’ve learned from my own research and practice?

    1. What trouble are you trying to solve first, and make a distinct “why”? Who is it for? Before beginning any project, make sure your goal is completely clear. Make certain it also aligns with the goals of your business.
2. Avoid the temptation to build too many features at once: focus on one key feature and get it right before moving on to anything else. Choose one that actually adds value, and work from there.
3. When it comes to financial products, clarity often wins over richness. Eliminate unwanted details and concentrate solely on what matters most.
4. Embrace constant iteration, as bedrock is an ongoing process rather than a fixed destination. Continuously collect customer feedback, make improvements to your product, and keep moving toward that foundation.
5. Stop, look, and listen: Don’t just test your product as part of the delivery process; test it consistently in the field. Use it yourself. Run A/B tests. Gather user feedback. Speak to the people who use it and make adjustments accordingly.

    The “bedrock dilemma”

This is an intriguing dilemma: sacrificing some short-term growth potential in favor of long-term stability. But the payoff is worthwhile: products built on bedrock will outlast and outperform their rivals over time and provide users with long-term value.

How do you begin your quest for bedrock, then? Take it gradually. Start by identifying the essential components that your customers actually care about. Focus on developing and improving a single, potent feature that delivers real value. And most importantly, pursue it obsessively, because, whether you attribute the line to Abraham Lincoln, Alan Kay, or Peter Drucker, you can’t deny it: the best way to predict the future is to invent it.

  • Hacks’ Julianne Nicholson Is Clearly Having the Time of Her Life as Dance Mom

    Hacks’ Julianne Nicholson Is Clearly Having the Time of Her Life as Dance Mom

Spoilers ahead for episode 7 of Hacks season 4. Hollywood changes people. But rarely has the glitter and glamour of Tinseltown changed someone more rapidly than Dance Mom on Hacks. In this episode of the fourth season of the beloved comedy on HBO Max (hey, we get to call it “HBO Max” again!), […]

The post Hacks’ Julianne Nicholson Is Clearly Having the Time of Her Life as Dance Mom appeared first on Den of Geek.

Recently a friend mentioned what a pity it is that, generally speaking, there are so few of those earnest “classic” reimaginings now like the ones we had growing up. And after a brief moment of reflection, I agreed. Children and teens of the ’90s were treated to an embarrassment of riches when it came to the Bard and Bard-adjacent pictures. Almost every week seemed to bring another adaptation of William Shakespeare, Jane Austen, or Geoffrey Chaucer, all retrofitted with a wink and a nudge to charm the youths reading those same texts in high school or college.

However, when one considers the breadth of 1990s film beyond “teen movies,” it was more than just the vehicles starring Heath Ledger and Julia Stiles that were receiving the classic treatment. In fact the ’90s, and to a large extent the ’80s as well, was an era ripe with indie studios and Hollywood majors treating classic literature (if largely of the English variety) with the sanctity nowadays reserved for comic books and video games. Some of the most creative or ambitious artists in the industry traded the grittiness of New Hollywood from a decade or two earlier for the even more brutal constraints of corsets and top hats.


We saw some of the most faithful and enduring adaptations of Dickens or Louisa May Alcott making it onto the screen, and Shakespeare was unquestionably bigger business in tinsel town than at any other point during this era. Why is that, and can it happen again? Let’s take a look back at the golden age of period piece costumed dramas and extravagant literary adaptations…

    Helena Bonham Carter in A Room with a View

    Mozart and Merchant Ivory

Since the beginning of the medium, moviemakers have looked back at well-worn and familiar stories for inspiration and audience familiarity. In 1907, Georges Méliès adapted Hamlet into a roughly 10-minute silent short after making his enduring trip to the moon. And of course before Kenneth Branagh, Laurence Olivier had Hollywood falling in love with the Bard… at least as long as it was Larry in the tights.

Even so, literary adaptations were often constrained, particularly in Hollywood, where filmmakers had to contend with the censorship of the Hays Code and preconceived notions about what an American audience would enjoy. The most well-known costumed dramas therefore tended to be vanity projects or something with a more sensational hue, like biblical or swords-and-sandals epics.

So it’s difficult to point to an exact moment where that changed in the 1980s, yet we’d hazard to suggest the close-together Oscar seasons of 1984 and 1986 had a lot to do with it. After all, the first was the year Miloš Forman’s Amadeus won Best Picture, and the second was the year James Ivory and Ismail Merchant’s lush adaptation of E. M. Forster’s A Room with a View cemented our conception of what a “Merchant Ivory” film could be. Though Forster scholars consider the novel one of the author’s slighter works, the film had critics like Roger Ebert swooning that it was a masterpiece.

In the case of Amadeus, the director of One Flew Over the Cuckoo’s Nest (1975), a zeitgeist-shaping portrait of modern oppression and control from about a decade earlier, was taking the story of Mozart and turning it into a punk rock tragicomedy. Based on a Peter Shaffer play of the same name, Forman and Shaffer radically reimagined the story, making it both funnier and darker as Forman strove to pose Mozart as a modern-day rebel iconoclast, with a wig resembling Sid Vicious as much as the Age of Enlightenment. Perched atop Tom Hulce’s giggling head, it signaled a movie that had all the trappings of melodrama but felt accessible and exciting to a wide modern audience.

It went on to do quite well and win Best Picture. While not the first period film to do so, it was the first in a long while set in what could be construed as the distant past (Richard Attenborough’s Gandhi won the year before, but that was based on subject matter within the living memory of most Academy voters). Otherwise, most of the recent winners were dramas or dramedies about the modern world: Kramer vs. Kramer (1979), The Deer Hunter (1978), and Annie Hall (1977). They reflected an audience that wanted to escape the artificiality of their parents’ films, which in the U.S. associated historical costumes with the (grand) phoniness of Ben-Hur (1959) or Oliver! (1968).

However, it was the release of the British masterpiece A Room with a View a few years later that confirmed this was the start of a popular trend. To be sure, the partnership of Merchant and Ivory had been going for more than 20 years by the time they got to adapting Forster, including on several other costumed dramas and period pieces. However, those films were mixed in with modern comedies and dramas like the rock ’n’ roll-infused The Guru (1969) and Jane Austen in Manhattan (1980). Importantly, all of these movies tended to be art house pictures, small chamber pieces made for a select audience.

Yet as the marketing campaign would later trumpet about A Room with a View—the ethereal romantic dramedy which introduced Daniel Day-Lewis and a fresh-faced Helena Bonham Carter to the U.S.—this movie had the “highest single theatre gross in the country”! (It’s fun to recall a time when a movie could be a hit in New York if it were just selling out every day.) The film’s combination of Forster’s wry satire and cynicism about English aristocracy in the late Victorian and early Edwardian era, coupled with the sweeping romance of Puccini arias and Tuscan countrysides, made it a massive success.

It also defined what became the “Merchant Ivory” period piece forever after, including in future Oscar and box office darlings like the Anthony Hopkins, Emma Thompson, and Carter-starring Howards End (1992), and Hopkins and Thompson’s reunion in The Remains of the Day (1993). These were all distinctly British and understated films, with Remains an outright tragedy delivered in a hushed whisper, but their success signaled to Hollywood that there was gold up in ’em hills. And soon enough, more than just Forman on the American side was going up there to mine it.

    Wes Studi in Last of the Mohicans
    20th Century Studios

    Martin Scorsese, Michael Mann, and the Auteur’s Costumed Drama

In 1990, Michael Mann was one of the hottest creatives working in Hollywood. As executive producer and sometime director of NBC’s edgy (by ’80s standards) police drama Miami Vice, he played a direct hand in proving American television could be “gritty” and artistic. Even the episodes he didn’t helm were defined by the standards he insisted upon—such as never putting cool guys Crockett and Tubbs in a red or brown car. It would conflict with the neon-light-on-celluloid aesthetic Mann crafted for the series.

As that series was winding down by 1990, Mann was more in demand than ever to make any film project he might have wanted—something perhaps in keeping with Vice or the gritty crime thrillers he’d made in the ’80s, like serial killer thriller Manhunter (1986). Instead he sought to adapt a childhood favorite for the screen: James Fenimore Cooper’s 19th-century American frontier novel The Last of the Mohicans. Certainly a problematic text in its original form, with its imperial-fantasy riff on the French and Indian War (or Seven Years’ War) in which Indigenous tribes in what is now upstate New York were reduced to either noble or cruel savage stereotypes, the novel nonetheless proved a jumping-off point for Mann to craft a gripping, primal, and prestigious film.

He also made a movie that far exceeded its source material, The Last of the Mohicans being an often wordless opera of big emotions played in silence by Day-Lewis, Madeleine Stowe, and Wes Studi, all while Trevor Jones and Randy Edelman’s musical score looms like thunderclouds across the mountainous landscape. It is a beautiful drama and an elevated action film, and it did more business in the United States than Tom Cruise’s A Few Good Men and Disney’s Beauty and the Beast. It also would create a precedent we’d see followed time and again throughout the rest of the decade.

Some of the biggest and most respected filmmakers of the moment, many of them celebrated under auteur theory, were turning to literary classics for an audience that craved them. One of Martin Scorsese’s most ambitious and underappreciated films is 1993’s The Age of Innocence, which he adapted from an Edith Wharton novel.

It’s a story that Scorsese argues is just as brutal, if not more so, than his gangster pictures. The Age of Innocence, in fact, remains the best film depiction of the Gilded Age in cinema. It captures the lush pageantry of the wealthiest New Yorkers’ heyday as well as how class and sectarian prejudice curdled into ruthless tribalism, ultimately dooming the romantic yearnings of one conformist attorney (once again Daniel Day-Lewis) and his would-be divorcée lover (Michelle Pfeiffer).

It might not have been a hit in its time, but Ang Lee’s U.S. breakout a year later definitely was. The Taiwanese filmmaker was already the toast of international and independent cinema via movies like The Wedding Banquet (1993) and the martial arts-adjacent Pushing Hands (1991), but it was when he directed a flawless adaptation of Jane Austen’s Sense and Sensibility in 1995 that he became a Hollywood favorite who would soon get movies like Crouching Tiger, Hidden Dragon (2000) and Hulk (2003) greenlit. Sense and Sensibility also benefits greatly from a fantastic ensemble cast in Emma Thompson, Hugh Grant, Kate Winslet, and Alan Rickman. It likewise captured the sophisticated satirical and melancholic underpinnings of Austen’s pen that most previous Hollywood adaptations never scratched.

It established a standard by which the best Austen adaptations are measured to this day, whether it be Joe Wright and Keira Knightley’s cinematic adaptation of Pride and Prejudice or Netflix’s more recent Persuasion starring Dakota Johnson.

    Lucy in Bram Stoker's Dracula
    Columbia / Sony

    A Dark Universe of Gods and Monsters

Right before Columbia Pictures approved Scorsese’s The Age of Innocence, and later Gillian Armstrong’s still delightful (and arguably definitive) 1994 interpretation of Little Women, the same studio signed off on its first period piece of the era with Winona Ryder attached to star. And it was Dracula.

Bram Stoker’s Dracula was Francis Ford Coppola’s wacky and magnificent reimagining of Stoker’s definitive Victorian novel, a project rivals snickered at as an act of hubris. Published in 1897 with on-the-nose metaphors for London society’s anxieties over foreigners, sexual promiscuity and disease, and the so-called “New Woman” entering the professional classes, the well-worn and much-adapted vampire novel still held potential that Coppola could see. He also correctly predicted there was a box office hit in bringing all those elements out in an exciting and anachronistic fever dream for the MTV generation.

Whether you love or hate Coppola’s looseness with Stoker’s novel (pretty audacious given the author’s name is in the title), the film stands as one of the most lavish and expensive depictions of Victorian society ever put onscreen, winning costume designer Eiko Ishioka an Oscar for the effort. Coppola also made an unexpected holiday hit that played like bloody gangbusters alongside Home Alone 2 and Aladdin that winter.

It set a template for what can in retrospect be regarded as a pseudo-“dark universe” of classic literary monsters getting ostensibly faithful and expensive Hollywood adaptations. Coppola himself produced Kenneth Branagh’s Mary Shelley’s Frankenstein (1994), a film that is actually in many ways closer to the thematic letter of its author than Bram Stoker’s Dracula ever was. It was also a lesser movie that flopped, though it looked spectacular as the only major Frankenstein film to remember that Shelley set her story during the Age of Enlightenment in the late 18th century.

Despite Frankenstein’s failure, Tom Cruise and Neil Jordan found success adapting Anne Rice’s Interview with the Vampire that same year. The book admittedly was recent, having been published in 1976, but the story’s roots and setting in 18th and 19th century bayou occultism were not. In a scene dripping with homoeroticism, the actor who played Top Gun’s Maverick would sink fangs into a young Brad Pitt’s neck.

This trend continued throughout the ’90s with some successes, like Tim Burton’s wildly revisionist (and Coppola-produced) Sleepy Hollow in 1999, and some misses. For instance, do you remember that Julia Roberts, at the height of her stardom, appeared in a revisionist take on Robert Louis Stevenson’s The Strange Case of Dr. Jekyll and Mr. Hyde in which she played the not-so-good doctor’s maid? It’s called Mary Reilly (1996), by the way.

    Denzel Washington and Keanu Reeves in Much Ado About Nothing
    The Samuel Goldwyn Company

    Shakespeare’s Resurrection

Of course when talking about classic literature and storytelling, one name rises above most others in the schools and curriculums of the English-speaking world. Yet curiously it was only in the 1990s that someone really hit on the idea of making a movie based directly on the Bard and tailored almost exclusively for that young demographic: Baz Luhrmann, who in 1996 reconfigured the tragedy of Romeo and Juliet into the visual language of MTV. He even altered the title to William Shakespeare’s Romeo + Juliet.

    That proved the tip of an anachronistic iceberg whose cast included Leonardo DiCaprio at the height of his heartthrob powers as Romeo and real-life teenager Claire Danes as his Capulet amore. With hyper music video editing and frenetic neon-hued melodrama, they created a Neverland composite of Miami, Rio de Janeiro, and the nightly news. Some older scholars viewed Luhrmann’s anachronisms as an abomination, but as a Millennial, I can attest we loved this thing back in the day. Many still do.

Shakespeare’s ’90s ascent did not begin at the top of the box office, however. When the decade began, Franco Zeffirelli, the helmer of another era’s cinematic Romeo and Juliet classic, attempted to make Hamlet exciting for “kids these days” by casting Mel Gibson, right in the midst of his Lethal Weapon popularity, as the indecisive Dane. It’s difficult today to remember that Gibson was a heartthrob of sorts in the 1980s and early 1990s, a star deemed worthy of dashing leading man roles.

Nonetheless, there is quite a bit to like about Hamlet (1990) if you can look past Gibson’s off-screen behavior in the following decades, or the fact that Zeffirelli cuts what is a four-hour play down to less than 2.5 hours. Gibson actually makes for a credible and genuinely mad Hamlet (perhaps not a surprise now), and Zeffirelli mines the medieval melancholy of the story well with production design, costumes, and location shooting at real Norman castles. Helena Bonham Carter also remains one of the best screen Ophelias. Hamlet (1990) would eventually be overshadowed, though, both by Gibson’s awful behavior and by a much grander and more bombastic adaptation from the man who became the King of Shakespeare Movies in the ’90s: Kenneth Branagh.

Aye, Branagh might get the most credit for the Shakespearean revival in this era, starting with his 1989 adaptation of Henry V, which featured Derek Jacobi, Brian Blessed, and of course his future wife (and ex), Emma Thompson. Together the pair would mount what is in this writer’s opinion the best film ever based on a Shakespeare play, the divine and breezy Much Ado About Nothing (1993), a perfect encapsulation of perhaps the first romantic comedy ever written, with Branagh and Thompson as the sharp-tongued, dueling lovers Benedick and Beatrice. It also features Denzel Washington as a dashing Renaissance prince, Kate Beckinsale in her breakout role, and a gloriously over-the-top score by Patrick Doyle.

Branagh’s subsequent 1990s efforts would be defined by whether they went off the rails, as in the aforementioned Frankenstein, or right back onto them, as in the 70mm-filmed, ultra-wide and sunny adaptation of Hamlet he helmed in 1996. Avoiding the psychological and Freudian interpretations of the Danish prince chased by Olivier and Zeffirelli, Branagh turns Hamlet into a romantic hero spearheading an all-star ensemble cast. Hamlet (1996) is indulgent at its full four-hour length. Yet somehow that befits the material. Branagh would also star as Iago in Oliver Parker’s Othello (1995) opposite Laurence Fishburne and reconfigure the Bard as a musical in his own directorial effort, Love’s Labour’s Lost (2000).

By the end of the decade, Julie Taymor’s Titus (1999) and A Midsummer Night’s Dream (1999), in which Kevin Kline turns into an ass and is doted on by Michelle Pfeiffer’s fairy queen, had paved the way for ever more unconventional Shakespeare films.

    Paul Rudd and Alicia Silverstone in Clueless
    CBS via Getty Images

The Birth of the Teenage Shakespeare Remix (and Austen, Chaucer, and Others)

As popular as the Shakespeare movie became in the ’90s, what’s curiously unique about this era is the simultaneous rise of movies that adapted either the Bard or other highly respected literary writers and turned them into a pure teenage dream. We’re talking about moving past modernizing Romeo and Juliet like Luhrmann did, or repurposing it for New York street gangs like Leonard Bernstein and Stephen Sondheim did with West Side Story.

These were straightforward, unapologetic youth movies that also served as clever remixes of classic storytelling. Among the best directly derived from Shakespeare is the movie that made Julia Stiles and Heath Ledger Gen-X icons, 10 Things I Hate About You (1999), a happily campy update of The Taming of the Shrew set in a fairytale high school also populated by future Christopher Nolan favorites like Joseph Gordon-Levitt and David Krumholtz. In fact, Stiles would do this kind of remix a few more times: in the more serious-faced modernization of Hamlet (2000), the third Hamlet film in ten years, this one set in turn-of-the-century NYC, and in O (2000), an Othello update that starred Mekhi Phifer as a tragically distrustful high school sports star instead of a warrior.

Ledger also returned to the concept by way of another, even older literary giant, the medieval poet Geoffrey Chaucer, in A Knight’s Tale (2001), an anachronistic blending of the medieval and modern where peasants grooved in the jousting tournament stands to Queen. There was also the strange attempt to turn Pierre Choderlos de Laclos’ Dangerous Liaisons from 1782 into an erotic thriller for teens (the ’90s were weird, huh?) via 1999’s lusty Cruel Intentions.

However, easily the best of these remains Amy Heckerling’s Clueless (1995), a pitch perfect transfer of Jane Austen’s Emma from the Regency period to a fairytale version of 1990s Beverly Hills. Cher (Alicia Silverstone), a charmed SoCal princess so well-intentioned in her matchmaking mischief, defies any attempt to detest her entitlement or vanity, never chasing trends but simply inventing her own. You’re even somehow low-key chill that the happy ending is her getting together with her former stepbrother (Paul Rudd). It’s a classic!

    And the Rest

There are many, many more examples we could examine from this era. They include the sublime, like Winona Ryder, Claire Danes, and Kirsten Dunst in the 1994 film Little Women, and the dire, like the Demi Moore and Gary Oldman-led The Scarlet Letter (1995). There were more plays adapted, a la Arthur Miller’s The Crucible (again with Ryder and Day-Lewis!), and then those that just had some fun with playwrights, as seen in the over-celebrated Shakespeare in Love (1998). Mel Gibson even presaged the sword and sandals resurgence of 2000 by going completely medieval (and ahistorical) on the costumed drama in Braveheart (1995).

    More than a few of these won Best Picture Oscars as well, including Braveheart, Shakespeare in Love, and James Cameron’s little 1997 movie you might have heard about elsewhere: Titanic. And yet, by and large, this kind of film has vanished. Once in a while one comes along that still works, such as Greta Gerwig’s own revisionist interpretation of Little Women. That beautiful film was a good-sized hit in 2019, but it did not exactly usher in a new era of literary adaptations.

These projects are currently largely relegated to long-form streaming series, just like everything else that studio bean counters don’t consider four-quadrant intellectual property. Which in some cases is fine. Many would argue, for instance, that the best version of Pride & Prejudice remains the BBC production, and in my opinion they’d be right. But whether it is original period piece films or adaptations, unless you’re Robert Eggers (who arguably isn’t making films for the same mainstream sensibility the likes of Gerwig or, for that matter, Coppola were), period piece storytelling and “great adaptations” have been abandoned to the small screen and full-on wish fulfillment anachronisms like Bridgerton.

This seems due to studios increasingly eschewing anything that isn’t reliably based on a brand that middle-aged adults loved. In that case, it might be worthwhile to remind them that children of the 1990s are getting older and having children of their own. There may again be a market, beyond the occasional Gerwig swing or Eggers take on Dracula, for classic stories: a new audience being raised to want modern riffs on tales that have endured for decades and centuries. And conveniently, these tales are mostly in the public domain. Recent original hits like Sinners also suggest you don’t even need a classic story to connect with audiences. So perhaps once again, the play’s the thing in which they can catch the conscience of the… consumer? Or something like that.

    The post The 1990s Were a Golden Age for Period Piece Movies and Literary Adaptations appeared first on Den of Geek.

  • Love, Death + Robots Producers Reveal the Season 4 Episode Written for Zack Snyder

    Love, Death + Robots Producers Reveal the Season 4 Episode Written for Zack Snyder

In the fourth season of Love, Death + Robots, naked warriors battle atop dinosaurs, mysterious crabs invade, and despotic felines scheme. Netflix’s genre-blending animated anthology series skillfully highlights science fiction’s versatility with stories that embrace horror, comedy, melodrama, and other label-defying tales. It’s a rare illustration […]

The post Love, Death + Robots Producers Reveal the Season 4 Episode Written for Zack Snyder appeared first on Den of Geek.

    Recently a friend mentioned how much of a pity it was that, generally speaking, there are few of those secret” typical” reimaginings now like the ones we had growing up. And after a brief moment of reflection, I agreed. Children and teens of the ‘ 90s were treated to an shame of treasures when it came to the Bard and Bard-adjacent pictures. Almost every week appeared to feature a new development of William Shakespeare, Jane Austen, or Geoffrey Chaucer, all done with a smile and a push to appeal to teenagers who were reading the same writings in higher school or university.

    But then when looking back at the push of 1990s film beyond simply “teen movies”, it was more than just Julia Stiles and Heath Ledger vehicles that were getting the traditional treatment. In fact the ‘ 90s, and to a large extent the ‘ 80s as well, was an era ripe with indie studios and Hollywood majors treating classic literature ( if largely of the English variety ) with the sanctity nowadays reserved for comic books and video games. Some of the most creative or ambitious artists in the industry fought against the sluggishness of New Hollywood from a decade or two earlier in favor of the even more brutal constraints of corsets and top hats.

    cnx. cmd. push ( function ( ) {cnx ( {playerId:” 106e33c0-3911-473c-b599-b1426db57530″, }). render ( “0270c398a82f44f49c23c16122516796” ), }),

    We saw some of the most faithful and enduring adaptations of Austen or Louisa May Alcott making it onto the screen, and Shakespeare was arguably bigger business in tinsel town than at any other point during this time. Why is that and can it happen again? Let’s take a look back in time to the era of period-themed dramas and extravagant literary adaptations…

    Helena Bonham Carter in A Room with a View

    Mozart and Merchant Ivory

    Since the beginning of the medium, moviemakers have looked back at well-worn and familiar stories for inspiration and audience familiarity. In 1907, Georges Méliès adapted Hamlet into a roughly 10-minute silent short after making his enduring trip to the moon. And of course before Kenneth Branagh, Laurence Olivier had Hollywood falling in love with the Bard… at least as long it was Larry in the tights.

    Even so, literary adaptations were frequently constrained, particularly in Hollywood, where directors had to contend with the Hays Code’s restrictions on censorship and preconceived ideas about what American audiences would find appealing. The most popular costumed dramas tended to therefore be vanity projects or something of a more sensational hue—think biblical or swords and sandals epics.

    So it’s difficult to point to an exact moment where that changed in the 1980s, yet we’d hazard to suggest the close together Oscar seasons of 1984 and 1986 had a lot to do with it. After all, James Ivory and Ismail Merchant’s lush adaptation of E. M. Forster’s A Room with a View was the catalyst for our conception of what a” Merchant Ivory” film could be. Milo Forman’s Amadeus won Best Picture in the latter year. Considered by Forster scholars one of the author’s slighter works, the film had critics like Roger Ebert swooning that it was a masterpiece.

    In the case of Amadeus, the director of One Flew Over the Cuckoo’s Nest ( 1975 ), a zeitgeist-shaping portrait of contemporary oppression and control from about a decade earlier, was adapting Mozart’s life story into a punk rock tragicoma. Based on a Peter Shaffer play of the same name, Forman and Shaffer radically reimagined the story, making it both funnier and darker as Forman strove to pose Mozart as a modern day rebel iconoclast with his wig resembling as much Sid Vicious as the Age of Enlightenment. Located atop Tom Hulce’s giggling head, it signaled a movie that had all the trappings of melodrama but felt accessible and exciting to a wide modern audience.

    It then continued to do relatively well and win Best Picture. While not the first period film to do so, it was the first in a long while set in what could be construed as the distant past ( Richard Attenborough’s Gandhi won the year before but that was based on a subject matter in the living memory of most Academy voters ). Otherwise, the majority of recent winners were dramas or dramedies about contemporary life, like Annie Hall ( 1977 ), Kramer vs. Kramer ( 1979 ), and The Deer Hunter ( 1978 ). They reflected an audience that wanted to get away from the artificiality of their parents ‘ cinema, which in the U. S. associated historical costumes with the ( grand ) phoniness of Ben-Hur ( 1959 ) or Oliver! ( 1968 ).

    However, the British masterpiece A Room with a View, which established this was the start of a popular trend, was released a few years later. To be sure, the partnership of Merchant and Ivory had been going for more than 20 years by the time they got to adapting Forster, including with several other costumed dramas and period pieces. However, those movies were paired with contemporary comedies and dramas like Jane Austen in Manhattan in 1980 and The Guru in 1969. More importantly, all of these films tended to be art house pictures, small chamber pieces intended for a limited audience.

    Yet as the marketing campaign would later trumpet about A Room with a View—the ethereal romantic dramedy which introduced Daniel Day-Lewis and a fresh-faced Helena Bonham Carter to the U. S. —this movie had the “highest single theatre gross in the country”! ( It’s fun to recall a time when a movie could be a hit in New York if it were just selling out every day. ) The film’s combination of Forster’s wry satire and cynicism about English aristocracy in the late Victorian and early Edwardian era, coupled with the sweeping romance of Puccini arias and Tuscan countrysides, made it a massive success.

    It also defined what would become the” Merchant Ivory” period piece forever after, including in upcoming Oscar and box office superstars like Howard ( 1992 ), Hopkins and Thompson’s reunion in 1993’s The Remains of the Day, and others. These were all distinctly British and understated pictures, with Remains being an outright tragedy delivered in a hushed whisper, but their relative success with a certain type of moviegoer and Academy voter signaled to Hollywood that there was gold up in’ em hills. And soon enough, more than just Forman on the American side was going up there to mine it.

    Wes Studi in Last of the Mohicans
    20th Century Studios

    Martin Scorsese, Michael Mann, and the Auteur’s Costumed Drama

    In 1990, Michael Mann was one of the hottest creatives working in Hollywood. As the executive producer and sometime-director on NBC’s edgy (by ‘80s standards) police drama, Miami Vice, he played a direct hand in proving American television could be “gritty” and artistic. Even the episodes he didn’t helm were defined by the standards he insisted upon—such as never putting cool guys Crockett and Tubbs in a red or brown car. It would clash with the neon-light-on-celluloid aesthetic that Mann developed for the series.

    As that series was winding down by 1990, Mann was more in demand than ever to make any film project he might have wanted—something perhaps in keeping with Vice or the gritty crime thrillers he’d made in the ’80s like serial killer thriller Manhunter (1986). Instead he sought to adapt a childhood favorite for the screen: James Fenimore Cooper’s 19th century American frontier novel, The Last of the Mohicans. Certainly a problematic text in its original form, with its imperial-fantasy riff on the French and Indian War (or Seven Years War) where Indigenous tribes in what is today upstate New York were reduced to either noble or cruel savage stereotypes, the text proved a jumping off point for Mann to craft a gripping, primal, and prestigious film.

    He also made a movie that far exceeded its source material, with The Last of the Mohicans being an often wordless opera of big emotions played in silence by Day-Lewis, Madeleine Stowe, and Wes Studi, all while Trevor Jones and Randy Edelman’s musical score looms like thunderclouds across the mountainous landscape. It is an elevated action movie and a beautiful drama that did bigger business in the U.S. than Disney’s Beauty and the Beast and the Tom Cruise vehicle A Few Good Men in the same year. It also would create a precedent we’d see followed time and again throughout the rest of the decade.

    Some of the biggest and most respected filmmakers of the moment, many of them praised under auteur theory, were looking to literary classics for an audience that craved them. After the one-two genre punch of Goodfellas (1990) and Cape Fear (1991), Martin Scorsese made one of his most ambitious and underrated films: a stone-cold 1993 masterpiece inspired by an Edith Wharton novel, The Age of Innocence.

    It’s a story that Scorsese argues is just as brutal, if not more so, than his gangster pictures. Indeed, The Age of Innocence remains the best cinematic representation of the Gilded Age in the U.S., capturing the lush pageantry of the most elite New Yorkers’ lifestyles in their robber baron heyday, as well as how class snobbery metastasized into a ruthless tribalism that doomed the romantic yearnings of one conformist attorney (again Daniel Day-Lewis) and the would-be divorcée love of his life (Michelle Pfeiffer).

    It might not have been a hit in its time, but Ang Lee’s breakout in the U.S. a year later definitely was. The Taiwanese filmmaker was already the toast of international and independent cinema via movies like The Wedding Banquet (1993) and the martial arts-adjacent Pushing Hands (1991), but it is when he directed a flawless adaptation of Jane Austen’s Sense and Sensibility in 1995 that he became a Hollywood favorite who would soon get movies like Crouching Tiger, Hidden Dragon (2000) and Hulk (2003) greenlit. Sense and Sensibility benefits greatly, too, from a marvelous cast with Emma Thompson, Hugh Grant, Kate Winslet, and Alan Rickman among its ensemble. It also captured the sophisticated satirical and melancholic underpinnings of Austen’s pen that most previous Hollywood adaptations never scratched.

    It set a standard that most of the best Austen adaptations to this day are measured by, be it Joe Wright and Keira Knightley’s cinematic take on Pride and Prejudice a decade later, the various attempts at Emma from the 1990s with Gwyneth Paltrow to this decade with Anya Taylor-Joy, or even Netflix’s recent Dakota Johnson-led Persuasion adaptation.

    Lucy in Bram Stoker's Dracula
    Columbia / Sony

    A Dark Universe of Gods and Monsters

    Meanwhile, right before Columbia Pictures greenlit Scorsese’s The Age of Innocence and later Gillian Armstrong’s still delightful ( and arguably definitive ) interpretation of Little Women in 1994, the same studio signed off on its first period piece with Winona Ryder attached to star. And it was Dracula.

    Considered a folly of hubris at the time by rivals who snickered to Variety that it should be renamed “Bonfire of the Vampires” (in reference to a notorious Brian De Palma bomb from 1990), Bram Stoker’s Dracula was Francis Ford Coppola’s lurid and magnificent reimagining of Stoker’s definitive Victorian novel. Published in 1897 with on-the-nose metaphors for London society’s anxieties over foreigners, sexual promiscuity and disease, and the so-called “New Woman” working in the professional classes, Coppola saw all of that potential in the well-worn and oft-adapted vampire novel. He also correctly predicted there was a box office hit if he could bring all those elements out in an exciting and anachronistic fever dream for the MTV generation.

    Love or hate Coppola’s looseness with Stoker’s novel—which is pretty audacious since he put the author’s name in the title—Coppola crafted one of the most sumptuous and expensive depictions of Victorian society ever put onscreen, winning costume designer Eiko Ishioka an Oscar for the effort. He also made an unexpected holiday hit that played like bloody gangbusters alongside Home Alone 2 and Aladdin that winter.

    It set a standard for what can in retrospect be considered a pseudo “dark universe” of classic literary monsters getting ostensibly faithful and expensive adaptations by Hollywood. Coppola himself produced Kenneth Branagh’s Mary Shelley’s Frankenstein (1994), a film that is actually in many ways closer to the thematic letter of its author than Bram Stoker’s Dracula ever was. It was also a worse movie that flopped, but it looked spectacular as the only major Frankenstein movie to remember Shelley set the story during the Age of Enlightenment in the late 18th century.

    Yet while Frankenstein failed, Tom Cruise and Neil Jordan would have a lot of success in the same year adapting Anne Rice’s Interview with the Vampire. The book admittedly was recent, having been published in 1976, but the story’s roots and setting in 18th and 19th century bayou occultism were not. It was also a grandiose costumed drama where the guy who played Top Gun’s Maverick would sink fangs into young Brad Pitt’s neck in a scene dripping in homoeroticism.

    This trend continued throughout the ‘90s with some successes, like Tim Burton’s wildly revisionist (and Coppola-produced) Sleepy Hollow in 1999, and some misses. For instance, do you remember when Julia Roberts, at the height of her stardom, appeared in a revisionist take on Robert Louis Stevenson’s The Strange Case of Dr. Jekyll and Mr. Hyde where she played the not-so-good doctor’s maid? It’s called Mary Reilly (1996), by the by.

    Denzel Washington and Keanu Reeves in Much Ado About Nothing
    The Samuel Goldwyn Company

    The Resurgence of Shakespeare

    Of course when talking about classic literature and storytelling, one name rises above most others in the schools and curriculums of the English-speaking world. Yet curiously it was only in the 1990s that someone really lit on the idea of making a movie directly based on the Bard and tailored almost exclusively for that young demographic: Baz Luhrmann in 1996, who reconfigured the tragedy of Romeo and Juliet into the visual language of MTV. He even stylized the title as William Shakespeare’s Romeo + Juliet.

    That proved the tip of an anachronistic iceberg whose cast included Leonardo DiCaprio at the height of his heartthrob powers as Romeo and real-life teenager Claire Danes as his Capulet amore. Their Verona was a Neverland composite of Miami, Rio de Janeiro, and the nightly news, with hyper music video editing and frenetic neon-hued melodrama. Some older scholars viewed Luhrmann’s anachronisms as an abomination, but as a Millennial, I can attest we loved this thing back in the day. Many still do.

    But it was hardly the first box office breakout for Shakespeare in the ‘90s. When the decade began, the helmer of another cinematic Romeo and Juliet classic from a different era, Franco Zeffirelli, attempted to make Hamlet exciting for “kids these days” by casting Mel Gibson, right in the midst of his Lethal Weapon popularity, as the indecisive Dane. To the modern eye, it is hard to remember that Gibson was a heartthrob of sorts in the ‘80s and early ‘90s, or that he was generally viewed as a dashing star worthy of heroic leading man roles.

    Nonetheless, there is quite a bit to like about Hamlet (1990) if you can look past Gibson’s off-screen behavior in the following decades, or the fact that Zeffirelli cuts what is a four-hour play down to less than 2.5 hours. Gibson actually makes for a credible and genuinely mad Hamlet (perhaps not a surprise now), and Zeffirelli mines the medieval melancholy of the story well with production design, costumes, and location shooting at real Norman castles. Plus, Helena Bonham Carter remains the best Ophelia ever put to screen. Hamlet (1990) would eventually be overshadowed, though, both by Gibson’s awful behavior and because of a much grander and more bombastic adaptation from the man who became the King of Shakespeare Movies in the ‘90s: Kenneth Branagh.

    Aye, Branagh might deserve the most credit for the Shakespearean renaissance in this era, beginning with his adaptation of Henry V (1989), which featured the makings of Branagh’s troupe of former RSC favorites turned film actors: Derek Jacobi, Brian Blessed, and of course his future wife (and ex), Emma Thompson. Together the pair would mount what is in this writer’s opinion the best film ever based on a Shakespeare play, the divine and breezy Much Ado About Nothing (1993), a perfect encapsulation of perhaps the first romantic comedy ever written, featuring Branagh and Thompson as the sharp-tongued, dueling lovers Benedick and Beatrice. It also features Denzel Washington as a dashing Renaissance prince, Kate Beckinsale in her breakout role, and a gloriously over-the-top score by Patrick Doyle.

    It would define the style of Branagh’s subsequent ‘90s efforts, whether they went off the rails like in the aforementioned Frankenstein, or right back on them in the 70mm-filmed, ultra-wide and sunny adaptation of Hamlet he helmed in 1996. Avoiding the psychological and Freudian interpretations of the Danish prince chased by Olivier and Zeffirelli, Branagh turns Hamlet into a romantic hero spearheading an all-star ensemble cast. Hamlet (1996) is indulgent at its full four-hour length, yet somehow that befits the material. Beyond his own directorial efforts, Branagh also played Iago opposite Laurence Fishburne in Othello (1995), and later mounted a musical adaptation of Love’s Labour’s Lost (2000).

    It paved the way for more outside-the-box Shakespeare movies by the end of the decade, like Julie Taymor’s deconstructionist Titus (1999) and the 1999 A Midsummer Night’s Dream where Kevin Kline turns into an ass and makes out with Michelle Pfeiffer.

    Paul Rudd and Alicia Silverstone in Clueless
    CBS via Getty Images

    The Teenage Shakespeare Remix (and Austen, Chaucer, and Others): The Birth of the…

    As popular as the Shakespeare movie became in the ‘90s, what’s curiously unique about this era is the simultaneous rise of movies that took either the Bard or other highly respected literary writers and turned them into a pure teenage dream. And we’re not just talking about modernizing Romeo and Juliet the way Luhrmann did, or repurposing it for high New York society like Leonard Bernstein and Stephen Sondheim did with West Side Story.

    These were straight, unapologetic youth films that also proved clever reworkings of classic storytelling structure. Among the best directly derived from Shakespeare is the movie that made Julia Stiles and Heath Ledger Gen-X icons, 10 Things I Hate About You (1999), a happily campy update of The Taming of the Shrew set in a fairytale high school also populated by future Christopher Nolan favorites like Joseph Gordon-Levitt and David Krumholtz. In fact, Stiles would perform this kind of remix a few times, both in a modernization of Hamlet (2000), the third Hamlet film in ten years but this one set in turn-of-the-century NYC, and in the more serious-faced modernization of Othello, O (2001), which featured Mekhi Phifer as a tragic, distrusting high school sports star instead of a warrior.

    Ledger also returned to the concept by way of another, even older literary giant, in this case the medieval poet Geoffrey Chaucer, with A Knight’s Tale (2001), an anachronistic blending of the medieval and modern where peasants grooved in the jousting tournament stands to Queen. There was also the odd attempt to turn Pierre Choderlos de Laclos’ 1782 novel Dangerous Liaisons into an erotic thriller for teenagers (the 1990s were weird, huh?) via the lusty Cruel Intentions (1999).

    However, easily the best of these remains Amy Heckerling’s Clueless (1995), a pitch-perfect transfer of Jane Austen’s Emma from the Regency period to a fairytale version of 1990s Beverly Hills. Cher (Alicia Silverstone) is a charmed SoCal princess so well-intentioned in her matchmaking mischief that she defies any attempt to detest her entitlement or vanity. And on the assumption that any real slang written in 1994 would be dated by ’95, Heckerling avoided modern fads and simply invented her own, creating a faux yet now authentically iconic language and fashion style. You’re even kind of low-key chill that the happy ending is Cher hooking up with her ex-stepbrother (Paul Rudd). It’s a classic indeed!

    And the Rest

    There are many, many more examples we could examine from this era. They include the sublime, like the Winona Ryder, Claire Danes, and Kirsten Dunst-starring, Gillian Armstrong-directed Little Women of 1994, and the wretched, like the pathetic The Scarlet Letter (1995) starring Demi Moore and Gary Oldman. There were more plays adapted, a la Arthur Miller’s The Crucible (again with Ryder and Day-Lewis!), and then those that simply enjoyed playing with the playwrights themselves, as in the much-awarded Shakespeare in Love (1998). Mel Gibson going full medieval (and ahistorical) on the costumed drama in Braveheart (1995) even hinted at the sword and sandals revival that would arrive in 2000.

    More than a few of these won Best Picture Oscars as well, including Braveheart, Shakespeare in Love, and James Cameron’s little 1997 movie you might have heard about elsewhere: Titanic. However, this kind of film has largely vanished. Once in a while one comes along that still works, such as Greta Gerwig’s own revisionist interpretation of Little Women. That stunning movie was a big hit in 2019, but it did not exactly usher in a new era of literary adaptations.

    Now such projects, like everything else not considered four-quadrant intellectual property by studio bean counters, are mostly relegated to long-form streaming series. Which in some cases is fine. The BBC production, also from the 1990s, is arguably the best version of Pride & Prejudice, in my opinion. But whether it is original period piece films or adaptations, unless you’re Robert Eggers (who arguably isn’t making films for the same mainstream sensibility the likes of Gerwig or, for that matter, Coppola were), period piece storytelling and “great adaptations” have been abandoned to the small screen and full-on wish fulfillment anachronisms like Bridgerton.

    This appears to be a result of studios increasingly passing on anything that isn’t based on a popular brand. But in that case… it might be worth reminding them that ‘90s kids are getting older and having children of their own. There may again be a market beyond the occasional Gerwig swing, or Eggers take on Dracula, for classic stories: a new audience being raised to want modern riffs on tales that have endured for decades and centuries, most of them conveniently in the public domain. And recent original hits like Sinners suggest you don’t even need a classic story to connect with audiences. Perhaps the play’s the thing wherein to catch the conscience of the consumer once more? Or something like that.

    The post The 1990s Were a Golden Age for Period Piece Movies and Literary Adaptations appeared first on Den of Geek.





    It would define the style of Branagh’s following ‘90s efforts, whether they went off-the-rails like in the aforementioned Frankenstein, or right back on them in the 70mm-filmed, ultra wide and sunny adaptation of Hamlet he helmed in 1996. Avoiding the psychological and Freudian interpretations of the Danish prince chased by Olivier and Zeffirelli, Branagh turns Hamlet into a romantic hero spearheading an all-star ensemble cast. At the play’s full four-hour length, Hamlet (1996) is indulgent. Yet somehow that befits the material. Branagh would also star as Iago in Oliver Parker’s Othello (1995) opposite Laurence Fishburne and reconfigure the Bard as a musical in his own directorial effort, Love’s Labour’s Lost (2000).

    It paved the way for more outside-the-box Shakespeare movies by the end of the decade like Julie Taymor’s deconstructionist Titus (1999) and the A Midsummer Night’s Dream from 1999 where Kevin Kline turns into an ass and makes out with Michelle Pfeiffer.

    Paul Rudd and Alicia Silverstone in Clueless
    CBS via Getty Images

    The Birth of the Teenage Shakespeare Remix (and Austen, and Chaucer, and…)

    As popular as the Shakespeare movie became in the ‘90s, what’s curiously unique about this era is the simultaneous rise of movies that adapted either the Bard or other highly respected literary writers and turned them into a pure teenage dream. We’re talking moving past modernizing Romeo and Juliet like Luhrmann did, or repurposing it for high New York society like Leonard Bernstein and Stephen Sondheim aimed with West Side Story.

    These were straight, unapologetic youth films that also proved clever reworkings of classic storytelling structure. Among the best directly derived from Shakespeare is the movie that made Julia Stiles and Heath Ledger Gen-X icons, 10 Things I Hate About You (1999), a happily campy update of The Taming of the Shrew set in a fairytale high school also populated by future Christopher Nolan favorites like Joseph Gordon-Levitt and David Krumholtz. Stiles would, in fact, do this kind of remix a number times in the more serious-faced modernization of Othello, O (2000), which also starred Mekhi Phifer as a tragically distrusting high school sports star instead of warrior, and Michael Almereyda and Ethan Hawke’s own Hamlet (2000), the third Hamlet movie in 10 years, albeit this one set in turn-of-the-century NYC.

    Ledger also returned to the concept by adapting another, even older literary giant, in this case the medieval poet Geoffrey Chaucer, for A Knight’s Tale (2001), an anachronistic blending of the medieval and modern where peasants grooved in the jousting tournament stands to Queen. There was also the strange attempt to turn Pierre Choderlos de Laclos’ Dangerous Liaisons from 1782 into an erotic thriller for teens (the ‘90s were weird, huh?) via the lusty Cruel Intentions (1999).

    However, easily the best of these remains Amy Heckerling’s Clueless (1995), a pitch perfect transfer of Jane Austen’s Emma from the Regency period to a fairytale version of 1990s Beverly Hills. Foregoing modern fads and simply inventing her own—with the assumption anything she wrote in 1994 would be dated by ’95—Heckerling create a faux yet now authentically iconic language and fashion style via Cher (Alicia Silverstone), a charmed SoCal princess who is so well-meaning in her matchmaking mischief that she defies any attempts to detest her entitlement or vanity. You kind of are even low-key chill that the happy ending is she hooks up with her step brother (Paul Rudd). It’s a classic!

    And the Rest

    There are many, many more examples we could examine from this era. These can include the sublime like the Gillian Armstrong-directed Little Women of 1994 starring Winona Ryder, Claire Danes, and Kirsten Dunst; and they can include the wretched like the Demi Moore and Gary Oldman-led The Scarlet Letter (1995). There were more plays adapted, a la Arthur Miller’s The Crucible (again with Ryder and Day-Lewis!), and then those that just had some fun with playwrights, as seen in the over-celebrated Shakespeare in Love (1998). The inklings of the sword and sandals’ return in 2000 was even hinted at by Mel Gibson going full medieval (and ahistorical) on the costumed drama in Braveheart (1995).

    More than a few of these won Best Picture Oscars as well, including Braveheart, Shakespeare in Love, and James Cameron’s little 1997 movie you might have heard about elsewhere: Titanic. And yet, this type of film has by and large gone away. Once in a while one comes along that still works, such as Greta Gerwig’s own revisionist interpretation of Little Women. That beautiful film was a good-sized hit in 2019, but it did not exactly usher in a new era of literary adaptations.

    Now such projects, like everything else not considered four-quadrant intellectual property by studio bean counters, is mostly relegated to long-form stream series. Which in some cases is fine. Many would argue the best version of Pride & Prejudice was the BBC production… also from the ‘90s, mind. But whether it is original period piece films or adaptations, unless you’re Robert Eggers (who arguably isn’t making films for the same mainstream sensibility the likes of Gerwig or, for that matter, Coppola were), period piece storytelling and “great adaptations” have been abandoned to the small screen and full-on wish fulfillment anachronisms like Bridgerton.

    This seems due to studios increasingly eschewing anything that isn’t reliably based on a brand that middle-aged adults loved. But in that case… it might be worth reminding them that ‘90s kids are getting older and having children of their own. There may again be a market beyond the occasional Gerwig swing, or Eggers take on Dracula, for classic stories; a new audience being raised to want modern riffs inspired by tales that have endured for years and centuries. These stories are mostly in the public domain too. And recent original hits like Sinners suggests you don’t even need a classic story to connect with audiences. So perhaps once again, a play’s the thing in which they can catch the conscience of the… consumer? Or something like that.

    The post The 1990s Were a Golden Age for Period Piece Movies and Literary Adaptations appeared first on Den of Geek.

  • Death of a Video Game Console: How Each Generation Said Goodbye

    Death of a Video Game Console: How Each Generation Said Goodbye

    Nintendo closes out the end of an era in 2025 with the introduction of the Nintendo Switch 2 and gradual sunsetting of the original Nintendo Switch. This shift in focus to the new console won’t be overnight, of course, and rarely is whenever console publishers transition to a fresh generation. The first Switch generation was […]

    The post Death of a Video Game Console: How Each Generation Said Goodbye appeared first on Den of Geek.

    Recently a friend mentioned how much of a shame it was that, generally speaking, there are few of those backdoor “classic” reimaginings today like the ones we had growing up. And after thinking for a moment, I agreed. Children and teens of the ‘90s were treated to an embarrassment of riches when it came to the Bard and Bard-adjacent films. Nearly every week seemed to offer another modernization of William Shakespeare, Jane Austen, or Geoffrey Chaucer, all retrofitted with a wink and a nudge to appeal to teenagers reading much the same texts in high school or university.

    But then when looking back at the sweep of 1990s cinema beyond just “teen movies,” it was more than only Julia Stiles and Heath Ledger vehicles that were getting the classical treatment. In fact the ‘90s, and to a large extent the ‘80s as well, was an era ripe with indie studios and Hollywood majors treating classic literature (if largely of the English variety) with the sanctity nowadays reserved for comic books and video games. It was a time when some of the most exciting or ambitious artists working in the industry sought to trade in the bullets and brutality of New Hollywood from a decade or two earlier in favor of the even more brutal constraints of corsets and top hats.

    Shakespeare was arguably bigger business in Tinseltown during this period than at any other point, and we saw some of the most faithful and enduring adaptations of Austen or Louisa May Alcott make it to the screen. Why is that, and can it happen again? Let’s look back at the golden age of period piece costumed dramas and splashy literary adaptations…

    Helena Bonham Carter in A Room with a View

    Mozart and Merchant Ivory

    Since the beginning of the medium, moviemakers have looked back to well-worn stories for inspiration and audience familiarity. Not too many years after making his enduring trip to the moon, Georges Méliès adapted Hamlet into a roughly 10-minute silent short in 1907. And of course before Kenneth Branagh, Laurence Olivier had Hollywood falling in love with the Bard… at least as long as it was Larry in the tights.

    Even so, literary adaptations were often constrained, particularly in Hollywood where filmmakers had to contend with the limitations of censorship via the Hays Code and preconceived notions about what an American audience would enjoy. The most popular costumed dramas tended to therefore be vanity projects or something of a more sensational hue—think biblical or swords and sandals epics.

    So it’s difficult to point to an exact moment where that changed in the 1980s, yet we’d hazard that the closely spaced Oscar seasons of 1984 and 1986 had a lot to do with it. After all, the first was the year that Miloš Forman’s Amadeus won Best Picture, and the latter was the year that our conception of what a “Merchant Ivory” film could be was cemented by James Ivory and Ismail Merchant’s luscious adaptation of E.M. Forster’s A Room with a View. Though Forster scholars consider the novel one of the author’s slighter works, the film had critics like Roger Ebert swooning that it was a masterpiece.

    In the case of Amadeus, the director of One Flew Over the Cuckoo’s Nest (1975)—a zeitgeist-shaping portrait of modern oppression and control from about a decade earlier—was taking the story of Mozart and making it a punk rock tragicomedy. Based on a Peter Shaffer play of the same name, Forman and Shaffer radically reimagined the story, making it both funnier and darker as Forman strove to pose Mozart as a modern-day rebel iconoclast, with a wig resembling Sid Vicious as much as the Age of Enlightenment. Located atop Tom Hulce’s giggling head, it signaled a movie that had all the trappings of melodrama but felt accessible and exciting to a wide modern audience.

    It went on to do relatively big business and win Best Picture. While not the first period film to do so, it was the first in a long while set in what could be construed as the distant past (Richard Attenborough’s Gandhi won the year before, but that was based on subject matter within the living memory of most Academy voters). Otherwise, most of the recent winners were dramas or dramedies about the modern world: Kramer vs. Kramer (1979), The Deer Hunter (1978), and Annie Hall (1977). They reflected an audience that wanted to get away from the artificiality of their parents’ cinema, which in the U.S. associated historical costumes with the (grand) phoniness of Ben-Hur (1959) or Oliver! (1968).

    Yet perhaps the movie that proved this was the beginning of a popular trend came a few years later via the British masterpiece A Room with a View. To be sure, the partnership of Merchant and Ivory had been going for more than 20 years by the time they got to adapting Forster, including several other costumed dramas and period pieces. However, those films were mixed with modern comedies and dramas like the rock ’n’ roll-infused The Guru (1969) and Jane Austen in Manhattan (1980). More importantly, all of these films tended to be art house pictures: small chamber pieces intended for a limited audience.

    Yet as the marketing campaign would later trumpet about A Room with a View—the ethereal romantic dramedy which introduced Daniel Day-Lewis and a fresh-faced Helena Bonham Carter to the U.S.—this movie had the “highest single theatre gross in the country!” (It’s fun to remember a time when a movie just selling out in New York every day could make it a hit.) The film’s combination of Forster’s wry satire and cynicism about English aristocracy in the late Victorian and early Edwardian era, coupled with the sweeping romance of Puccini arias and Tuscan countrysides, made it a massive success.

    It also defined what became the “Merchant Ivory” period piece forever after, including in future Oscar and box office darlings like the Anthony Hopkins, Emma Thompson, and Carter-starring Howards End (1992), and Hopkins and Thompson’s reunion in The Remains of the Day (1993). These were all distinctly British and understated pictures, with Remains being an outright tragedy delivered in a hushed whisper, but their relative success with a certain type of moviegoer and Academy voter signaled to Hollywood that there was gold in them thar hills. And soon enough, more than just Forman on the American side was going up there to mine it.

    Wes Studi in Last of the Mohicans
    20th Century Studios

    Martin Scorsese, Michael Mann, and the Auteur’s Costumed Drama

    In 1990, Michael Mann was one of the hottest creatives working in Hollywood. As the executive producer and sometime-director on NBC’s edgy (by ‘80s standards) police drama, Miami Vice, he played a direct hand in proving American television could be “gritty” and artistic. Even the episodes he didn’t helm were defined by the standards he insisted upon—such as never putting cool guys Crockett and Tubbs in a red or brown car. It would clash with the neon-light-on-celluloid aesthetic that Mann developed for the series.

    As that series was winding down by 1990, Mann was more in demand than ever to make any film project he might have wanted—something perhaps in keeping with Vice or the gritty crime thrillers he’d made in the ’80s, like the serial killer thriller Manhunter (1986). Instead he sought to adapt a childhood favorite for the screen, James Fenimore Cooper’s 19th century American frontier novel, The Last of the Mohicans. The text is certainly problematic in its original form, with its imperial-fantasy riff on the French and Indian War (or Seven Years’ War) reducing the Indigenous tribes of what is today upstate New York to noble or cruel savage stereotypes, but it proved a jumping-off point for Mann to craft a gripping, primal, and prestigious film.

    He also made a movie that far exceeded its source material: The Last of the Mohicans is an often wordless opera of big emotions played in silence by Day-Lewis, Madeleine Stowe, and Wes Studi, all while Trevor Jones and Randy Edelman’s musical score looms like thunderclouds across the mountainous landscape. It is an elevated action movie and a beautiful drama that did bigger business in the U.S. than Disney’s Beauty and the Beast and the Tom Cruise vehicle A Few Good Men in the same year. It would also create a precedent we’d see followed time and again throughout the rest of the decade.

    Some of the biggest and most respected filmmakers of the moment, many of them praised under auteur theory, were looking to literary classics for an audience that craved them. After the one-two genre punch of Goodfellas (1990) and Cape Fear (1991), Martin Scorsese made one of his most ambitious and underrated films: a stone-cold 1993 masterpiece inspired by an Edith Wharton novel, The Age of Innocence.

    It’s a story that Scorsese argues is just as brutal, if not more so, than his gangster pictures. Indeed, The Age of Innocence remains the best cinematic representation of the Gilded Age in the U.S., capturing the lush pageantry of the most elite New Yorkers’ lifestyles in their robber baron heyday, as well as how class snobbery metastasized into a ruthless tribalism that doomed the romantic yearnings of one conformist attorney (again Daniel Day-Lewis) and this would-be divorcée love of his life (Michelle Pfeiffer).

    It might not have been a hit in its time, but Ang Lee’s breakout in the U.S. a year later definitely was. The Taiwanese filmmaker was already the toast of international and independent cinema via movies like The Wedding Banquet (1993) and martial arts-adjacent Pushing Hands (1991), but it is when he directed a flawless adaptation of Jane Austen’s Sense and Sensibility in 1995 that he became a Hollywood favorite who would soon get movies like Crouching Tiger, Hidden Dragon (2000) and Hulk (2003) greenlit. Sense and Sensibility benefits greatly, too, from a marvelous cast with Emma Thompson, Hugh Grant, Kate Winslet, and Alan Rickman among its ensemble. It also captured the sophisticated satirical and melancholic underpinnings of Austen’s pen that most previous Hollywood adaptations never scratched.

    It set a standard that most of the best Austen adaptations to this day are measured by, be it Joe Wright and Keira Knightley’s cinematic take on Pride and Prejudice a decade later, various attempts at Emma from the 1990s with Gwyneth Paltrow to this decade with Anya Taylor-Joy, or even Netflix’s recent Dakota Johnson-led Persuasion adaptation.

    Lucy in Bram Stoker's Dracula
    Columbia / Sony

    A Dark Universe of Gods and Monsters

    Meanwhile, right before Columbia Pictures greenlit Scorsese’s The Age of Innocence and later Gillian Armstrong’s still delightful (and arguably definitive) interpretation of Little Women in 1994, the same studio signed off on its first period piece with Winona Ryder attached to star. And it was Dracula.

    Considered a folly of hubris at the time by rivals who snickered to Variety it should be renamed “Bonfire of the Vampires” (in reference to a notorious Brian De Palma bomb from 1990), Bram Stoker’s Dracula was Francis Ford Coppola’s lurid and magnificent reimagining of Stoker’s definitive Victorian novel. Published in 1897 with on-the-nose metaphors for London society’s anxieties over foreigners, sexual promiscuity and disease, and the so-called “New Woman” working in the professional classes, Coppola saw all of that potential in the well-worn and adapted vampire novel. He also correctly predicted there was a box office hit if he could bring all those elements out in an exciting and anachronistic fever dream for the MTV generation.

    Love or hate Coppola’s looseness with Stoker’s novel—which is pretty audacious since he put the author’s name in the title—Coppola crafted one of the most sumptuous and expensive depictions of Victorian society ever put onscreen, winning costume designer Eiko Ishioka an Oscar for the effort. He also made an unexpected holiday hit that played like bloody gangbusters alongside Home Alone 2 and Aladdin that winter.

    It set a standard for what can in retrospect be considered a pseudo “dark universe” of classic literary monsters getting ostensibly faithful and expensive adaptations by Hollywood. Coppola himself produced Kenneth Branagh’s Mary Shelley’s Frankenstein (1994), a film that is actually in many ways closer to the thematic letter of its author than Bram Stoker’s Dracula ever was. It was also a worse movie that flopped, but it looked spectacular as the only major Frankenstein movie to remember Shelley set the story during the Age of Enlightenment in the late 18th century.

    Yet while Frankenstein failed, Tom Cruise and Neil Jordan would have a lot of success in the same year adapting Anne Rice’s Interview with the Vampire. The book admittedly was recent, having been published in 1976, but the story’s roots and setting in 18th and 19th century bayou occultism were not. It was also a grandiose costumed drama where the guy who played Top Gun’s Maverick would sink fangs into young Brad Pitt’s neck in a scene dripping in homoeroticism.

    This trend continued throughout the ‘90s with some successes, like Tim Burton’s wildly revisionist (and Coppola-produced) Sleepy Hollow in 1999, and some misses. For instance, did you remember that Julia Roberts at the height of her stardom appeared in a revisionist take on Robert Louis Stevenson’s The Strange Case of Dr. Jekyll and Mr. Hyde where she played the not-so-good doctor’s maid? It’s called Mary Reilly (1996), by the by.

    Denzel Washington and Keanu Reeves in Much Ado About Nothing
    The Samuel Goldwyn Company

    The Resurgence of Shakespeare

    Of course, when talking about classic literature and storytelling, one name rises above most others in the schools and curriculums of the English-speaking world. Yet curiously it was only in the 1990s that someone really lit on the idea of making a movie directly based on the Bard tailored almost exclusively for that student demographic: Baz Luhrmann in 1996, who reconfigured the tragedy of Romeo and Juliet into the visual language of MTV. He even stylized the title as William Shakespeare’s Romeo + Juliet.

    That proved the tip of an anachronistic iceberg whose cast included Leonardo DiCaprio at the height of his heartthrob powers as Romeo and real-life teenager Claire Danes as his Capulet amore. Their Verona was a Neverland composite of Miami, Rio de Janeiro, and the nightly news, with hyper music video editing and frenetic neon-hued melodrama. Some older scholars viewed Luhrmann’s anachronisms as an abomination, but as a Millennial, I can attest we loved this thing back in the day. Many still do.

    But it was hardly the first box office breakout for Shakespeare in the ‘90s. When the decade began, the helmer of another cinematic Romeo and Juliet classic from a different era, Franco Zeffirelli, attempted to make Hamlet exciting for “kids these days” by casting Mel Gibson, right in the midst of his Lethal Weapon popularity, as the indecisive Dane. It is hard to remember now that Gibson was a heartthrob of sorts in the ‘80s and early ‘90s—or that he was generally viewed as a dashing star worthy of heroic leading man roles.

    Nonetheless, there is quite a bit to like about Hamlet (1990) if you can look past Gibson’s off-screen behavior in the following decades, or the fact Zeffirelli cuts what is a four-hour play down to less than 2.5 hours. Gibson actually makes for a credible and genuinely mad Hamlet (perhaps not a surprise now), and Zeffirelli mines the medieval melancholy of the story well with production design, costumes, and location shooting at real Norman castles. Plus, Helena Bonham Carter remains the best Ophelia ever put to screen. Hamlet (1990) would eventually be overshadowed, though, both by Gibson’s awful behavior and by a much grander, more bombastic adaptation from the man who became the King of Shakespeare Movies in the ‘90s: Kenneth Branagh.

    Aye, Branagh might deserve the most credit for the Shakespearean renaissance in this era, beginning with his adaptation of Henry V (1989), which featured the makings of Branagh’s troupe of former RSC favorites turned film actors: Derek Jacobi, Brian Blessed, and of course his future wife (and ex), Emma Thompson. Together the pair would mount what is in this writer’s opinion the best film ever based on a Shakespeare play, the divine and breezy Much Ado About Nothing (1993), a perfect encapsulation of perhaps the first romantic comedy ever written, featuring Branagh and Thompson as the sharp-tongued, dueling lovers Benedick and Beatrice. It also features Denzel Washington as a dashing Renaissance prince, Kate Beckinsale in her breakout role, and a gloriously over-the-top score by Patrick Doyle.

    It would define the style of Branagh’s following ‘90s efforts, whether they went off-the-rails like in the aforementioned Frankenstein, or right back on them in the 70mm-filmed, ultra wide and sunny adaptation of Hamlet he helmed in 1996. Avoiding the psychological and Freudian interpretations of the Danish prince chased by Olivier and Zeffirelli, Branagh turns Hamlet into a romantic hero spearheading an all-star ensemble cast. At the play’s full four-hour length, Hamlet (1996) is indulgent. Yet somehow that befits the material. Branagh would also star as Iago in Oliver Parker’s Othello (1995) opposite Laurence Fishburne and reconfigure the Bard as a musical in his own directorial effort, Love’s Labour’s Lost (2000).

    It paved the way for more outside-the-box Shakespeare movies by the end of the decade, like Julie Taymor’s deconstructionist Titus (1999) and the 1999 A Midsummer Night’s Dream where Kevin Kline turns into an ass and makes out with Michelle Pfeiffer.

    Paul Rudd and Alicia Silverstone in Clueless
    CBS via Getty Images

    The Birth of the Teenage Shakespeare Remix (and Austen, and Chaucer, and…)

    As popular as the Shakespeare movie became in the ‘90s, what’s curiously unique about this era is the simultaneous rise of movies that adapted either the Bard or other highly respected literary writers and turned them into a pure teenage dream. We’re talking about moving past modernizing Romeo and Juliet as Luhrmann did, or repurposing it for high New York society as Leonard Bernstein and Stephen Sondheim did with West Side Story.

    These were straight, unapologetic youth films that also proved clever reworkings of classic storytelling structure. Among the best directly derived from Shakespeare is the movie that made Julia Stiles and Heath Ledger Gen-X icons, 10 Things I Hate About You (1999), a happily campy update of The Taming of the Shrew set in a fairytale high school also populated by future Christopher Nolan favorites like Joseph Gordon-Levitt and David Krumholtz. Stiles would, in fact, do this kind of remix a number of times: in the more serious-faced modernization of Othello, O (2000), which also starred Mekhi Phifer as a tragically distrusting high school sports star instead of a warrior, and in Michael Almereyda and Ethan Hawke’s own Hamlet (2000), the third Hamlet movie in 10 years, albeit this one set in turn-of-the-century NYC.

    Ledger also returned to the concept by adapting another, even older literary giant, in this case the medieval poet Geoffrey Chaucer, for A Knight’s Tale (2001), an anachronistic blending of the medieval and modern where peasants grooved in the jousting tournament stands to Queen. There was also the strange attempt to turn Pierre Choderlos de Laclos’ Dangerous Liaisons from 1782 into an erotic thriller for teens (the ‘90s were weird, huh?) via the lusty Cruel Intentions (1999).

    However, easily the best of these remains Amy Heckerling’s Clueless (1995), a pitch-perfect transfer of Jane Austen’s Emma from the Regency period to a fairytale version of 1990s Beverly Hills. Foregoing modern fads and simply inventing her own—with the assumption that anything she wrote in 1994 would be dated by ’95—Heckerling created a faux yet now authentically iconic language and fashion style via Cher (Alicia Silverstone), a charmed SoCal princess who is so well-meaning in her matchmaking mischief that she defies any attempt to detest her entitlement or vanity. You’re even kind of low-key chill that the happy ending is she hooks up with her stepbrother (Paul Rudd). It’s a classic!

    And the Rest

    There are many, many more examples we could examine from this era. These can include the sublime, like the Gillian Armstrong-directed Little Women of 1994 starring Winona Ryder, Claire Danes, and Kirsten Dunst; and they can include the wretched, like the Demi Moore and Gary Oldman-led The Scarlet Letter (1995). There were more plays adapted, a la Arthur Miller’s The Crucible (again with Ryder and Day-Lewis!), and then those that just had some fun with playwrights, as seen in the over-celebrated Shakespeare in Love (1998). The inklings of the sword and sandal genre’s return in 2000 were even hinted at by Mel Gibson going full medieval (and ahistorical) on the costumed drama in Braveheart (1995).

    More than a few of these won Best Picture Oscars as well, including Braveheart, Shakespeare in Love, and James Cameron’s little 1997 movie you might have heard about elsewhere: Titanic. And yet, this type of film has by and large gone away. Once in a while one comes along that still works, such as Greta Gerwig’s own revisionist interpretation of Little Women. That beautiful film was a good-sized hit in 2019, but it did not exactly usher in a new era of literary adaptations.

    Now such projects, like everything else not considered four-quadrant intellectual property by studio bean counters, are mostly relegated to long-form streaming series. Which in some cases is fine. Many would argue the best version of Pride & Prejudice was the BBC production… also from the ‘90s, mind. But whether it is original period piece films or adaptations, unless you’re Robert Eggers (who arguably isn’t making films for the same mainstream sensibility the likes of Gerwig or, for that matter, Coppola were), period piece storytelling and “great adaptations” have been abandoned to the small screen and full-on wish-fulfillment anachronisms like Bridgerton.

    This seems due to studios increasingly eschewing anything that isn’t reliably based on a brand that middle-aged adults loved. But in that case… it might be worth reminding them that ‘90s kids are getting older and having children of their own. There may again be a market beyond the occasional Gerwig swing, or Eggers take on Dracula, for classic stories; a new audience being raised to want modern riffs inspired by tales that have endured for years and centuries. These stories are mostly in the public domain too. And recent original hits like Sinners suggest you don’t even need a classic story to connect with audiences. So perhaps once again, a play’s the thing in which they can catch the conscience of the… consumer? Or something like that.

    The post The 1990s Were a Golden Age for Period Piece Movies and Literary Adaptations appeared first on Den of Geek.

  • Buffy the Vampire Slayer Reboot Pilot Casts Its Chosen One

    Buffy the Vampire Slayer Reboot Pilot Casts Its Chosen One

    In every era there is a chosen one, and Generation Alpha’s has just been announced. Ryan Kiera Armstrong, the 15-year-old American Horror Story and Star Wars: Skeleton Crew actor, will be joining OG Buffy the Vampire Slayer star (and reboot producer) Sarah Michelle Gellar in the Hulu pilot for the series revival. […] There have been no further character details.

    The post Buffy the Vampire Slayer Reboot Pilot Casts Its Chosen One appeared first on Den of Geek.

    Recently a friend mentioned what a pity it was that, generally speaking, we get few of those cheeky “classical” reimaginings now like the ones we had growing up. After a moment of reflection, I had to agree. Children and teens of the ’90s were treated to an embarrassment of riches when it came to the Bard and Bard-adjacent pictures. Almost every week seemed to feature yet another spin on William Shakespeare, Jane Austen, or Geoffrey Chaucer, all reworked with a wink and a nudge to appeal to teens reading those same texts in high school or college.

    But looking back at the breadth of 1990s film beyond simply “teen movies,” it was more than just Julia Stiles and Heath Ledger vehicles getting the classical treatment. In fact the ’90s, and to a large extent the ’80s as well, was an era ripe with indie studios and Hollywood majors treating classic literature (if largely of the English variety) with the sanctity nowadays reserved for comic books and video games. Some of the most creative and ambitious artists in the industry were looking to trade the guns and cruelty of New Hollywood from a decade or two earlier for the even more terrible constraints of corsets and top hats.


    We saw some of the most faithful and enduring adaptations of Dickens or Louisa May Alcott make it onto the screen, and Shakespeare was probably bigger business in Tinseltown than at any other time. Why is that, and can it happen again? Let’s look back at the era of period piece dramas and lavish literary adaptations…

    Helena Bonham Carter in A Room with a View

    Mozart and Merchant Ivory

    Since the beginning of the medium, moviemakers have looked to well-worn and familiar stories for ideas and market familiarity. In 1907, Georges Méliès adapted Hamlet into a roughly 10-minute silent short after making his famous trip to the moon. And of course before Kenneth Branagh, Laurence Olivier had Hollywood falling in love with the Bard… at least as long as it was Larry in the tights.

    Even so, literary adaptations were frequently constrained, especially in Hollywood, where directors had to contend with the Hays Code’s censorship and with preconceived ideas about what American audiences would find appealing. The most popular costumed dramas tended to be vanity projects or works of a grander hue: think religious or swords-and-sandals epics.

    So it’s difficult to point to an exact moment where that changed in the 1980s, yet we’d hazard to suggest the closely spaced Oscar seasons of 1984 and 1986 had a lot to do with it. After all, the first was the year Milos Forman’s Amadeus won Best Picture, and the second was the year James Ivory and Ismail Merchant’s lush adaptation of E.M. Forster’s A Room with a View cemented our conception of what a “Merchant Ivory” film could be. Considered by Forster scholars one of the author’s slighter works, the film had critics like Roger Ebert swooning that it was a masterpiece.

    In the case of Amadeus, the director of One Flew Over the Cuckoo’s Nest (1975), a zeitgeist-shaping portrait of contemporary oppression and control from about a decade earlier, was adapting Mozart’s life story into a punk rock tragicomedy. Based on a Peter Shaffer play of the same name, Forman and Shaffer radically reimagined the story, making it both funnier and darker as Forman strove to pose Mozart as a modern-day rebel iconoclast, his wig owing as much to Sid Vicious as to the Age of Enlightenment. Perched atop Tom Hulce’s giggling head, it signaled a movie that had all the trappings of melodrama but felt accessible and exciting to a wide modern audience.

    Amadeus went on to do quite well and win Best Picture. While not the first period film to do so, it was the first in a long while set in what could be construed as the distant past (Richard Attenborough’s Gandhi won the year before, but that was based on a subject within the living memory of most Academy voters). Otherwise, the majority of recent winners were dramas or dramedies about contemporary life, like Annie Hall (1977), Kramer vs. Kramer (1979), and The Deer Hunter (1978). They reflected an audience that wanted to get away from the artificiality of their parents’ cinema, which in the U.S. associated historical costumes with the (grand) phoniness of Ben-Hur (1959) or Oliver! (1968).

    It was a few years later, though, that the British masterpiece A Room with a View confirmed this was the start of a bona fide trend. To be sure, the partnership of Merchant and Ivory had been going for more than 20 years by the time they got to adapting Forster, including several other costumed dramas and period pieces. However, those movies were paired with contemporary comedies and dramas like Jane Austen in Manhattan in 1980 and The Guru in 1969. More importantly, all of these films tended to be art house pictures, small chamber pieces intended for a limited audience.

    Yet as the marketing campaign would later trumpet about A Room with a View—the ethereal romantic dramedy which introduced Daniel Day-Lewis and a fresh-faced Helena Bonham Carter to the U.S.—this movie had the “highest single theatre gross in the country”! (It’s fun to recall a time when a movie could be a hit in New York if it were just selling out every day.) The film’s combination of Forster’s wry satire and cynicism about English aristocracy in the late Victorian and early Edwardian era, coupled with the sweeping romance of Puccini arias and Tuscan countrysides, made it a massive success.

    It also defined what would become the “Merchant Ivory” period piece forever after, including in upcoming Oscar and box office darlings like Howards End (1992) and Anthony Hopkins and Emma Thompson’s reunion in The Remains of the Day (1993). These were all distinctly British and understated pictures, with Remains being an outright tragedy delivered in a hushed whisper, but their relative success with a certain type of moviegoer and Academy voter signaled to Hollywood that there was gold up in them hills. And soon enough, more than just Forman on the American side was going up there to mine it.

    Wes Studi in Last of the Mohicans
    20th Century Studios

    Martin Scorsese, Michael Mann, and the Auteur’s Costumed Drama

    In 1990, Michael Mann was one of the hottest creatives in Hollywood. As the executive producer and sometime-director of NBC’s edgy (by ’80s standards) police drama Miami Vice, he played a direct hand in proving American television could be “gritty” and artistic. Even the episodes he didn’t helm were defined by the standards he insisted upon—such as never putting cool guys Crockett and Tubbs in a red or brown car, which would conflict with the series’ neon-light-on-celluloid aesthetic.

    As that series was winding down by 1990, Mann was more in demand than ever and could have made just about any film project he wanted—something perhaps in keeping with Vice or the gritty crime thrillers he’d made in the ’80s, like the serial killer thriller Manhunter (1986). Instead, he attempted to adapt James Fenimore Cooper’s 19th-century American frontier novel The Last of the Mohicans, a childhood favorite, for the screen. Certainly a problematic text in its original form, with its imperial-fantasy riff on the French and Indian War (or Seven Years’ War) in which the Indigenous tribes of what is today upstate New York were reduced to noble or cruel savage stereotypes, the novel proved a jumping-off point for Mann to craft a gripping, primal, and prestigious film.

    He also made a movie that far exceeded its source material, with The Last of the Mohicans playing as an often wordless opera of big emotions performed in silence by Day-Lewis, Madeleine Stowe, and Wes Studi, all while Trevor Jones and Randy Edelman’s musical score looms like thunderclouds across the mountainous landscape. It is a beautiful drama and a high-profile action film, and it did more business in the United States than Tom Cruise’s A Few Good Men and Disney’s Beauty and the Beast. It also created a precedent we’d see followed time and again throughout the rest of the decade.

    Some of the biggest and most well-known filmmakers of the moment, many of them darlings of auteur theory, were turning to literary classics for audiences who admired them. After the one-two genre punch of Goodfellas (1990) and Cape Fear (1991), Martin Scorsese made one of his most ambitious and underrated films: a stone-cold 1993 masterpiece inspired by an Edith Wharton novel, The Age of Innocence.

    It’s a story that Scorsese argues is just as brutal, if not more so, than his gangster pictures. In fact, The Age of Innocence remains the best cinematic depiction of the Gilded Age. It captures the lush pageantry of the wealthiest New Yorkers’ heyday as well as how class snobbery hardened into a ruthless tribalism, one that ultimately crushes the romance between a conformist attorney (once again Daniel Day-Lewis) and the would-be divorcée love of his life (Michelle Pfeiffer).

    It might not have been a hit in its time, but Ang Lee’s breakout in the U.S. a year later definitely was. The Taiwanese filmmaker was already a star in international and independent cinema with films like Pushing Hands (1991) and The Wedding Banquet (1993), but it was only when he directed a flawless adaptation of Jane Austen’s Sense and Sensibility (1995) that he became a household name, one who would soon get the greenlight for films like Crouching Tiger, Hidden Dragon (2000) and Hulk (2003). Sense and Sensibility benefits greatly, too, from a marvelous cast with Emma Thompson, Hugh Grant, Kate Winslet, and Alan Rickman among its ensemble. It also captured the sophisticated satire and melancholy underpinnings of Austen’s pen that most previous Hollywood adaptations never scratched.

    It established a standard by which the majority of the best Austen adaptations are measured to this day, whether Joe Wright and Keira Knightley’s cinematic Pride and Prejudice a decade later, the various attempts at Emma, from Gwyneth Paltrow in the 1990s to Anya Taylor-Joy this decade, or even Netflix’s recent Dakota Johnson-led Persuasion adaptation.

    Lucy in Bram Stoker's Dracula
    Columbia / Sony

    Gods and Monsters: A Dark Universe

    Meanwhile, right before Columbia Pictures greenlit Scorsese’s The Age of Innocence and later Gillian Armstrong’s still delightful (and arguably definitive) interpretation of Little Women in 1994, the same studio signed off on its first period piece with Winona Ryder attached to star. And it was Dracula.

    Bram Stoker’s Dracula was Francis Ford Coppola‘s wacky and magnificent reimagining of Stoker’s definitive Victorian novel, a project that rivals snickered at during production with nicknames referencing a notorious Brian De Palma bomb from 1990. Published in 1897 with on-the-nose metaphors for London society’s anxieties over foreigners, sexual promiscuity and disease, and the so-called “New Woman” entering the professional classes, the novel was full of potential that Coppola saw in the well-worn and oft-adapted vampire story. If he could combine all those elements into an exciting and anachronistic fever dream for the MTV generation, he correctly predicted there would be a box office hit.

    Love or hate Coppola’s looseness with Stoker’s novel—which is pretty audacious since he put the author’s name in the title—Coppola crafted one of the most sumptuous and expensive depictions of Victorian society ever put onscreen, winning costume designer Eiko Ishioka an Oscar for the effort. He also made an unexpected holiday hit that played like bloody gangbusters alongside Home Alone 2 and Aladdin that winter.

    It established a template for what can in retrospect be regarded as a pseudo-“dark universe” of classic literary monsters getting ostensibly faithful and expensive adaptations from Hollywood. Coppola himself produced Kenneth Branagh’s Mary Shelley’s Frankenstein (1994), a film that is actually in many ways closer to the thematic letter of its author than Bram Stoker’s Dracula ever was. While it was a lesser film, and one that flopped, it looked fantastic as the only major Frankenstein film to bear in mind Shelley’s late 18th-century setting.

    Yet while Frankenstein failed, Tom Cruise and Neil Jordan found plenty of success the same year adapting Anne Rice’s Interview with the Vampire. The book admittedly was recent, having been published in 1976, but the story’s roots and setting in 18th and 19th century bayou occultism were not. In a scene dripping with homoeroticism, the actor who played Top Gun‘s Maverick sank fangs into a young Brad Pitt’s neck.

    This trend continued throughout the ’90s with some successes, like Tim Burton’s wildly revisionist (and Coppola-produced) Sleepy Hollow in 1999, and some misses. Do you recall, for instance, Julia Roberts playing the not-so-good doctor’s maid in a revisionist version of Robert Louis Stevenson’s The Strange Case of Dr. Jekyll and Mr. Hyde at the height of her stardom? It’s called Mary Reilly (1996), by the by.

    Denzel Washington and Keanu Reeves in Much Ado About Nothing
    The Samuel Goldwyn Company

    Shakespeare’s Resurrection

    Of course when talking about classic literature and storytelling, one name rises above most others in the schools and curriculums of the English-speaking world. Intriguingly though, it took until the 1990s for someone to genuinely succeed at building a blockbuster almost exclusively on the Bard: Baz Luhrmann, who translated Romeo and Juliet’s tragedy into MTV’s visual language in 1996. He even stylized the title as William Shakespeare’s Romeo + Juliet.

    That proved the tip of an anachronistic iceberg whose cast included Leonardo DiCaprio at the height of his heartthrob powers as Romeo and real-life teenager Claire Danes as his Capulet amore. With hyper music video editing and frenetic neon-hued melodrama, their Verona was a Neverland composite of Miami, Rio de Janeiro, and the nightly news. Some older scholars viewed Luhrmann’s anachronisms as an abomination, but as a Millennial, I can attest we loved this thing back in the day. Many still do.

    But it was hardly the first box office breakout for Shakespeare in the ’90s. When the decade began, the helmer of another cinematic Romeo and Juliet classic from a different era, Franco Zeffirelli, attempted to make Hamlet exciting for “kids these days” by casting Mel Gibson, right in the midst of his Lethal Weapon popularity, as the indecisive Dane. It’s difficult to remember today, but Gibson was a heartthrob of sorts in the 1980s and early 1990s, a marquee star deemed worthy of romantic leading man roles.

    Nonetheless, there is quite a bit to like about Hamlet (1990) if you can look past Gibson’s off-screen behavior in the following decades, or the fact that Zeffirelli cuts what is a four-hour play down to less than two and a half hours. Gibson actually makes for a credible and genuinely mad Hamlet, and unsurprisingly Zeffirelli uses production design, costumes, and location shooting at actual Norman castles to convey the medieval melancholy of the story. Plus, Helena Bonham Carter remains the best Ophelia ever put to screen. Hamlet (1990) would eventually be overshadowed, though, both by Gibson’s awful behavior and by a much grander and more bombastic adaptation from the man who became the King of Shakespeare Movies in the ’90s: Kenneth Branagh.

    Aye, Branagh might get the most credit for the Shakespearean revival in this era, starting with his 1989 adaptation of Henry V, which featured Derek Jacobi, Brian Blessed, and of course his future wife Emma Thompson. Together the pair would mount what is in this writer’s opinion the best film ever based on a Shakespeare play, the divine and breezy Much Ado About Nothing (1993), a perfect encapsulation of perhaps the first romantic comedy ever written, featuring Branagh and Thompson as the sharp-tongued, dueling lovers Benedick and Beatrice. Additionally, it features Kate Beckinsale in her breakout role, Denzel Washington as a shrewd Renaissance prince, and a gloriously over-the-top score by Patrick Doyle.

    It would define the style of Branagh’s subsequent ’90s efforts, whether they went off the rails, as in the aforementioned Frankenstein, or right back onto them in the 70mm-filmed, ultra-wide and sunny adaptation of Hamlet he helmed in 1996. Avoiding the psychological and Freudian interpretations of the Danish prince chased by Olivier and Zeffirelli, Branagh turns Hamlet into a romantic hero spearheading an all-star ensemble cast. Hamlet (1996) is indulgent at the play’s full length of four hours, yet somehow that befits the material. Branagh would also play Iago opposite Laurence Fishburne in Othello (1995) and direct his own musical riff on the Bard, Love’s Labour’s Lost (2000).

    It paved the way for more outside-the-box Shakespeare movies by the end of the decade, like Julie Taymor’s deconstructionist Titus (1999) and the 1999 A Midsummer Night’s Dream where Kevin Kline turns into an ass and makes out with Michelle Pfeiffer.

    Paul Rudd and Alicia Silverstone in Clueless
    CBS via Getty Images

    The Teenage Shakespeare Remix (and Austen, Chaucer, and Others): The Birth of the…

    As popular as the Shakespeare movie became in the ’90s, what’s curiously unique about this era is the simultaneous rise of movies that adapted either the Bard or other highly respected literary writers and turned them into pure teenage dreams. And we’re not just talking about modernizing Romeo and Juliet like Luhrmann did, or repurposing it for New York street gangs like Leonard Bernstein and Stephen Sondheim did with West Side Story.

    These were straight, unapologetic youth films that also proved clever reworkings of classic storytelling structure. Among the best directly derived from Shakespeare is the movie that made Julia Stiles and Heath Ledger Gen-X icons, 10 Things I Hate About You (1999), a happily campy update of The Taming of the Shrew set in a fairytale high school also populated by future Christopher Nolan favorites like Joseph Gordon-Levitt and David Krumholtz. Stiles would perform this kind of remix again in the more serious-faced modernization of Othello, O (2001), which cast Mekhi Phifer as a tragically distrusting high school sports star rather than a warrior, and in Michael Almereyda and Ethan Hawke’s Hamlet (2000), the third Hamlet film in ten years, this one set in turn-of-the-millennium NYC.

    Ledger also returned to the concept by adapting another, even older literary giant, in this case the medieval poet Geoffrey Chaucer, for A Knight’s Tale (2001), an anachronistic blending of the medieval and modern where peasants in the jousting tournament stands grooved to Queen. There was also the odd attempt to turn Pierre Choderlos de Laclos’ 1782 novel Dangerous Liaisons into an erotic thriller for teenagers (the 1990s were weird, huh?) via the lusty Cruel Intentions (1999).

    However, easily the best of these remains Amy Heckerling’s Clueless (1995), a pitch-perfect transfer of Jane Austen’s Emma from the Regency period to a fairytale version of 1990s Beverly Hills. Cher (Alicia Silverstone) is a charmed SoCal princess so well-intentioned in her matchmaking mischief that she defies any attempt to detest her entitlement or vanity, ignoring modern trends and simply inventing her own. You’re even kind of low-key chill that the happy ending is her hooking up with her stepbrother (Paul Rudd). It’s timeless!

    And the Rest

    There are many, many more examples we could examine from this era. These can include the sublime, like the Gillian Armstrong-directed Little Women of 1994 starring Winona Ryder, Claire Danes, and Kirsten Dunst; and they can include the wretched, like the Demi Moore and Gary Oldman-led The Scarlet Letter (1995). There were more plays adapted, a la Arthur Miller’s The Crucible (again with Ryder and Day-Lewis!), and then those that just had some fun with playwrights, as seen in the over-celebrated Shakespeare in Love (1998). The inklings of the sword-and-sandals return in 2000 were even hinted at by Mel Gibson going full medieval (and ahistorical) on the costumed drama in Braveheart (1995).

    More than a few of these won Best Picture Oscars as well, including Braveheart, Shakespeare in Love, and James Cameron’s little 1997 movie you might have heard about elsewhere: Titanic. And yet, by and large, this kind of film has vanished. Once in a while one comes along that still works, such as Greta Gerwig’s own revisionist interpretation of Little Women. That gorgeous movie was a big hit in 2019, but it did not exactly usher in a new era of literary adaptations.

    Now such projects, like everything else not considered four-quadrant intellectual property by studio bean counters, are mostly relegated to long-form streaming series. Which in some cases is fine. Many would argue the best version of Pride & Prejudice was the BBC production… also from the ’90s, mind. But whether it is original period piece films or adaptations, unless you’re Robert Eggers (who arguably isn’t making films for the same mainstream sensibility the likes of Gerwig or, for that matter, Coppola were), period piece storytelling and “great adaptations” have been abandoned to the small screen and full-on wish-fulfillment anachronisms like Bridgerton.

    This seems due to studios increasingly eschewing anything that isn’t reliably based on a brand that middle-aged adults loved. But in that case… it might be worth reminding them that ’90s kids are getting older and having children of their own. There may again be a market beyond the occasional Gerwig swing, or Eggers take on Dracula, for classic stories; a new audience being raised to want modern riffs inspired by tales that have endured for years and centuries. These stories are mostly in the public domain too. And recent original hits like Sinners suggest you don’t even need a classic story to connect with audiences. So perhaps once again, a play’s the thing in which they can catch the conscience of the… consumer? Or something like that.

    The post The 1990s Were a Golden Age for Period Piece Movies and Literary Adaptations appeared first on Den of Geek.

  • Asynchronous Design Critique: Getting Feedback

    Asynchronous Design Critique: Getting Feedback

    “Any comment?” is probably one of the worst ways to ask for feedback. It’s vague and open ended, and it doesn’t provide any indication of what we’re looking for. Getting good feedback starts earlier than we might expect: it starts with the request. 

    It might seem counterintuitive to start the process of receiving feedback with a question, but that makes sense if we realize that getting feedback can be thought of as a form of design research. In the same way that we wouldn’t do any research without the right questions to get the insights that we need, the best way to ask for feedback is also to craft sharp questions.

    Design critique is not a one-shot process. Sure, any good feedback workflow continues until the project is finished, but this is particularly true for design because design work continues iteration after iteration, from a high level to the finest details. Each level needs its own set of questions.

    And finally, as with any good research, we need to review what we got back, get to the core of its insights, and take action. Question, iteration, and review. Let’s look at each of those.

    The question

    Being open to feedback is essential, but we need to be precise about what we’re looking for. Just saying “Any comment?”, “What do you think?”, or “I’d love to get your opinion” at the end of a presentation—whether it’s in person, over video, or through a written post—is likely to get a number of varied opinions or, even worse, get everyone to follow the direction of the first person who speaks up. And then… we get frustrated because vague questions like those can turn a high-level flows review into people commenting on the borders of buttons instead. Which might be a worthy topic in itself, but it might be hard at that point to redirect the team to the subject that you had wanted to focus on.

    But how do we get into this situation? It’s a mix of factors. One is that we don’t usually consider asking as a part of the feedback process. Another is how natural it is to just leave the question implied, expecting the others to be on the same page. Another is that in nonprofessional discussions, there’s often no need to be that precise. In short, we tend to underestimate the importance of the questions, so we don’t work on improving them.

    The act of asking good questions guides and focuses the critique. It’s also a form of consent: it makes it clear that you’re open to comments and what kind of comments you’d like to get. It puts people in the right mental state, especially in situations when they weren’t expecting to give feedback.

    There isn’t a single best way to ask for feedback. It just needs to be specific, and specificity can take many shapes. A model for design critique that I’ve found particularly useful in my coaching is the one of stage versus depth.

    “Stage” refers to each of the steps of the process—in our case, the design process. In progressing from user research to the final design, the kind of feedback evolves. But within a single step, one might still review whether some assumptions are correct and whether there’s been a proper translation of the amassed feedback into updated designs as the project has evolved. A starting point for potential questions could derive from the layers of user experience. What do you want to know: Project objectives? User needs? Functionality? Content? Interaction design? Information architecture? UI design? Navigation design? Visual design? Branding?

    Here are a few example questions, precise and to the point, that refer to different layers:

    • Functionality: Is automating account creation desirable?
    • Interaction design: Take a look through the updated flow and let me know whether you see any steps or error states that I might’ve missed.
    • Information architecture: We have two competing bits of information on this page. Is the structure effective in communicating them both?
    • UI design: What are your thoughts on the error counter at the top of the page that makes sure that you see the next error, even if the error is out of the viewport? 
    • Navigation design: From research, we identified these second-level navigation items, but once you’re on the page, the list feels too long and hard to navigate. Are there any suggestions to address this?
    • Visual design: Are the sticky notifications in the bottom-right corner visible enough?

    The other axis of specificity is about how deep you’d like to go on what’s being presented. For example, we might have introduced a new end-to-end flow, but there was a specific view that you found particularly challenging and you’d like a detailed review of that. This can be especially useful from one iteration to the next where it’s important to highlight the parts that have changed.

    There are other things that we can consider when we want to achieve more specific—and more effective—questions.

    A simple trick is to remove generic qualifiers from your questions like “good,” “well,” “nice,” “bad,” “okay,” and “cool.” For example, asking, “When the block opens and the buttons appear, is this interaction good?” might look specific, but you can spot the “good” qualifier, and convert it to an even better question: “When the block opens and the buttons appear, is it clear what the next action is?”

    Sometimes we actually do want broad feedback. That’s rare, but it can happen. In that sense, you might still make it explicit that you’re looking for a wide range of opinions, whether at a high level or with details. Or maybe just say, “At first glance, what do you think?” so that it’s clear that what you’re asking is open ended but focused on someone’s impression after their first five seconds of looking at it.

    Sometimes the project is particularly expansive, and some areas may have already been explored in detail. In these situations, it might be useful to explicitly say that some parts are already locked in and aren’t open to feedback. It’s not something that I’d recommend in general, but I’ve found it useful to avoid falling again into rabbit holes of the sort that might lead to further refinement but aren’t what’s most important right now.

    Asking specific questions can completely change the quality of the feedback that you receive. People with less refined critique skills will now be able to offer more actionable feedback, and even expert designers will welcome the clarity and efficiency that comes from focusing only on what’s needed. It can save a lot of time and frustration.

    The iteration

    Design iterations are probably the most visible part of the design work, and they provide a natural checkpoint for feedback. Yet many design tools with inline commenting tend to show changes as a single fluid stream in the same file: conversations disappear once they’re resolved, shared UI components update automatically, and designs always show the latest version—unless these would-be helpful features are manually turned off. The implied goal of these tools seems to be arriving at one final copy with all discussions closed, probably because they inherited patterns from how written documents are collaboratively edited. That’s probably not the best way to approach design critiques, though I don’t want to be too prescriptive here: it could work for some teams.

    The asynchronous design-critique approach that I find most effective is to create explicit checkpoints for discussion. I’m going to use the term iteration post for this. It refers to a write-up or presentation of the design iteration followed by a discussion thread of some kind. Any platform that can accommodate this structure can be used. By the way, when I refer to a “write-up or presentation,” I’m including video recordings or other media too: as long as it’s asynchronous, it works.

    Using iteration posts has many advantages:

    • It creates a rhythm in the design work so that the designer can review feedback from each iteration and prepare for the next.
    • It makes decisions visible for future review, and conversations are likewise always available.
    • It creates a record of how the design changed over time.
    • Depending on the tool, it might also make it easier to collect feedback and act on it.

    Of course, these posts don’t mean that no other feedback approach should be used; rather, iteration posts can be the primary rhythm for a remote design team, with other feedback approaches (such as live critiques, pair designing, or inline comments) building from there.

    I don’t think there’s a standard format for iteration posts. But there are a few high-level elements that make sense to include as a baseline:

    1. The goal
    2. The design
    3. The list of changes
    4. The questions

    Each project is likely to have a goal, and hopefully it’s something that’s already been summarized in a single sentence somewhere else, such as the client brief, the product manager’s outline, or the project owner’s request. So this is something that I’d repeat in every iteration post, literally copying and pasting it. The idea is to provide context and to repeat what’s essential to make each iteration post complete, so that there’s no need to hunt for information spread across multiple posts. If I want to know about the latest design, the latest iteration post will have all that I need.

    This copy-and-paste part introduces another relevant concept: alignment comes from repetition. So having posts that repeat information is actually very effective toward making sure that everyone is on the same page.

    The design is then the actual series of information-architecture outlines, diagrams, flows, maps, wireframes, screens, visuals, and any other kind of design work that’s been done. In short, it’s any design artifact. For the final stages of work, I prefer the term blueprint to emphasize that I’ll be showing full flows instead of individual screens to make it easier to understand the bigger picture. 

    It can also be useful to label the artifacts with clear titles because that can make it easier to refer to them. Write the post in a way that helps people understand the work. It’s not too different from organizing a good live presentation. 

    For an efficient discussion, you should also include a bullet list of the changes from the previous iteration to let people focus on what’s new, which can be especially useful for larger pieces of work where keeping track, iteration after iteration, could become a challenge.

    And finally, as noted earlier, it’s essential that you include a list of the questions to drive the design critique in the direction you want. Doing this as a numbered list can also help make it easier to refer to each question by its number.

    Not all iterations are the same. Earlier iterations don’t need to be as tightly focused—they can be more exploratory and experimental, maybe even breaking some of the design-language guidelines to see what’s possible. Then later, the iterations start settling on a solution and refining it until the design process reaches its end and the feature ships.

    I want to highlight that even if these iteration posts are written and conceived as checkpoints, by no means do they need to be exhaustive. A post might be a draft—just a concept to get a conversation going—or it could be a cumulative list of each feature that was added over the course of each iteration until the full picture is done.

    Over time, I also started using specific labels for incremental iterations: i1, i2, i3, and so on. This might look like a minor labeling tip, but it can help in multiple ways:

    • Unique—It’s a clear unique marker. Within each project, one can easily say, “This was discussed in i4,” and everyone knows where they can go to review things.
    • Unassuming—It works like versions (such as v1, v2, and v3), but without the baggage: versions create the impression of something that’s big, exhaustive, and complete, while iterations need room to be exploratory, incomplete, and partial.
    • Future proof—It resolves the “final” naming problem that you can run into with versions. No more files named “final final complete no-really-its-done.” Within each project, the largest number always represents the latest iteration.

    To mark when a design is complete enough to be worked on, even if some details still need attention (and, in turn, more iterations), you can borrow the term release candidate (RC): “with i8, we reached RC” or “i12 is an RC.”

    The review

    What usually happens during a design critique is an open discussion, with a back and forth between people that can be very productive. This approach is particularly effective during live, synchronous feedback. But when we work asynchronously, it’s more effective to use a different approach: we can shift to a user-research mindset. Written feedback from teammates, stakeholders, or others can be treated as if it were the result of user interviews and surveys, and we can analyze it accordingly.

    This shift has some major benefits that make asynchronous feedback particularly effective, especially around these friction points:

    1. It removes the pressure to reply to everyone.
    2. It reduces the frustration from swoop-by comments.
    3. It lessens our personal stake.

    The first friction point is feeling the pressure to reply to every single comment. Sometimes we write the iteration post and get only a few replies from our team; that’s easy, and it doesn’t feel like a problem. But other times, some solutions might require more in-depth discussions, and the number of replies can quickly increase, creating tension between trying to be a good team player by replying to everyone and getting on with the next design iteration. This might be especially true if the person replying is a stakeholder or someone directly involved in the project who we feel we need to listen to. This pressure is absolutely normal: it’s human nature to try to accommodate people we care about. Sometimes replying to all comments can be effective, but if we treat a design critique more like user research, we realize that we don’t have to reply to every comment, and in asynchronous spaces, there are alternatives:

    • One is to let the next iteration speak for itself. When the design evolves and we post a follow-up iteration, that’s the reply. You might tag all the people who were involved in the previous discussion, but even that’s a choice, not a requirement. 
    • Another is to briefly reply to acknowledge each comment, such as “Understood. Thank you,” “Good points—I’ll review,” or “Thanks. I’ll include these in the next iteration.” In some cases, this could also be just a single top-level comment along the lines of “Thanks for all the feedback everyone—the next iteration is coming soon!”
    • Another is to provide a quick summary of the comments before moving on. Depending on your workflow, this can be particularly useful as it can provide a simplified checklist that you can then use for the next iteration.

    The second friction point is the swoop-by comment: the kind of feedback that comes from someone outside the project or team who might not be aware of the context, restrictions, decisions, or requirements, or of the previous iterations’ discussions. Such comments often trigger the simple thought “We’ve already discussed this…”, and it can be frustrating to have to repeat the same reply over and over. On their side, one can hope that swoop-by commenters learn to recognize when they’re doing this and become more conscious about outlining where they’re coming from.

    Let’s begin by acknowledging again that there’s no need to reply to every comment. If, however, replying to a previously litigated point might be useful, a short reply with a link to the previous discussion for extra details is usually enough. Remember, alignment comes from repetition, so it’s okay to repeat things sometimes!

    Swoop-by comments can still be useful for two reasons: they might point out something that still isn’t clear, and they have the potential to stand in for the point of view of a user who’s seeing the design for the first time. Sure, you’ll still be frustrated, but that might at least help in dealing with it.

    The third friction point is the personal stake we could have with the design, which could make us feel defensive if the review were to feel more like a discussion. Treating feedback as user research helps us create a healthy distance between the people giving us feedback and our ego (because yes, even if we don’t want to admit it, it’s there). And ultimately, treating everything in aggregated form allows us to better prioritize our work.

    Always remember that while you need to listen to stakeholders, project owners, and specific advice, you don’t have to accept every piece of feedback. You have to analyze it and make a decision that you can justify, but sometimes “no” is the right answer. 

    As the designer leading the project, you’re in charge of that decision. Ultimately, everyone has their specialty, and as the designer, you’re the one who has the most knowledge and the most context to make the right decision. And by listening to the feedback that you’ve received, you’re making sure that it’s also the best and most balanced decision.

    Thanks to Brie Anne Demkiw and Mike Shelton for reviewing the first draft of this article.