Any ranking project makes an argument through its inclusions and exclusions before it scores anything. This is ours — made fully visible.
A note on the total

The methodology below produced 1,104 films. A further 74 were added by hand after the dataset was built — films that should have been included but fell outside the six source lists due to genre bias, the structural invisibility of documentary, or non-Western origin. They are scored on the same seven dimensions and three frameworks. Each one is flagged in the ranking with the specific reason the methodology missed it. An additional 10 films were subsequently added as 98th Academy Awards Best Picture nominees (2026), bringing the core total to 1,114 and the overall dataset to 1,188.
Before scoring a single film, we had to answer a more fundamental question: which films should be in the dataset at all? This is not a trivial decision. Any ranking project makes an argument through its inclusions and exclusions before it scores anything. A dataset drawn entirely from Oscar nominees makes one argument about what cinema is. A dataset drawn entirely from box office records makes a completely different one. We wanted a dataset that could hold both arguments simultaneously — and be honest about what it could and couldn't see.
The 1,114 films in the core No Soft Opinions index were drawn from six source lists using a cross-list comparison methodology designed to surface films with genuine multi-dimensional significance rather than simply aggregating everything that appeared anywhere.
The foundation is every Oscar Best Picture nominee across the Academy's history. To this we added the top 100 unique films from each of five further source lists — selecting films that appeared on those lists but were not already represented through the Oscar nominees. The word "unique" matters: it means each supplementary list contributes meaningfully rather than being drowned out by overlap with the others, and it means a film earns its place through a specific dimension of significance rather than simply accumulating volume. A film that appears on only one list — commercially enormous but critically dismissed, or canonically essential but commercially invisible — still enters through its strongest dimension. Both kinds of significance are respected, and neither is artificially inflated by cross-list frequency.
A film that appears on multiple source lists — The Godfather scores highly across all six — accumulates presence naturally, without needing any single list to make its case.
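The selection procedure above can be sketched in a few lines. This is an illustrative reading, not the project's actual code: the list names and film titles are placeholders, and it assumes one plausible interpretation of "unique" — that each supplementary list's top 100 is counted after deduplicating against everything already admitted, Oscar nominees and earlier lists alike.

```python
def build_core_index(oscar_nominees, supplementary_lists, per_list_cap=100):
    """Seed the index with every Best Picture nominee, then add the top
    `per_list_cap` not-yet-represented films from each supplementary list.

    `supplementary_lists` are assumed to be ordered best-first.
    """
    index = list(oscar_nominees)   # the foundation: all Oscar nominees
    seen = set(oscar_nominees)
    for ranked_list in supplementary_lists:
        added = 0
        for film in ranked_list:
            if added == per_list_cap:
                break              # this list has contributed its quota
            if film not in seen:   # "unique": not already represented
                index.append(film)
                seen.add(film)
                added += 1
    return index

# Toy example with a cap of 1 per list:
core = build_core_index(
    oscar_nominees=["A", "B"],
    supplementary_lists=[["A", "C", "D"], ["C", "E"]],
    per_list_cap=1,
)
# "A" is already in via the nominees, so list one contributes "C";
# "C" is now taken, so list two contributes "E".
```

The cap is what keeps any one list from drowning out the others: a list whose top entries heavily overlap the Oscar base still gets to contribute a full quota of films further down its ranking.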
A small number of films were added by hand where their absence would have been a conspicuous gap. The Room is the most deliberate example — it appears on none of the six source lists by any conventional measure, yet its participatory midnight screening culture, its so-bad-it's-good canonical status, and its role in defining a category of cinema make it more significant to a complete picture of film culture than many critically respected films that entered automatically. The spoon-throwing ritual, the Tommy Wiseau phenomenon, The Disaster Artist connection — these are real cultural data points that a comprehensive cinema index cannot ignore.
A selection of streaming-first releases was also added to represent a category of filmmaking the source lists structurally cannot capture — films made for platforms that never had conventional theatrical releases and therefore generate no box office data, no Oscar eligibility in most cases, and limited critical infrastructure. Six films were added on this basis: KPop Demon Hunters, Red Notice, Happy Gilmore 2, Frankenstein, The Gray Man, and Extraction 2. Each is scored on the same seven dimensions and three frameworks as every other film in the dataset, with streaming viewership used as the D1 equivalent and all provisional scores flagged accordingly. They represent the future shape of cinema, and the methodology needed to be honest about how it handles them. That includes the fact that KPop Demon Hunters has a D6 of 72 — the joint-highest post-2015 cultural footprint score in the entire index, alongside F1 (2025, the Formula 1 racing drama) — driven by a phenomenon that operates almost entirely outside the critical infrastructure the other source lists measure.
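The flagging described above — hand additions carrying the reason the methodology missed them, streaming-first entries carrying a provisional D1 built from viewership — can be pictured as a record shape like the following. The field names and values are hypothetical illustrations, not the project's actual schema.

```python
from dataclasses import dataclass, field

@dataclass
class FilmEntry:
    title: str
    dimensions: dict            # D1..D7 scores on the shared scale
    provisional: bool = False   # True when D1 is a streaming-viewership proxy
    flags: list = field(default_factory=list)  # why the methodology missed it

def add_streaming_film(title, viewership_score, other_dims):
    """Enter a streaming-first film: viewership stands in for D1,
    and the provisional status is recorded on the entry itself."""
    dims = {"D1": viewership_score, **other_dims}
    return FilmEntry(
        title=title,
        dimensions=dims,
        provisional=True,
        flags=["streaming-first: D1 derived from platform viewership"],
    )

entry = add_streaming_film("Example Streaming Film", 70, {"D6": 72})
```

Keeping the flag on the record, rather than in a footnote, is what lets every provisional score travel with its caveat wherever the film appears in the ranking.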
The dataset is a very good picture of what Western institutional film culture — the Academy, Anglophone criticism, global box office, Hollywood studio investment — considered significant cinema between 1920 and 2025. It is a less complete picture of what cinema actually was during that period.
The gap between those two things is not a flaw we are embarrassed about. It is a finding we are transparent about. What got included, what got excluded, and why tells you something true about the structures of film culture that no individual ranking can capture.
Because the source lists are overwhelmingly Anglo-American in orientation, and because institutional recognition has historically been gatekept by specific critical and commercial establishments, the dataset inherits real and documentable biases. These are worth naming directly.
The Academy has historically excluded horror, and critical canonical lists have been slow to recognise it despite its enormous formal influence. Psycho, The Shining, The Exorcist, Get Out — all in the dataset, with D5 Filmmaker Influence scores among the highest in the index. But the horror films that didn't break through institutional barriers sufficiently to appear on multiple lists are absent, and there are many of them. The genre's formal contribution to cinema is larger than its representation here suggests.
Documentary filmmaking has produced some of the most formally innovative and culturally consequential work in cinema history — the observational grammar of the Maysles brothers, the essay film tradition of Chris Marker and Agnès Varda, the ethical provocations of Errol Morris and Joshua Oppenheimer, the commercial breakthroughs of Michael Moore — but documentary rarely generates Oscar nominations, box office figures, or the Rotten Tomatoes aggregation presence needed to enter the source lists automatically. The five that are here all crossed into art cinema canonical lists. The vast majority of significant documentary work did not. Man with a Movie Camera scores a D5 of 90 — among the highest filmmaker influence scores in the entire dataset. The gap between that score and documentary's five-film representation tells you something specific about whose cinema gets written about and preserved.
India produces more films annually than Hollywood, has developed entirely distinct formal conventions for musical integration, melodrama, spectacle, and comedy, and has an audience measured in billions. A film can be the highest-grossing Indian film of all time and not appear on any of our six source lists because those lists are structured around Western critical and commercial infrastructure. The absence is not a judgement on Indian cinema. It is a measurement of how completely Western institutional film culture has failed to engage with it.
The entire body of work from Egyptian cinema — one of the oldest and most prolific film industries in the world — is absent. Nigerian cinema, South African cinema, Ethiopian cinema, Senegalese cinema beyond Sembène — all absent. The D5 score for Touki Bouki is 62, reflecting Djibril Diop Mambéty's documented influence on subsequent African filmmakers, but the films that built on that influence are invisible in the dataset because they never appeared on Western canonical lists. This is the most complete blind spot in the dataset and the one we are most aware of.
Japanese and Korean cinema appear reasonably because both broke through Western institutional barriers sufficiently — Kurosawa, Ozu, and Mizoguchi through the art cinema canon, the Korean New Wave through the commercial and critical breakthrough that culminated in Parasite. Chinese-language cinema has Zhang Yimou and Chen Kaige from the mainland, Edward Yang and Hou Hsiao-hsien from Taiwan — but only partially. Hong Kong cinema has a handful of entries. The breadth of Thai, Indonesian, Vietnamese, and Filipino cinema that never crossed Western critical thresholds is almost entirely absent.
Brazilian Cinema Novo, the Argentine New Wave, Mexican golden age cinema, Colombian and Chilean cinema — representing decades of formal innovation and cultural significance — are invisible in the dataset not because they are insignificant but because the structures that generated our source lists didn't see them. Amores Perros, City of God, Y Tu Mamá También, and a handful of others are here because they achieved Western critical and commercial crossover. The tradition they emerged from is not.
European animation, Eastern European animation, the National Film Board of Canada tradition, independent animation — all largely absent despite the enormous formal influence of filmmakers like Walerian Borowczyk and Jan Švankmajer on what directors like Lynch and Gilliam subsequently did.
These gaps are not incidental. They reflect whose taste has historically defined institutional film culture, whose criticism got written and published and translated and digitised, and whose cinema was absorbed into the training data of the AI that scored these films. The AI that built this index has read more film criticism than any human alive — and that criticism is predominantly Western, predominantly English-language, and predominantly oriented toward the canonical traditions that the Academy and the major critical publications have recognised.
The biases in this dataset are the biases of that critical infrastructure made visible and measurable. Documenting them honestly is more useful than pretending to correct for them with arbitrary additions. We could have added fifty Bollywood films by hand. We chose not to, because doing so would have broken the cross-list methodology that makes the scores consistent and comparable. What we can do — and what we are committed to doing — is being completely transparent about what the methodology can and cannot see, and keeping the framework open.
Naming the gaps is the easy part. Closing them requires something more specific than good intentions.
To address the Bollywood gap properly would require a parallel source list methodology built entirely from Indian critical infrastructure — Filmfare award histories, Indian box office records adjusted for the Indian market, Indian critical publications, and a filmmaker influence list constructed from documented citations within Indian cinema rather than Western canonical lists. The seven dimensions and three frameworks could apply directly; the source lists that feed the dataset cannot.
To address the African cinema gap would require similar source lists built from African institutional recognition — FESPACO award histories, pan-African critical publications, and filmmaker genealogies that trace influence within African cinema rather than measuring influence on Western filmmakers who discovered it. The Touki Bouki D5 score of 62 reflects documented influence on subsequent African filmmakers; that influence network is largely invisible to Western canonical lists.
To address the documentary gap would require a dedicated documentary source list — something like the International Documentary Association's historical awards, critical surveys specifically of documentary, and a filmmaker influence list built around the specific grammar innovations of observational, essay, and direct cinema rather than measuring documentary against narrative film criteria it was never designed to meet.
These are not impossible tasks. They are substantial ones. The framework is ready for them. The source lists are not built yet. If you have expertise in any of these traditions and want to contribute to building them, get in touch.
The seven dimensions and three philosophical frameworks are consistent, transparent, and applicable to any film. Jeanne Dielman and Transformers: Revenge of the Fallen are in the same dataset scored against the same criteria, and that comparison is honest and arguable precisely because the methodology is visible.
That means the index is not a closed canonical list. It is a framework that can appraise any film on the same terms.
If there is a film you believe should be here — a documentary we missed, a Bollywood classic that deserves a score, a horror film the Academy ignored, a film from a tradition the source lists never reached, a recent streaming release that belongs in the conversation — make the case for it. A good submission doesn't need to be long. It needs to argue why the film is significant enough to sit alongside what's already here, and on which of the seven dimensions that significance is strongest. Films that appear on none of the six source lists can still be added if the argument is right — The Room is proof of that. Get in touch on X and make the case.
Make the case on X / Twitter →