Hacker News | hutao's comments

Today, we tend to see The Great Gatsby as a work of historical literature, as it gives a window into the Roaring Twenties. However, F. Scott Fitzgerald did not set out to depict the past; he was depicting his own present. Similarly, Proust's novels are seen as a window into the French high society of the Belle Époque, a society in which Proust himself lived.

Which works today do you think future generations will see as the classics of the 2010s and 2020s? They may not necessarily be works of literature; they could be other storytelling mediums, such as film.


> Which works today do you think future generations will see as the classics of the 2010s and 2020s?

South Park?

Maybe collectively those "actually the villain is just misunderstood" movies I hear are becoming a thing recently? They seem like a decent candidate for the "window into the culture of the time" thing.

Some of those wide-audience computer games like Candy Crush and Farmville?


> Maybe collectively those "actually the villain is just misunderstood" movies I hear are becoming a thing recently?

This trend is a good decade or more old. The first one I can think of offhand is the 2014 movie Maleficent.


Unironically dril's tweets[1].

Part of the problem of our time is that shared culture has significantly receded. There's little capacity to maintain "classics" as we understand them today. Take any massive artistic output (film, book, TV show) and nowadays it's either not seen/read/heard by more than 20% of the population, or it's a flash-in-the-pan hit that will be forgotten in another year or so (e.g. Barbenheimer).

1. https://www.reddit.com/r/dril/comments/cqde0e/pound_for_poun...


The Year is 2026.

Civilization has progressed little in the last 100 years.



The ones that most qualify for me are:

* Minecraft: This will probably hold the position Pac-Man holds for us. Perhaps the single biggest cultural marker.

* Taylor Swift: She's to this era what Michael Jackson was to his (perhaps amplified by improvements in access and audience size)

I don't know if Wikipedia or the mainline social media websites would count. People remember The Myspace Era. Tumblr and Twitter have reputations for their culture but would they be classics? Hard to tell.


> Which works today do you think future generations will see as the classics of the 2010s and 2020s?

I think the film Tár will be. It captures the “fake it until you make it” spirit of the present really well along with the god complex and repressed guilt that accompany “making it”. Also the performances and direction are just excellent.


Great choice. Saw a free screening.


There’s a lot of discussion flying back and forth as to whether we’re in a period of cultural stagnation. Obviously you need to heavily discount the possibility of “old fogeyism” and reactionary nostalgia whenever you make such claims, and historically a lot of them have been totally false, but the one argument I have found convincing that we’re not in a good cultural place right now is the difficulty of coming up with such a work.

What work captured the zeitgeist of the 2010s and so far of the 2020s? I certainly can’t think of any novels that did it, far too much of that literary decade was about self-obsessed New Yorkers and had no relevance to anyone else. None of the reactions against it (i.e. Dimes Square) produced anything of lasting note either.


The Social Network movie?


Breaking Bad, if we're lucky. More likely some superhero movie with a title like ThunderMan VI: Dawn of the Mayhem Battalion.


To be fair, it's an accurate assessment. Superhero movies like that are a defining feature of the last two decades, with titles and plots worsening at an exponential rate. Not that prior decades lacked superheroes; they just used to be less superficial.


Game of Thrones will be. No main character has been safe since Ned Stark was killed on TV.

No ending was going to be good. The Sopranos had a controversial ending too.

Silicon Valley captured the feel of the Bay Area and did a good job with satire.

The 2000s would be easier; I think 2010 is about when the switch to superhero movies happened.


No one even talks about GoT anymore because the ending was so bad.

> No ending was going to be good

Why do you say that? Plenty of amazing shows have great endings. And GOT isn't some uniquely incredible story. Killing main characters is not new to GOT, either, and you give that show too much credit. Lost did it long before GOT. So did 24. Grey's Anatomy is SUPER famous for it. They killed off like half the original cast in a single plane crash.

ASOIAF wasn't original because it killed main characters. It was original because it treated fantasy as political first, fantasy second.


None of those shows was a pop-culture phenomenon like GoT was when it was on.

It only ended a few years ago. The Office took over 10 years to have a popularity resurgence.


Lost was for its first season or two. It just got steadily worse instead of suddenly worse like Game of Thrones.


Perhaps George R. R. Martin will finish the books some day, and we'll find out how he thinks it should end.


I’m starting to think he was honest that this was his intended ending, and he’s given up as a result.


I don't doubt that the show's ending mirrors the books' intended ending. A huge part of it, though, is how you get to that ending, which the show rushed and fumbled horribly. Bran becoming King could work, but not when he basically shows up out of nowhere and nobody knows what happened to him or what he is capable of; he disappeared for years, and when he came back he said a few nonsense things but wasn't involved in much of the politicking that could have made him a viable candidate for King. The same goes for Dany going full Mad King: after seasons of showing her trying not to become a crazy ruler, she suddenly snaps, instead of going through a series of harder and harder choices that turn out worse each time and drive her to more relatable desperation and violence.

At minimum they needed a full extra season and a full final season, if not more. But without GRRM handholding them throughout the entire plot they completely lost the path.


There is a good ending to Game of Thrones: evil wins, everyone dies. All the fools who pursued their own interests rather than face an annihilating threat get annihilated. It's right there in the show's motto. "Winter is coming."

The writers just lacked the courage to do it. They tried to tack a Disney ending onto a tragedy.


Either that or Cersei being queen would've been the correct ending.

The Lannisters would've had the only real army left without that WWE-style defeat of the Night King. Cersei consistently outwitted everyone (except Tommen, I guess), and the Lannisters knew how to buy loyalty.

Instead we ended up with the usual plot armor, and a "twist" that the character who behaved like a tyrannical zealot for 7+ seasons was, in fact, a tyrannical zealot.


"they really liked pedophile vampires for a hot minute in the 2000s"


> TypeScript’s type system is purely structural and exists only at compile time. It has no way to verify that your function actually implements what its signature claims. You can declare that a function transforms a User into a SafeUser, and as long as the return object has the required fields of SafeUser, TypeScript doesn’t care what additional properties might still be lurking in there.

> This is fundamentally different from languages like Rust, where the type system can actually guarantee that if you claim to return an Option<T>, you genuinely can’t return null, the compiler enforces the contract at the language level. Rust’s type system doesn’t just trust your annotations; it verifies them.

This design where types are present at compile-time but disappear at runtime is called type erasure, and it's extremely common. For example, Java's generics are type erased. If you have some Java class Foo<T, U>, in the bytecode it will simply become Foo, and T and U will become Object. Therefore, you cannot use runtime introspection to recover their instantiations.

The remark contrasting TypeScript to Rust seems a little confused. Rust also uses type erasure; types and lifetimes are checked by the compiler, then the compiler produces a native executable, which is just machine code and does not contain type information. Option<&T> could be treated as a pointer T*, because the niche optimization ensures that the Option::None variant is represented as 0 or NULL. If C code were to interact with Rust code via FFI, it would be able to pass a value of 0.

However, Rust doesn't have a null value the way that it's commonly understood in languages such as Java, C#, or JavaScript: a distinguished value that denotes a "sentinel" reference that does not refer to any object. I would say that the null reference is semantically a higher-level concept, specific to these particular programming languages.
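To make the structural side concrete, here's a minimal TypeScript sketch (User/SafeUser follow the quoted article's illustration; the passwordHash field is my own hypothetical example):

```typescript
interface SafeUser {
  id: number;
  name: string;
}

interface User extends SafeUser {
  passwordHash: string;
}

// Type-checks: structurally, a User already has every field SafeUser requires.
// But nothing strips the extra property, and the SafeUser type itself is
// erased at runtime (there is no `instanceof SafeUser` to check against).
function toSafeUser(user: User): SafeUser {
  return user;
}

const leaked = toSafeUser({ id: 1, name: "Ada", passwordHash: "hunter2" });
console.log("passwordHash" in leaked); // prints "true": the field is still there
```

The "fix" in real code is to construct a fresh object (e.g. `return { id: user.id, name: user.name }`), since the type annotation alone guarantees nothing about extra properties.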

Philosophically, the notion of type erasure goes all the way back to Curry-style (extrinsic) typing, which is contrasted with Church-style (intrinsic) typing. For example, in Curry-style typing, the program (fun x -> x) is the identity function on all types, while in Church-style typing, each type A has its own identity function, (fun (x : A) -> x) and a program is meaningless without types.

Please correct me if I'm wrong or misunderstood!


I think what the author is trying to say is that the type system of TypeScript is unsound, while Rust's type system is (hopefully) sound.
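If that's the claim, TypeScript's unsoundness is easy to demonstrate in a few lines. One classic example (my own illustration, not from the article) is that mutable arrays are treated covariantly:

```typescript
// Covariant mutable arrays: an accepted program that lies about its types.
const dogs: { bark(): void }[] = [{ bark: () => console.log("woof") }];
const animals: object[] = dogs; // allowed: { bark(): void }[] is assignable to object[]
animals.push({});               // also allowed: pushing a plain object into object[]
// dogs[1].bark();              // type-checks, but would crash at runtime: bark is undefined
```

Rust rejects this kind of aliased covariant mutation outright, which is part of what "sound" means in practice.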


It looks like the OP is actually talking about structural typing vs. nominal typing, which makes more sense, because Rust is nominally typed (the newtype pattern, for example), whereas TypeScript is structurally typed.

And you’re right, this has nothing to do with type erasure.
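For what it's worth, TypeScript can approximate nominal typing with the "branded type" idiom. A rough sketch (the UserId/OrderId names are hypothetical, not from the thread):

```typescript
// Branded types: structurally distinct aliases over the same runtime value.
type UserId = string & { readonly __brand: "UserId" };
type OrderId = string & { readonly __brand: "OrderId" };

const userId = (s: string) => s as UserId;
const orderId = (s: string) => s as OrderId;

function fetchUser(id: UserId): string {
  return `user:${id}`; // placeholder body
}

const u = fetchUser(userId("42")); // OK
// fetchUser(orderId("42"));       // compile error: OrderId is not UserId
console.log(u);                    // prints "user:42"
```

The brand exists only at the type level (it's erased at runtime), but it makes two otherwise identical structures non-interchangeable, which is the nominal behavior Rust's newtype pattern gives you for free.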


When I first tried to learn Vulkan, I felt the exact same way. As I was following the various Vulkan tutorials online, I felt that I was just copying the code, without understanding any of it and internalizing the concepts. So, I decided to learn WebGPU (via the Google Dawn implementation), which has a similar "modern" API to Vulkan, but much more simplified.

The commonalities to both are:

- Instances and devices

- Shaders and programs

- Pipelines

- Bind groups (in WebGPU) and descriptor sets (in Vulkan)

- GPU memory (textures, texture views, and buffers)

- Command buffers

Once I was comfortable with WebGPU, I eventually felt restrained by its limited feature set. The restrictions of WebGPU gave me the motivation to go back to Vulkan. Now, I'm learning Vulkan again, and this time, the high-level concepts are familiar to me from WebGPU.

Some limitations of WebGPU are its lack of push constants and the "pipeline explosion" problem (which Vulkan tries to solve with the pipeline library, dynamic state, and shader object extensions). Meanwhile, Vulkan requires you to manage synchronization explicitly with fences and semaphores, which meant an additional learning curve for me coming from WebGPU. Vulkan also does not provide a memory allocator (most people use the VMA library).

SDL_GPU is another API at a similar abstraction level to WebGPU, and could be an easier starting point than Vulkan. So if you're still interested in learning graphics programming, WebGPU or SDL_GPU could be good to check out.

