Good, and not because of the diversity drama that the US government wants to shoehorn in here. Any font that makes the uppercase "i" and the lowercase "L" look the same is absolute garbage. Yes, I have a strong opinion about this!
A lot of people installed malware and, to be honest, nothing really happened. They might have had to change their passwords, but it could have been much much worse if Android didn't have good sandboxing.
I hope that Flatpak and similar technologies are adopted more widely on desktop computers. With such security technology existing, giving every application full access to the system is no longer appropriate.
You don't, but as far as I know, Flatpak or Snap are the only practical, low-effort ways to do it on standard distros. There's nothing stopping flatpak-like security from being combined with traditional package management and shared libraries. Perhaps we will see this in the future, but I don't see much activity in this area at the moment.
This is simply not true. Bird flu mainly spreads among wild birds and that is where it has its reservoir. It would still exist even if the world was free of bird farms. It also usually doesn't spread between farms because, in the event of an outbreak, all the animals on the affected farm are culled. At most, bird farms slightly increase overall contact between birds and humans.
I don't have a background in law, but here are some suggestions. The German penal code often imposes harsher punishments for the same offense if a weapon was involved. Rape, for example, carries a minimum sentence of two years. If a weapon is present, it is a minimum of three years. If the weapon is used, the minimum sentence is five years.
Before the change, date rape drugs would have fallen under a minimum of three years because of a separate clause.
Classifying them as weapons would also affect crimes other than rape.
Additionally, if legal substances can be used as date rape drugs, classifying them as weapons would give the police more authority to act in certain situations.
It's pretty accurate. I was a bit shocked when I saw that room names were not encrypted. I thought that was such a basic privacy requirement, and it's not hard to implement when you already have message encryption.
Matrix seems to have a lot of these structural flaws. Even the encryption praised in the Reddit post has had problems for years where messages don't decrypt. These issues are patched slowly over time, but you shouldn't need to show me a graph demonstrating how you have slowly decreased the decryption issues. There shouldn't be any to begin with! If there are, the protocol is fundamentally broken.
They are slowly improving everything, with the emphasis on "slowly". It will take years until everything is properly implemented. To answer the question of whether the future of the protocol is promising, I would say yes. This is in no small part because there are currently no real alternatives in this area. If you want an open system, this is the best option.
The decryption problems I've experienced were fixed a while ago. There was a push to fix these last year or the year before that, and at this point I'm pretty sure only some outdated or obscure clients with old encryption libraries still suffer from these problems.
The huge amount of unencrypted metadata is pretty hard to avoid with Matrix, though. It's the inevitable result of stuffing encryption into an unencrypted protocol later, rather than designing the protocol to be encrypted from the start.
I've had similar issues with other protocols too, though. XMPP wouldn't decrypt my messages (because apparently I used the wrong encryption for one of the clients), and Signal got into some funky state where I needed to re-setup and delete all of my old messages before I could use it again. Maintained XMPP clients (both of them) seem to have fixed their encryption support and Signal now has backups so none of these problems should happen again, but this stuff is never easy.
Yes, messaging protocols, especially federated ones, are never easy. I just wish we could have skipped the three or four years when Matrix was basically unusable for the average user because end-to-end encryption was switched on by default. Perhaps a clean redesign would have been better. Now they have to change the wheels on a moving car.
> These issues are patched slowly over time, but you shouldn't need to show me a graph demonstrating how you have slowly decreased the decryption issues. There shouldn't be any to begin with! If there are, the protocol is fundamentally broken.
This is wrong, because afaik these errors happen due to corner cases and I really don't like the attitude here.
It's not just a corner case. The issue was so prevalent for years that if it was limited to just a few corner cases, the entire protocol must consist of nothing but corner cases.
It frequently occurred on the "happy path": on a single server that they control, between identical official clients, in the simplest of situations. There really is no excuse.
I'm not saying that building a federated chat network with working encryption is easy. On the contrary, it is very hard. I'm sure the designers had the best intentions, but they simply lacked the competence to overcome such a challenge and ensure the protocol was mostly functional right from the outset.
> The issue was so prevalent for years that if it was limited to just a few corner cases, the entire protocol must consist of nothing but corner cases.
for me it wasn't really; occasionally it would hit me, but mostly it worked, and I have been using it for encrypted communication since 2020.
> It frequently occurred on the "happy path": on a single server that they control, between identical official clients, in the simplest of situations. There really is no excuse.
There still can be technical corner cases in the interaction of clients
> I'm sure the designers had the best intentions, but they simply lacked the competence to overcome such a challenge and ensure the protocol was mostly functional right from the outset.
Well, even if this was true, they were still brave enough to try, and they eventually pulled it off. Perhaps complain to the competent people who haven't even tried.
> for me it wasn't really; occasionally it would hit me, but mostly it worked, and I have been using it for encrypted communication since 2020.
I think the statistic said that around 10% of users receive at least one "unable to decrypt" message on any given day. That's a lot. Perhaps not for devs who are accustomed to technical frustrations, but for non-technical people, that's far too frequent. Other messaging systems worked much better.
> There still can be technical corner cases in the interaction of clients
You linked to a German political talk show. If you wanted to show me the talk in which the guy listed reasons such as "network requests can fail and our retry logic is so buggy that it often breaks" and "the application regularly corrupts its internal state, so we have to recover from that, which is not always easily possible", let's just say I wasn't that impressed.
> well, even if this was true, they still were brave enough to try and eventually pull it off eventually. Perhaps complain to the competent people who haven't even tried.
It isn't a problem that the Matrix team are not federated networking experts. At the time, they had already received millions in investment. That's not FAANG money, but it's still enough to contract the right people to help design everything properly.
I'm not mad at them. Matrix was a bold effort that clearly succeeded in its aims. I'm just disappointed that it was so unreliable for such a long time, and still is to some extent.
Once again, we have the situation where someone uses an Apache or BSD licence, only to then wonder why others do exactly what the licence allows. If you want others, especially companies, to play nice, you have to make them do so. Use GPL or AGPL.
Let's hope Rebble doesn't get steamrollered. They did good work when the original company failed its users.
Perhaps an anti-cheat system based on a trusted execution environment (TEE) could work.
I think Valve said something about working with anti-cheat developers to find a solution for the Steam Deck, but nothing happened. Perhaps they will do something this time.
With a TEE, you could scan the system or even completely isolate your game, preventing even the OS from manipulating it. As a last resort, you could simply blacklist the machine if cheats are detected.
There would probably still be some cheaters, but the numbers would be so low as to not be a problem.
Maybe the user friction would be too much, but I'd be happy for the system to just straight-up reboot for games that require anti-cheat. While such a game is running, the system is in a verified state, but once you close it, all of your mods and custom drivers can be loaded just fine.
Gaussian splats can have colour components that depend on the viewing direction. As far as I know, they are implemented as spherical harmonics. The angular resolution is determined by the number of spherical harmonic components. If this is too low, all reflection changes will be slow and smooth, and any reflection will be blurred.
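For a feel of what that means in code, here's a minimal sketch (my own, following the sign and 0.5-offset convention used in the reference 3D Gaussian Splatting code, degrees 0 and 1 only) of how a splat's colour is evaluated from its SH coefficients for a given view direction:

```python
import numpy as np

C0 = 0.28209479177387814   # real SH constant for Y_0^0
C1 = 0.4886025119029199    # shared factor for the three degree-1 basis functions

def splat_color(sh_coeffs, view_dir):
    """sh_coeffs: (4, 3) array, one row per SH basis function, columns are RGB.
    view_dir: unit vector from the splat towards the camera."""
    x, y, z = view_dir
    basis = np.array([C0, -C1 * y, C1 * z, -C1 * x])   # degree-0 and degree-1 basis
    return np.clip(basis @ sh_coeffs + 0.5, 0.0, 1.0)  # offset and clamp for display
```

With only these four coefficients per channel, the colour can only vary smoothly across the sphere of directions, which is exactly why sharp reflections come out blurred; higher degrees (16 coefficients at degree 3) add angular detail.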
Blender is great, but it still can't replace a CAD program.
I tried using it for simple CAD tasks (before geometry nodes were released), but the experience was so poor that I quickly switched to FreeCAD. It was worth it, even though it took some time to learn how to use the new program.
FreeCAD is pretty buggy, confusing, and sometimes limited, but its workflow can't really be replicated in Blender. Once you have worked with a CAD program for a while, you realise that certain things that are almost impossible or annoyingly difficult in Blender can actually be pretty easy there.
It would be great if the two programs could be merged. Blender could benefit from better CAD functions, and FreeCAD could benefit from everything else Blender provides.
The CAD Sketcher add-on goes a long way towards making it more usable. I’ve been hacking away at it myself (should go back to finish what I was adding…).
Nice. That wasn't on my radar. I was already happy when Blender made significant progress in parametric modelling with geometry nodes. Together with CAD Sketcher, it looks pretty usable for modelling. I hope the whole thing improves quickly, but I suspect they have a long way to go before Blender can be considered a proper CAD program.
Spiritually, Blender is to FreeCAD what Gimp is to Inkscape or what BMP is to SVG. With Blender you're massaging piles of anonymous polygons so they look right aesthetically, while with CAD you're composing geometric primitives to make a precise blueprint for a 3D object that just happens to be rendered with polygons. The former is better for art while the latter is better for manufacturing.
A .step or .stp file encodes the model as mathematical shapes, rather than approximating it with polygons, but it doesn't save the entire parametric workflow or history, only the final result. As far as I know, there is no widely adopted file format that also saves this information.
Parent's comparison is pretty great, but it shouldn't be "overdone". It's not really the format that's different/a problem (it's not hard to make a blender object from a CAD design - the same way an SVG can be rendered to PNG, and similarly irreversible in both cases), it's the whole design flow.
CAD uses geometry primitives with parameters and exact sizing (e.g. you draw a rectangle of this size, cut a hole into it at such-and-such an offset from one of the corners, and then extrude the shape into 3D). As mentioned, this can be approximated via geometry nodes, but the two are very different in "ideology".
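For comparison, here is roughly what that rectangle-plus-hole example looks like in a script-based parametric CAD tool (a sketch using the CadQuery Python library, with made-up dimensions, just to illustrate the "exact sizes and offsets" ideology):

```python
import cadquery as cq

plate_w, plate_d, plate_t = 40.0, 20.0, 5.0   # exact plate dimensions in mm
hole_dia = 3.0
hole_dx, hole_dy = 5.0, 5.0                   # hole offset from one corner

plate = (
    cq.Workplane("XY")
    .box(plate_w, plate_d, plate_t)           # the rectangle, already extruded to 3D
    .faces(">Z").workplane()
    .center(hole_dx - plate_w / 2,            # measure the offset from the corner
            hole_dy - plate_d / 2)
    .hole(hole_dia)                           # cut the hole at that exact position
)
# Change plate_w or the offsets and the exact geometry regenerates; no mesh is involved yet.
```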
CAD modellers are good at producing parametric 3D models. You can make use of spreadsheets and constraints to create a piece that can later be changed very easily.
> CAD modellers are good at producing parametric 3D models
If that's the only thing they do better than Blender, then it sounds like their days are numbered. There have to be more benefits, right? Blender exposes a pretty wide Python API, loading spreadsheets ends up pretty simple, and together with Geometry Nodes you can even visualize the data in a way that makes some sense. Constraints have existed in Blender for a long time too.
They are better at it on a fundamental level. It’s a completely different approach for data representation, offering precision and repeatability which is not possible with Blender's data model.
Blender may well replace CAD apps in the hobbyist 3D printing space, but it will never replace them in industry and professional work. Solid-modeling CAD software commonly offers more than just the creation of mathematically precise digital 3D objects: planning for CNC machining, FEM analysis, assembly and so on.
> It’s a completely different approach for data representation, offering precision and repeatability which is not possible with Blender's data model.
How exactly? And why not?
You need useful measurements/units, reproducibility, parameters, constraints, and I guess something more? As Blender can give you those things, it's not impossible in Blender. Want to have 3D objects automatically created based on values from CSVs together with constraints? Blender can already do that today, just as one example.
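To make that concrete, here is a minimal bpy sketch (the file name and CSV columns are made up) that builds exactly-sized boxes from a spreadsheet:

```python
# Run from Blender's script editor or Python console.
import bpy
import csv

with open("/tmp/parts.csv") as f:                 # hypothetical columns: name,width,depth,height
    for i, row in enumerate(csv.DictReader(f)):
        bpy.ops.mesh.primitive_cube_add(size=1, location=(i * 5.0, 0, 0))  # space the parts out
        obj = bpy.context.active_object
        obj.name = row["name"]
        obj.scale = (float(row["width"]),          # scale the unit cube to exact dimensions
                     float(row["depth"]),
                     float(row["height"]))
```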
I don't really mind if Blender has a chance of replacing CAD apps or not, more curious about why exactly people find it so fundamentally impossible for Blender to be a useful alternative, and I have yet to hear any convincing arguments.
An analogy is the difference between vector and bitmap graphics.
CAD programs aren't just a different set of operations on the same data, they use an entirely different representation (b-rep [1] vs Blender's points, vertices, and polygons).
These representations are much more powerful but also much more complex to work with. You typically need a geometric kernel [2] to perform useful operations and even get renderable solids out of them.
So sure, I suppose you could build all of that into Blender. But it's the equivalent of building an entire new complex program into an existing one. It also raises major interoperation issues. These two representations do not easily convert back and forth.
So at that point, you basically have two very different programs in a trenchcoat. So far the ecosystem has evolved towards instead building two different tools that are masters of their respective domains. Perhaps because of the very different complexities inherent in each, perhaps because it makes the handover / conversion from one domain to the other explicit.
> CAD programs aren't just a different set of operations on the same data, they use an entirely different representation (b-rep [1] vs Blender's points, vertices, and polygons).
So with that in mind, there should be something that is possible to build in CAD, but impossible then to build in Blender?
I know the differences between the two, I understand they're fundamentally different, yet I seem to be able to produce similar results to others using CAD, so I'm curious what results I wouldn't be able to reproduce in Blender.
Sure. Create a diamond polygon and revolve it around a point.
Blender has methods and tools to _approximate_ doing this. It has a revolve tool... where the key parameter is the number of steps.
This is not a revolution, it's an approximation of a revolution with a bunch of planar parts.
BREP as I understand it allows you to describe the surfaces of this operation precisely and operate further on them (e.g. add a fillet to the top edge).
Ditto for things like circular holes in objects. With blender, you're fundamentally operating on a bunch of triangles. Fundamental and important solid operations must be approximated within that model.
BREP has a much richer set of primitives. This dramatically increases complexity but allows it to precisely model a much larger universe of solids.
(You can kinda rebuild the functionality that geometric kernels provide with geometry nodes now in Blender. This is a lot of work and not a great user interface compared to CAD programs.)
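To make the "number of steps" point concrete, this is roughly what a scripted revolution looks like in Blender via the Screw modifier (a sketch; the profile object and the numbers are mine):

```python
import bpy
import math

profile = bpy.context.active_object            # assume a diamond-shaped profile mesh is active
rev = profile.modifiers.new(name="revolve", type='SCREW')
rev.axis = 'Z'
rev.angle = 2 * math.pi                        # a full revolution
rev.steps = 32                                 # the approximation: 32 planar segments
# At radius r the mesh deviates from the true surface of revolution by roughly
# r * (1 - cos(pi / steps)); a BREP kernel would store the exact surface instead.
```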
I don’t have explanatory knowledge on the matter, sorry.
If you are interested you may look up the difference between solid, surface and mesh modeling. They all have strengths and weaknesses.
Ultimately you have to translate any model into a lossy representation/approximation due to discrete numerical control requirements and so on. However, the gist of it is that with mesh modeling this happens earlier in the design process. Even with procedural and parametric modeling in Blender, you will always encounter issues with approximation and floating point precision, which are inherent to the data representation.
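To put a rough number on the approximation side of that (my own back-of-the-envelope example): the maximum deviation of an N-segment mesh circle from the true circle is r * (1 - cos(pi / N)).

```python
import math

r, segments = 10.0, 64                              # a 10 mm radius circle meshed with 64 segments
deviation = r * (1 - math.cos(math.pi / segments))  # sagitta: max gap between chord and arc
print(f"{deviation:.4f} mm")                        # about 0.012 mm
```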
For 3D printing that often doesn't matter, because the mesh approximation is precise enough. For hobbyists, CAD apps are kinda too niche and bothersome to be worth learning for simple 3D-printing models. The overall versatility of Blender and its basic CAD-like capabilities are much more valuable and rewarding in this space. In the end, you probably benefit massively from learning something like Blender anyway, because it's much better suited than CAD for quickly conceptualizing an idea in 3D. I think CAD works best if the shape and specs of the object are already known. Organic shapes and clay-like deformations, which can't easily be reduced to mathematically defined solid-body functions, are something Blender will always be better suited for than CAD.
>Even with procedural and parametric modeling in Blender, you will always encounter issues with approximation and floating point precision, which are inherent to the data representation.
A common problem people run into with CAD models is importing a STEP file and modeling directly off geometry in it. They later find out that some face they used as a reference was read by the CAD package as sitting at 89.99999994 degrees to another, and discover that it has subtly thrown off the geometry of everything else in their model once things stop lining up the way they should.
And that's with a file that has solid body representation! It's an entire new level of nightmare when you throw meshes into the mix.
The heart of any real CAD package is a geometry kernel[1]. There are really only a handful of them out there; Parasolid is used by a ton of 'big name' packages, for example. This is what takes a series of descriptions of geometry and turns it into clear, repeatable geometry. The power of this isn't just where geometry and dimensions are known. It's when the geometry and dimensions are critical to the function of whatever's being modeled. It's the very core of what these things do. Mesh modeling is fantastic for a lot of things, but it's a very different approach to creating geometry and just isn't a great fit for things like mechanical engineering.
> The power of this isn't just where geometry and dimensions are known. It's when the geometry and dimensions are critical to the function of whatever's being modeled.
Yes, but I meant making a case for workflow differences.
CAD is bad at aiding visual thinking and exploration, since you kinda have to be precise and constrain everything. You can pump out a rough idea of an object and edit it much, much faster in Blender.
Sketching on paper, or visualizing in one's mind, is pretty hard for most people when it comes to 3D. CAD is not at all inviting for creative impulses and flow. People who can do this in CAD are probably trained engineers who learned a very disciplined, analytical way to approach problems, people who think in technical drawings.
So, CAD is good at getting a precise and workable digital representation of a "pre-designed" object for further (digital) processing, analysis, assembly and production. I think Blender is better at the early design process, figuring out shapes and relations.
In a vacuum for a standalone object, a 3D mesh app like Blender can be useful for brainstorming.
Most of my CAD usage is designing parts that have to fit together with other things. The fixed elements drive the rest of the design. A lot of the work is figuring out "how do I make these two things fit together and be able to move in the ways they need to."
There is still a lot of room for creativity. My workflow is basically "get the basic functionality down as big square blocks, then keep cutting away and refining until you have something that looks like a real product." My designs very rarely end up looking like what they started out as. But the process of getting them down in CAD is exactly what lets me figure out what's actually going to work.
It's a very different workflow, and it's definitely not freeform in the same way as a traditional mesh modeling app, but CAD is for when you have to have those constraints. You can always (and it's not an uncommon pattern) go back and use a mesh modeler to build the industrial design side of things on top once the mechanical modeling is done.
ETA:
I'd also add: I'm not sure "thinking in CAD" comes naturally to anyone; it's a skillset that has to be built.
If you try OpenSCAD-style adding and subtracting of volumes in Blender, the syntax is pretty horrific. It is impossible to script objects that way. To quote Gemini:
However, implementing a full OpenSCAD-like syntax and robust CSG system from scratch in Blender Python is complex due to Blender's mesh-based nature versus OpenSCAD's mathematical description. Blender's boolean operations on complex meshes can sometimes lead to topological errors.
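For comparison, here's roughly what a single OpenSCAD `difference()` turns into when scripted through bpy (a sketch with made-up sizes), which gives an idea of the verbosity:

```python
import bpy

# The two volumes: a cube and the cylinder to subtract from it.
bpy.ops.mesh.primitive_cube_add(size=20)
cube = bpy.context.active_object
bpy.ops.mesh.primitive_cylinder_add(radius=3, depth=40)
cutter = bpy.context.active_object

# The boolean difference is a modifier that then has to be applied to the mesh.
mod = cube.modifiers.new(name="cut", type='BOOLEAN')
mod.operation = 'DIFFERENCE'
mod.object = cutter
bpy.context.view_layer.objects.active = cube
bpy.ops.object.modifier_apply(modifier=mod.name)
bpy.data.objects.remove(cutter)

# The result is still just a triangle mesh, so chaining many of these is where
# the topological errors mentioned above tend to creep in.
```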
To be fair though, OpenSCAD works best too if you do this during the generative step and not after the fact. I've used it to remix existing STLs so it definitely does work but you really have to watch the areas where two shapes get close to each other, especially if there is a lot of fine detail.
Nice site! I have a suggestion for a prompt that I could never get to work properly. It's been a while since I tried it, and the models have probably improved enough that it should be possible now.
A knight with a sword in hand stands with his back to us, facing down an army. He holds his shield above his head to protect himself from the rain of arrows shot by archers visible in the rear.
I was surprised at how badly the models performed. It's a fairly iconic scene, and there's more than enough training data.