Oddly, I’m conflicted on Flutter so far, but I have loved working in Dart.
So much so that I ended up writing a queueing app for scheduling batches of sequential tasks on the server in Dart just to see how it could work as a NodeJS replacement, and thought the whole dev experience was great.
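As a rough illustration of the idea (names and structure are hypothetical, and it's sketched in TypeScript for the sake of the Node comparison rather than Dart), a sequential batch queue like that can be quite small:

```typescript
// A minimal sequential batch queue: tasks within a batch run one after
// another, and batches are processed in FIFO order. All names illustrative.
type Task = () => Promise<void>;

class BatchQueue {
  private queue: Task[][] = [];
  private running = false;

  enqueueBatch(tasks: Task[]): void {
    this.queue.push(tasks);
    void this.drain();
  }

  private async drain(): Promise<void> {
    if (this.running) return; // only one drain loop at a time
    this.running = true;
    while (this.queue.length > 0) {
      const batch = this.queue.shift()!;
      for (const task of batch) {
        await task(); // strictly sequential within a batch
      }
    }
    this.running = false;
  }
}
```

The Dart version reads almost identically thanks to async/await, which was part of what made the experiment pleasant.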
I certainly did. In fact, the opening paragraphs of this piece immediately brought me back to my own state of mind when my Mom was diagnosed with cancer.
There was at least a year long period in which my thoughts darted and weaved wildly, with every mix of emotion, all at once.
“I need to finish this bug fix. But first I should get some coffee. That coffee in the hospital was so warm and comforting, in that styrofoam cup. Just what I needed in the waiting room… which is when the doctor told me her prognosis.
“Six months, he said. F*k. How can I do this? I need lots of coffee. But coffee is reminding me of bad things. How will I ever drink coffee again? Would be a shame to never drink coffee, though… it’s a big industry after all. Wonder what it looks like to pick coffee beans? Bet it would be nice to just be picking coffee beans without any other care. But I have my own job to do… that bug fix. I’ll do that instead.”
Random thoughts of work, grief, jokes, and childlike daydreaming, all running together. All day. Every day.
The author captured this feeling insanely well, whether that was intended or not.
I can also relate in the sense that that period of my life was perhaps one of the more intense periods of self-improvement and introspection I’ve had.
Something about having so many thoughts, and needing to channel them to something positive to overcome the blatant and glaring negative, led to growth as a software developer, in some cruel way.
That aside, the rest of the piece is timely and relevant for me now.
I feel like there’s so much I can relate to regarding “resistance” and self-doubt, and in casting aside bad criticism from incapable critics, as the author described from her MFA experience.
My heart is with the author through all of this. I hope to follow more of her work.
Can you share how you got through this period and found alignment? I’m going through something similar to what you’ve described. Not the hospital situation—I’m sorry to hear about your mom—but more so the thoughts darting rapidly on their own. I can’t seem to get ahold of them either, and I notice it getting worse. Lots of intrusive thoughts, lots of “open cycles” that cause me mental strain, lots of down cycles too. If you could share, I’m curious how you channeled it into something positive and grew as a result.
In my case, it was almost out of existential need. I could see myself falling apart to the point of not being functional or even doing something to myself, and I knew that my parents were depending on me.
So, out of that need, I intentionally started taking on large, creative projects at work that I knew would hold my interest and consume my thoughts. In some cases, this meant undertaking projects of my own volition and "asking for forgiveness rather than permission" at work.
In part because of a couple of articles I read on the scientifically demonstrated improvement in outcomes for cancer patients with positive attitudes, and because I knew my mom already had several negative voices around her daily, I decided my role with her would be to be relentlessly positive.
An attitude of "we don't know the future, all things are possible, and anything can be overcome with the right set of inputs -- we just need to find what those are". I quickly adopted this attitude for myself, and it allowed me to embrace failure more - because the attitude wasn't predicated on being the best, but rather of overcoming.
Granted, this was all about 6 years ago. Since then, much has changed, and I find myself facing similar issues again. Without something "existential" pushing me, I am finding it harder to overcome this time.
As with most things, though, feedback cycles are a thing. Negativity feeds on itself, and success begets success, so the first step is finding whatever you can to help break the feedback loop. Catch any negative thoughts as quickly as you can, and redirect them from fatalistic into something malleable.
Catch any random, distracting "I need to Google this" type thoughts as they happen, and write them down in a notebook as something to Google later, but not right now.
One important thing at the start is that you don't necessarily have to believe every positive mantra or habit you say; you just have to do it. Over time, the believability will come on its own.
If you can get momentum going towards the positive instead of the negative, break the feedback loop, and get onto the "success begets success" side of it, it gets much easier.
Hope that helps and makes sense. Wish I had an actual, easy answer, but a lot of it is just trying things until you see what works, and being consistent above all else.
Good luck, and if you come up with any of your own tips, please let me know, because as I said - for as much as I've been through this before successfully, I can see it happening again, and I'm realizing it's time to deal with it again myself.
Thanks for the response. Really appreciate it. This is really helpful.
The existential need you mentioned is really powerful. Now that you mention it, the last time I felt really mentally aligned, well, and focused was when I was out of work. I also had a situation where people were depending on me, and it…it wasn’t perfect but it really filtered out a lot of these other thoughts and impulses. Maybe there’s something there about a goal that exists beyond ourselves. Good callout, I’d totally forgotten about that.
I hear you on the consistency. I’m trying that myself too. Just committing to a few actions even if my brain is completely working against me. Again, mixed results, but I’m finding that something is better than nothing, and that, like you said, success begets success.
Slowly but surely, I’m beginning to learn that whenever a US government agency has a problem with something, it isn’t that they’re genuinely concerned about the thing they claim to be; it’s that they aren’t the ones benefiting from it.
I’ve often wondered about the feasibility of some kind of specially built LED monitor with a custom chip to handle real-time (within a frame or two) scan-line generation and CRT mask / phosphor glow emulation built in, with slightly curved glass in front to complete the illusion.
That way, to my uneducated mind, the monitor itself would be handling the CRT shader effects and could accept any input, including real 80s/90s/00s devices.
But it’s quite likely that I’m underestimating what it would take to have a dedicated input processor in a display that works as well as the GPU shaders do, and underestimating the minimum amount of input lag required.
You could certainly integrate a shader into a monitor using a sufficiently powerful GPU or equivalent. But looking at https://i0.wp.com/wackoid.com/wp-content/uploads/2021/08/CRT... for example, you'd need at least 10 times the original resolution, which for NTSC would translate to roughly the same vertical resolution as an 8K display, if not more.
The curved glass wouldn't be necessary, nor very desirable, IMO. Trinitron TVs were approximately flat and great for CRT gaming.
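For what it's worth, the back-of-the-envelope arithmetic behind that comparison (the 10x factor is the mask-detail estimate above, and the line counts are approximate):

```typescript
// Rough check: ~480 visible NTSC scanlines, upscaled 10x to resolve
// per-scanline mask/phosphor detail, vs. the vertical pixel count of 8K UHD.
const ntscVisibleLines = 480;  // approximate visible lines of NTSC video
const maskUpscale = 10;        // detail factor suggested above
const requiredVertical = ntscVisibleLines * maskUpscale; // 4800
const eightKVertical = 4320;   // 8K UHD is 7680 x 4320

// 4800 > 4320, hence "the same vertical resolution as 8K, if not more"
console.log(requiredVertical, eightKVertical);
```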
With no other context, I would not suggest stock Ubuntu.
It’s had incremental improvements, but as a desktop OS, it’s a bit rough in my recent (within the last 1.5 years) experience.
If there are specific issues you have with Fedora, and/or you simply want something Debian-based, I would suggest a stable Ubuntu derivative like Pop!_OS or Zorin OS.
Seeking recommendations is great, but as a desktop Linux user myself, I don’t think there’s any substitute for having a spare “play” computer (even a junker or a cheap mini PC) on which you can just try a few distros without affecting your main PC until you find your match.
I tend to argue that “slippery slope” is not actually a logical fallacy the way others (e.g. straw man) are.
A slippery slope is often a legitimate concern.
Using it as the sole means to shut down an idea is often disingenuous, but so, too, is shutting down any concerns of a “snowball effect” by calling it a logical fallacy.
Like any logical fallacy, the slippery slope is about the general validity of a logical inference - and not about the truth of its conclusion.
The slippery slope fallacy argues that "if X happens, then eventually Y will happen as well", where Y is a more extreme version of X. This is not a valid logical inference.
That doesn't mean that there are never cases where X actually leads to Y. Just as calling out "appeal to authority" doesn't mean that an expert isn't often right, or just as "correlation is not causation" doesn't imply that correlation is never causation.
I bought one with an AMD 4800U in early 2022 to be used as a cheap, dedicated Linux device for work (mostly Remote Desktop plus some local docker development environments) and it’s been rock solid for me.
Your mileage may vary, but I’ve been happy enough with mine for the price that I had already decided I would be choosing Beelink for my next mini PC over a NUC.
That said, I have seen some mixed reports of some issues around thermal throttling and eGPU support for some of the newer gaming focused ones, but I think if you have realistic expectations given the form factor (and especially if you’re not going for gaming) then they are fairly reliable and sturdy little devices.
* Also note: I bought mine barebones and added my own RAM and SSD. I can’t speak to the quality of what they ship with, but I also haven’t heard any complaints from others in that regard.
One of the many paradoxes I’ve found (but never figured out a reason for) is why I’m able to quickly memorize and find my way to common functions via text-based toolbar menus, but to this day, I STILL have to click through each ribbon menu multiple times, studying each icon and struggling to read each label, before finding what I want.
Logically one would think icons and visually distinctly colored ribbon tabs would be better, but (at least for me) they are decidedly worse.
I think this has to do with the varying designs of each ribbon. Menus and their submenus are more like an index, closely following a particular pattern. Ribbons are more like grocery store layouts with variations that shift around and sometimes seem to not follow any rhyme or reason. It's not too surprising that the former of the two is more easily memorized.
Motion patterns in toolbars are strict: you have to start at the top and drill down. This seems like a hindrance, but it means that the motions develop into stronger muscle memory. If you know the names of what you are looking for, you can usually develop the entire shortcut pattern through everyday use, without setting aside practice time (as in a setup like vim or emacs, where you aren't given sufficient prompting to discover and train new interactions automatically).
Ribbons surface more elements to browse in a freeform context, which is correct if you need to discover features...but also conflicts with the goal of a toolbar to be a thin layer over the shortcuts.
One possibility is mnemonics. You were memorizing the important letters in a text-based menu, possibly even the keyboard shortcut mnemonics themselves. One of the biggest losses in the Windows user experience is said to be that keyboard mnemonics used to be highlighted at all times with an underline in text menus, and then Windows UX switched to only highlighting them when Alt was pressed.
It's something I think about a lot with the Ribbon because it has some really good keyboard mnemonics in Office applications, but mostly only Power Users think to press the Alt button to let them "bubble in" on the Ribbon. The keyboard mnemonic bubbles make great landmarks, and I think that remains one of the reasons I rather like the Ribbon (as a power user) that a lot of people never discover. (In part because I was there a million years ago when Word first lost the underlines and was used to even then pressing Alt on its own just to see them so that behavior carried over to the Ribbon just fine for me, luckily enough.)
One reason could be that the ribbon resizes/hides/collapses buttons depending on the size of the window. I resize my Word/Excel window as I work, and it's an ordeal to find the option I want.
This and the hidden ribbons completely ruin the thing for me. But I do tend to like megamenus on other applications, so the problem is probably office, not the ribbons.
Toolbars have text labels. Ribbons have a bunch of small shitty icons on a flat UI background. I can memorize toolbars because it's a set of motions, words, and visual cues. With a Ribbon UI it's a mad search for what I want, hovering over dumb icons to see a label, and repeating that process until I find something. The damn search and rescue process totally blows away my working memory and I won't remember where the button is next time I need it.
The senior developer above me is like this for me, and it’s rough because it shakes my self esteem AND often leads to me having to support and extend a fragile and changing 3rd party library to solve a rather specific problem that would be better solved with custom code.
Where it really stings is when I do something of my own initiative (like, for example, create a transparent API over memoizing and caching some expensive calculation, or refactor some of our common client customizations into their own set of classes so it’s easier to extend) and he ignores it or scoffs at it.
Only to then find a third party library that implements something similar. THEN it’s presented to me as a “brilliant idea” that we can take advantage of.
At that point, when I say “we” already do this, he usually rephrases what his brilliant 3rd party library does, as though he can’t fathom that I would be capable of doing something like that myself, and clearly I’m just not understanding what he’s telling me.
I think it’s a defensive mechanism for his ego against one of his peers or employees being more capable than he is.
But it’s also not like he can’t learn this stuff himself, he just doesn’t ever put in the time or effort.
With many exceptions (the left-pad debacle comes to mind), it’s generally much better to use a third party library instead of supporting your own implementation.
I know this is the mantra, but my experience has been highly mixed, and it generalizes mostly by how low in the stack the dependency sits.
For, say, security implementations for authorization and access controls, or even low level HTTP request routing? Absolutely. The goal there is to adhere to something standard and battle tested by experts, and the third party libraries tend to be fewer in number, and of higher quality, with longer term support and clearly defined upgrade paths.
But that’s the lower level stuff, where your special custom needs are superseded by the primary goal of just “doing the one right thing”, or “adhere to the commonly agreed upon standard”.
For all the other things that make an app unique - things like CSS frameworks, UI components (beyond basic, accessibility-minded building blocks), chart drawing, report generation and caching - my experience has taught me otherwise, the hard way.
Being stuck using a 3rd party library that doesn’t do what the client or business needs it to do, having to juggle our own internal patches and bug fixes with updates to the library itself, all only to have the library abandoned or deprecated in favor of the author’s next pet project, really sucks and often comes with a high opportunity cost and a high development cost.
I now consider third party implementations of higher level features (and especially anything front-end) to be something that needs to be evaluated as equally costly as an internal implementation by default, and not favored just because somebody else wrote it.
Maybe I’ve just been unlucky in my experience, though. I also suspect ecosystem makes a difference. The PHP and JS ecosystems are full of poor libraries with snake oil sales pitches. I suspect this is different with, say, Rust.
I think we mostly agree with each other. As I said, there’s many exceptions.
I’ve mostly worked with Python and JVM languages, which probably explains why I’m less passionate about the counter-argument than you are. Ecosystem definitely matters a lot. VanillaJS is the only good JS framework IMO.
I’ve noticed this as well - they don’t view software building as a form of engineering that can be learned from basic principles of computer science, but instead as magic.
So the go-to for every solution is to find a third party library from a “real” witch or wizard and follow its basic tutorial. Maybe try to customize it a bit at most.
If something breaks, just start randomly moving things around or copying and pasting more code from forums until it works.
I can’t live like that. I need to know why something’s not working, AND why it IS working. I like stepping through my code with a debugger just to make sure things look right, even when they’re working.
I think the craziest part, though, is just how much people with this “software is magic” mindset can actually get just by brute force cobbling things together.