Remember, before recorded music existed, all the music anyone heard was performed live. The technology has evolved over time, and once people were buying and listening to recorded music, there was a tendency toward less creative participation. Some things have emerged in response to that. Karaoke, for instance, lets people participate live with the music. So does what DJs do: scratching literally turns a record, the embodiment of mass-media music, into a musical instrument wielded by a person, often in a small-group setting. Where traditional instruments are the raw material for one kind of self-expression, DJs remix, add breaks, scratch over, rap over, and otherwise express themselves with records and turntables as theirs.
This may be a really dumb question, but is that much of the behavior of an x86_64 CPU variable and undefined? Until recently I thought the chipmakers provided full documentation (recently I found an article, IIRC, about people investigating the undocumented innards of the 286). This seems like a pretty shaky foundation for software.
Documentation is definitely not one of x86's strengths; other architectures do much better. ARM, for example, provides formal models of its CPUs, and RISC-V is simple enough that you could implement its full semantics in a few thousand lines of code.
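To give a feel for why that's plausible, here's a minimal sketch (my own toy code, not from any real project) of what a RISC-V interpreter looks like in C, handling just two RV32I instructions, ADDI and ADD. Every base instruction is about this mechanical, which is how a full implementation stays in the low thousands of lines:

```c
#include <stdint.h>
#include <stdio.h>

static uint32_t regs[32]; /* x0..x31; x0 must always read as zero */

static void execute(uint32_t insn) {
    uint32_t opcode = insn & 0x7f;
    uint32_t funct3 = (insn >> 12) & 0x7;
    uint32_t rd  = (insn >> 7)  & 0x1f;
    uint32_t rs1 = (insn >> 15) & 0x1f;
    uint32_t rs2 = (insn >> 20) & 0x1f;
    int32_t  imm = (int32_t)insn >> 20; /* sign-extended I-type immediate */

    if (opcode == 0x13 && funct3 == 0)                     /* ADDI */
        regs[rd] = regs[rs1] + (uint32_t)imm;
    else if (opcode == 0x33 && funct3 == 0
             && (insn >> 25) == 0)                         /* ADD (funct7 == 0) */
        regs[rd] = regs[rs1] + regs[rs2];

    regs[0] = 0; /* keep x0 hardwired to zero */
}

int main(void) {
    execute(0x00500093); /* addi x1, x0, 5  */
    execute(0x00700113); /* addi x2, x0, 7  */
    execute(0x002081b3); /* add  x3, x1, x2 */
    printf("x3 = %u\n", regs[3]); /* prints 12 */
    return 0;
}
```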
There are quite a few instructions with undefined behavior, but that's not much of an issue if you can choose to avoid them -- for example, in a compiler.
Almost all UB is found in flags or when using invalid instruction prefixes.
And although there is some unexpected UB, like `imul`'s zero flag being UB instead of being set according to the result of the multiplication [1], reading the manual and sticking to the parts that are clearly not UB gets you most of the way.
However, it becomes an issue if you need to analyze a binary that uses UB.
Then you can't choose which instructions to use, so you need to have a complete model of all UB.
That's much more difficult, and for example most decompilers currently fail at this.
We have an example of this in Figure 1 of our paper.
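(To make the `imul` case concrete outside the paper: here's a minimal sketch of my own, using GCC/Clang-style inline asm on x86-64, that reads ZF immediately after an `imul` whose result is zero. Per the Intel SDM, CF and OF are defined after `imul`, but SF, ZF, AF, and PF are undefined, so whatever SETZ captures below is just whatever the CPU happened to leave behind. A decompiler that models this SETZ as `result == 0` is guessing.)

```c
#include <stdio.h>

int main(void) {
    long b = 0;
    long result = 42;      /* result will become 42 * 0 == 0 */
    unsigned char zf;

    __asm__ volatile(
        "imulq %2, %1\n\t" /* result *= b; SF/ZF/AF/PF now undefined */
        "setz  %0"         /* capture whatever ZF happens to hold */
        : "=q"(zf), "+r"(result)
        : "r"(b)
        : "cc");

    /* result is 0, but the manual does not promise zf == 1 here */
    printf("result = %ld, ZF = %d\n", result, zf);
    return 0;
}
```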
I've used DB2, Sybase, Postgres, MySQL, Oracle, and a little SQL Server. Long ago I worked in a Sybase shop and management said we had to move to Oracle. Sybase had a nice programming language that made things easier and clearer than Oracle; all I remember is that it involved creating temp tables and then querying against them. I think Sybase is no longer very popular, but SQL Server was based on it and may have similar features.
Postgres is pretty ubiquitous, and it is pretty good IMHO. The DBAs at a recent job all said SQL Server is better, but there's the licensing cost and Windows Server setup you have to deal with to use it.
I worked for a while as a contractor for the US Dept. of Education Student Loan system. It was z/OS with DB2, and most new business logic was done in this weird language "Gen" https://www.broadcom.com/products/mainframe/application-deve... . Gen can supposedly "generate" Java and other stuff, but they used it to generate COBOL for the mainframe. You could debug the Gen code on the 3270 emulator, rather than trying to deal with the generated COBOL. There was a small number of people (like 6) handling that code. The data, and I guess some of the code, went back to at least 1980 or so. There was so much legacy code, I doubt they've changed platforms. I was supposed to be more of a Java guy, but I did a little Gen. Mainframe is very alien to me. The people that knew it well could really crank on it, though.

I joined when they were converting an MS ASP front end to a Java one. So we wrote Java that users interacted with via the web, and it made calls to Gen (really, to COBOL). In retrospect there was a lot wrong with that operation... One interesting problem that came up once was that the mainframe didn't sort things the same as Java. It turned out to be caused by EBCDIC vs. Unicode sort order.
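For anyone curious what that mismatch looks like, here's a small sketch of my own in C (the real system was Java calling COBOL, but the byte-ordering point is the same). In ASCII/Unicode, digits sort before uppercase letters, which sort before lowercase; in EBCDIC it's the reverse on both counts. The mini translation table covers only the characters used here (values from EBCDIC code page 037/1047):

```c
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

/* Tiny EBCDIC lookup, just enough for this demo (CP037/1047 values). */
static unsigned char ebcdic(char c) {
    switch (c) {
    case 'a': return 0x81;
    case 'A': return 0xC1;
    case '1': return 0xF1;
    default:  return 0x00;
    }
}

static int cmp_ascii(const void *x, const void *y) {
    return strcmp(*(const char *const *)x, *(const char *const *)y);
}

static int cmp_ebcdic(const void *x, const void *y) {
    const char *s = *(const char *const *)x, *t = *(const char *const *)y;
    while (*s && ebcdic(*s) == ebcdic(*t)) { s++; t++; }
    return (int)ebcdic(*s) - (int)ebcdic(*t);
}

int main(void) {
    const char *items[] = { "a", "A", "1" };
    size_t n = sizeof items / sizeof *items;

    qsort(items, n, sizeof *items, cmp_ascii);
    printf("ASCII order : %s %s %s\n", items[0], items[1], items[2]); /* 1 A a */

    qsort(items, n, sizeof *items, cmp_ebcdic);
    printf("EBCDIC order: %s %s %s\n", items[0], items[1], items[2]); /* a A 1 */
    return 0;
}
```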
Thanks for sharing that; it's super entertaining to consider what crazy things people might be doing in the future. The EBCDIC debugging surprise got me laughing.
Thanks for all the replies! I'm going to give Arq a try. EASEUS was unable to open a BitLocker-encrypted external drive from its USB restore media; I guess I shouldn't have expected it to. I haven't lost any data, I was just testing my restore capability. I'll see if Arq works for me. (And I am reformatting the external drive; hopefully I can keep the backup itself encrypted on the now-unencrypted drive.)
I've done it twice. I dreamed up the line, "I wanted to be able to concentrate on my job search rather than doing it on the down low while working". Maybe that'll be useful. I got jobs both times after quitting. But yes, you need finances to fall back on. It'd be best to keep your health insurance, if you're in the USA.
I have a 2014 MBP Retina running Arch, and it's pretty good now, though it took some tweaking. It does overheat when I try to run full-screen 4K video over HDMI; it's just not up to that. It once got so hot it shut itself off.
Am I a Luddite? I just feel like I don't trust AI because of the hallucinations. It also sort of offends my sensibilities to ask an AI to rewrite something I wrote, and even more so the idea that I'm on the receiving end of things that have gone through that process. I was amazed when I first started checking out AI last year, and it is a great technical achievement. But I don't like that it gets things wrong, so I don't trust it. I've tried asking ChatGPT (admittedly I haven't paid for GPT-4) to do things like summarize a book I know well, and it is often wrong.
Copilot in VSCode is really something though. Even if it makes mistakes.
I've been thinking about this myself. It'd be great if some of the essential crates could somehow be sponsored or certified. Go does provide a lot more capability in its standard library.